Maya to the Looking Glass


Maya SDK

This thread will document my dev notes and discussion about our Maya to Looking Glass pipeline.

Have a look at this flashy T-Rex rendered this morning using the Maya native renderer and running in the Looking Glass:


Some important features and considerations have been built into this toolkit, including setting up the cameras in a way that ensures the main focal plane of the scene is perfectly controllable and predictable. We’re also including features that assist with depth-of-field effects, giving the artist another level of control over elements in the background and foreground of the scene.

Post-processing effects can also be added to renders made with Maya and other content-creation software, thanks to external pipeline tools that are also in the works. As for workflow, each view of the superstereoscopic scene needed for the Looking Glass is treated as a separate sequence, which can be post-processed and then reassembled as desired to be displayed, for instance, as a superstereoscopic, group-viewable 3D movie in the Looking Glass!
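As a rough sketch of that per-view sequence bookkeeping: each view gets its own frame sequence that can be post-processed independently before quilt assembly. The directory/file naming pattern below is hypothetical, and 45 is simply the common Looking Glass view count.

```python
def view_sequence_paths(num_views=45, num_frames=24,
                        pattern="view{v:02d}/frame{f:04d}.png"):
    """Enumerate the image-sequence paths for every view of a scene.

    Returns one list of frame paths per view, so each view can be
    post-processed as its own sequence and later reassembled into
    quilts frame by frame.
    """
    return [[pattern.format(v=v, f=f) for f in range(num_frames)]
            for v in range(num_views)]
```

Any naming scheme works, as long as the view order stays consistent when the quilts are rebuilt.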

Using custom renderers is also already possible, thanks to a customizable render command line. Here’s a piece by Ash, a 3D artist we were recently lucky enough to cross paths with, using the Looking Glass tools together with the Arnold renderer:


Coming up next: a real-time viewport working in the Looking Glass! Once that’s complete, you’ll be able to preview an animation directly in the Looking Glass as you work on it. We’ll post more about the live viewport from Maya once it is fully operational.


Public download link coming soon :slight_smile:


Hello, I can’t find any way to play static images or videos like the renders you show. Can you advise how I can pre-render material for my Looking Glass?

Thanks !

Matt Hermans
Electric Lens Co.


Hi @Renbry!

Yes, I happen to be working on both of these things. Neither the viewer nor the Maya SDK is in a 100% finished state, but both are usable.

Both are, at the moment, still unreleased.

We are currently working through legal/licensing issues before releasing them. It appears some changes need to be made to the software before it is legally compliant and we can release it.
Please stay tuned.


Hi Dez, thanks for the reply!

It looks like I can bypass the Unity Capture code and feed it a pre-rendered texture via AVPro Video. Have you got any experience with that?


Matt Hermans


Yes, but you still need the video in ‘quilt’ format, which is all the views together like this:

The tools I am working on should help a lot with that.
Issues that the tools help you overcome include:

  1. Setting up the cameras correctly so that, across all the views, you still have control of what is in the main focal plane (the plane where all the camera views cross).
  2. Applying the proper ‘film offset’ on the cameras: the cameras must all be parallel.
  3. Managing all the rendered views from all the frames; the tools can create the quilts for you.

There are ways to do this yourself, of course. It’s just a bit of work, is all.
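For anyone assembling quilts by hand, here’s a minimal sketch of the tile arithmetic, assuming a common quilt layout of a 5×9 grid of 45 views, ordered from the bottom-left, left-to-right and then bottom-to-top. The grid size, resolution, and ordering are assumptions to verify against your own display’s quilt settings.

```python
def quilt_tile_rect(view_index, cols=5, rows=9,
                    quilt_w=4096, quilt_h=4096):
    """Pixel rectangle (x, y, w, h) of one view inside the quilt.

    Assumes view 0 sits at the bottom-left tile, with views counting
    left-to-right and then bottom-to-top. The returned y is measured
    from a top-left image origin.
    """
    tile_w = quilt_w // cols
    tile_h = quilt_h // rows
    col = view_index % cols
    row = view_index // cols              # row 0 is the bottom row
    x = col * tile_w
    y = quilt_h - (row + 1) * tile_h      # convert to top-left origin
    return x, y, tile_w, tile_h
```

With those rectangles, pasting each rendered view into the quilt is a simple per-frame copy loop in whatever image library you prefer.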



Hi Dez,
Yep, I found that out.

Do you know if it wants a parallel rig or a converged-type camera rig?
I’m synthesising the views (in Houdini/3D rendering), but I can build either a horizontal array or a converged rig.

I’ll give them a go now.


The cameras must be parallel.
Use the horizontal film offset to create the different views.
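A minimal sketch of that film-offset arithmetic, derived from off-axis projection geometry rather than taken from the actual tools: for a camera shifted sideways from the rig centre, the image must be shifted back on the film back so the focal plane stays centred. The default FOV, the 1.417-inch film back (Maya’s default horizontal aperture), and the sign convention are all assumptions to check against your rig.

```python
import math

def film_offset_inches(cam_x, focal_dist, fov_h_deg=40.0,
                       h_aperture_in=1.417):
    """Horizontal film offset (inches of film back) that re-centres
    the focal plane for a parallel camera shifted cam_x scene units
    sideways from the rig centre.

    focal_dist is the distance from the camera line to the focal
    plane, in the same scene units as cam_x.
    """
    # Half-width of the view frustum where it crosses the focal plane.
    half_width = focal_dist * math.tan(math.radians(fov_h_deg) / 2.0)
    # Fraction of the half-frame the image must shift to re-centre.
    shift = cam_x / half_width
    # Convert that fraction into inches of film back.
    return shift * h_aperture_in / 2.0
```

Note the offset scales linearly with the camera’s lateral position and falls off with focal-plane distance, which matches the intuition that distant focal planes need smaller corrections.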


Hmm, AVPro only seems to write into a Material, and we need direct access to a Texture, so I’ll have to figure out the code to hack this in.


Got it working with a custom script that links the AVPro video texture into HoloPlay Capture. Works great! Except the video is upside down; but that’s not a big deal now that I have video playback!
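In Unity the usual fix for an upside-down frame is to flip the texture’s UVs or the material’s y-scale. Purely to illustrate what that flip does to the pixel data, here is a pure-Python sketch (the function name is hypothetical) that reverses the rows of a flat, row-major pixel buffer:

```python
def flip_vertical(pixels, width, height):
    """Reverse the row order of a flat, row-major pixel list,
    turning an upside-down frame right side up."""
    assert len(pixels) == width * height
    flipped = []
    for row in range(height - 1, -1, -1):   # walk rows bottom-to-top
        flipped.extend(pixels[row * width:(row + 1) * width])
    return flipped
```

In practice you would never copy pixels on the CPU for this; flipping the sampling coordinates on the GPU is free.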



If you have time, would you PM me the snippet you used to link the quilt movie to AVPro? That is actually on my task list for later!