Capturing video for the Looking Glass display


I’m eager to try out some of this myself; there’s just one step I’m missing: once you have created your “quilt” image, how are you loading it onto the Looking Glass to display as an interlaced image?



Hey Eric,
Sorry – missed your message earlier. We have a command-line tool called Quiltr (public release coming soon, PM me if you have a Looking Glass now and want to experiment) that can display quilt images, image sequences, and videos on a Looking Glass. I also wrote a basic graphical program that does the same thing in a more user-friendly way. We’ll be refining and releasing that over the next month or two, but let me know if you’re interested in beta testing an early version.



Hi Alex,

I would love to experiment with the beta version. I have a few techniques for creating 3-D image sequences that I’m eager to test on the display.



Hi there,

would love to test an early version of the command line tool too if possible.



Hi Alex,

I am also interested in experimenting with the Quiltr tool on my Looking Glass. Looking forward to trying to create some holographic visuals with Blender.



Apologies to all for the silence, but here’s a small update on my progress.

Like many others I have now received my Looking Glass display, and boy is it cool! My only regret now is that I didn’t get the large one :smile:

I’m of course way behind my original schedule with the camera build, mostly due to my underestimating the complexity of all the parameters that go into this, real life getting in the way, and other terribly bad excuses. But I have decided that I would rather enjoy the journey and try to make it as good as possible than just finish as fast as I can.

There are quite a few design decisions I’m not entirely sure about, like the number and type of camera modules, among other parameters. I’m also still not convinced that the cheap analogue AHD CCTV camera module setup is the right way to go; perhaps a bunch of Raspberry Pis with Pi Cams would be better. The quality would surely be a lot higher, but so would the price.

So, to take the guesswork out of the equation, I decided to build a programmable camera slider, just like the one Evan showed in his video in one of the Kickstarter updates, so I can actually test out the various parameters and camera modules.

My camera slider is almost done; I just need to do some simple Arduino programming, and then I can start making my own lightfield photographs and hopefully use them to make some design decisions for the video setup.


Hi! I learned stereoscopic 3D from an industry expert, and I agree that a linear array is better than an arc. You do not want vertical disparity between views, which an arc introduces. To get the same convergence as an arc, you offset the backplane instead. If you don’t have the ability to do that (as a view camera does), over-capture horizontally, then crop and offset the images: your leftmost camera uses the right-hand side of the captured image (cropping the left), while your rightmost camera uses the left-hand side (cropping the right). In-between cameras slide between these extremes. This places your object of interest at (or near) zero depth.
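To make the crop-and-offset step concrete, here’s a small sketch in Python (NumPy). The `max_shift_px` parameter (the total horizontal over-capture, in pixels) and the linear slide across the array are my assumptions for illustration, not from any specific tool:

```python
import numpy as np

def crop_and_offset(image, cam_index, num_cams, max_shift_px):
    """Crop an over-captured frame so the object of interest
    converges to (near) zero depth across a linear camera array.

    image:        H x W x C array, over-captured horizontally
    cam_index:    0 (leftmost) .. num_cams - 1 (rightmost)
    max_shift_px: total horizontal over-capture in pixels (assumed)
    """
    h, w, _ = image.shape
    out_w = w - max_shift_px
    t = cam_index / (num_cams - 1)          # 0.0 .. 1.0 across the array
    # Leftmost camera (t=0) keeps the right side of the frame;
    # rightmost camera (t=1) keeps the left side.
    left = round((1.0 - t) * max_shift_px)  # crop start slides leftward
    return image[:, left:left + out_w, :]
```

The convergence distance is set by how big `max_shift_px` is relative to the disparity of your subject: more shift pulls the convergence plane closer.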


Your picture brings back memories – that tool was from 2013.

I have to say both approaches can work; it depends on your constraints and cost.
We have used all of them in a single layout.
A comfortable, strong 3D effect is the final target.

For a real multi-camera system, I recommend the linear array – it is more stable.


I saw that a film is playing – it’s not a single image. The Looking Glass can play video? That surprises me.


Whoa, this is awesome!

Watch this space for more info on lightfield photo / quilt generation tools, we have some new stuff coming very soon.


Hey Alex,
I’ve got a Looking Glass and I’m looking to experiment with Quiltr now.


Hi Alex – I’d love to see what some of my stereo videos and stills look like on my large Looking Glass. I saw Dez wasn’t happy with the result (Stereoscopic 3D videos on the looking glass?) but I’d like to try for myself. Could you point me to your Quiltr?



Hi Alex

Would love to test Quiltr – where can I find it?



Would love to test Quiltr – where can I find it?

+1 on this


Hi all, unfortunately we’ve stopped production on Quiltr. As an alternative, we have a quilt viewing app and a lightfield photo stitching app (for now both Windows only) being released as closed betas this week. Sign up for those here!

These are both heavier to run than Quiltr was, and instead of a command-line interface they opt for a GUI-based approach. We’ll be releasing a Python-based command-line tool for photo stitching in the coming months!


We found/hacked a way to get AVPro Video to draw into the quilt texture, meaning video can be played back in the correct format. It’s hard-coded at the moment (no video selector interface), as it was made for me to pre-render complicated visuals for the Looking Glass.
I could dig it up if it would be useful to anyone?


The Quilt Viewer app that is currently in closed beta has similar functionality using AVPro – I’d be happy to send you that as well if you sign up, @Renbry!


I’m also interested in software to make this happen. I wasn’t able to locate documentation for the interlacing algorithm. I figured it could be inferred from the code in the pixel shader for the Unity camera, but if something is already available for this, that’d be nice…
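For anyone else poking at this, here’s the general shape of a lenticular interlacer sketched in Python (NumPy). The `pitch`, `tilt`, and `center` parameters are stand-ins for the real per-device calibration values that the shader reads from the display, so this won’t match an actual Looking Glass without those numbers – it just shows the idea: each R/G/B subpixel independently picks one view based on its position under the lens array.

```python
import numpy as np

def interlace(views, pitch, tilt, center):
    """Interlace a list of H x W x 3 views (ordered left to right)
    into one H x W x 3 image. pitch/tilt/center are placeholder
    calibration values, NOT real device calibration."""
    n = len(views)
    h, w, _ = views[0].shape
    out = np.empty((h, w, 3), dtype=views[0].dtype)
    for c in range(3):
        # Each subpixel is offset a third of a pixel horizontally.
        x = (np.arange(w) + c / 3.0) / w
        y = np.arange(h)[:, None] / h
        # Phase under the (tilted) lenticular, wrapped to [0, 1).
        a = (x[None, :] + y * tilt) * pitch - center
        view_idx = np.floor((a - np.floor(a)) * n).astype(int) % n
        for v in range(n):
            mask = view_idx == v
            out[..., c][mask] = views[v][..., c][mask]
    return out
```

The real shader does essentially this per fragment; the hard part is getting the calibration constants, which are specific to each unit.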

Also, does anyone know of any ML toolkits that do a reasonably good job of inferring intermediate frames? All my searching in this area just turns up photogrammetry libs, which seems like overkill here…
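Not an ML answer, but as a baseline for comparison, a plain linear cross-fade between neighbouring views is trivial (sketch below in Python/NumPy). It ghosts badly on large disparities, which is exactly where optical-flow or learned interpolation methods earn their keep:

```python
import numpy as np

def blend_midframe(a, b, t=0.5):
    """Naive intermediate frame: linear cross-fade between two
    neighbouring views. Cheap baseline only -- double edges appear
    wherever the disparity between a and b is more than ~a pixel."""
    return ((1.0 - t) * a.astype(np.float32)
            + t * b.astype(np.float32)).astype(a.dtype)
```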


Hi @alxdncn , I’d love to try that out. We do VFX/3D rendering so I have lots of ideas about doing cool lightfield render-to-videos. Thanks : )


If you sign up for the beta, you can do that now. You don’t render the interlaced lenticular image yourself. Instead, you render a quilt video and the Quilt Viewer will play it.
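In case it helps anyone assembling quilts from rendered views by hand, here’s a minimal tiling sketch in Python (NumPy). The bottom-left-first, left-to-right ordering is my assumption based on common quilt layouts – check what your viewer actually expects:

```python
import numpy as np

def make_quilt(views, cols, rows):
    """Tile cols*rows equal-size views (H x W x C each) into one
    quilt image. Assumed ordering: view 0 at bottom-left, increasing
    left-to-right, then row by row upward."""
    h, w, c = views[0].shape
    quilt = np.zeros((rows * h, cols * w, c), dtype=views[0].dtype)
    for i, v in enumerate(views):
        col = i % cols
        row = rows - 1 - (i // cols)   # bottom row holds view 0
        quilt[row * h:(row + 1) * h, col * w:(col + 1) * w] = v
    return quilt
```

For video you’d run this per frame and encode the result; the viewer then handles the interlacing for the display.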
