New Interactive Lightfield Prototype!


"But Charlie, don’t forget what happened to the man who suddenly got everything he always wanted… He lived happily ever after.” -Willy Wonka

Hi everyone!

tl;dr: 3D interaction directly with floating 3D content is actually possible without VR or AR headgear!!! GIF below! Want to buy one of these interactive lightfield dev kits NOW to experiment with? Email me at!

What if you could reach into the 3D scenes of a Volume, directly? Directly touching and transforming 3D content without need of head-mounted displays is a core tenet of the dream of the hologram to me. Now, hot damn, it’s happening! Check this out (no CGI or tricks here):

That’s a superstereoscopic (22 views that are tight enough to be viewed in glorious glasses-free 3D in a 50-degree view cone) aerial display (it’s floating over the glass surface shown above) with the views being updated dynamically off of my laptop. My finger moving the X-wing is being tracked by an infrared curtain (or, in a version we just got working this week, being tracked fully in three dimensions with the SR300 Intel Realsense camera).
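(For anyone curious how the tracked finger drives the scene: conceptually, the depth camera reports a fingertip position in its own coordinate frame, and a calibration transform carries that point into the coordinate frame of the floating image. The sketch below is just an illustration of that idea, not the actual SDK code; the function names and transform values are made up.)

```python
import numpy as np

def make_calibration(rotation, translation):
    """Build a 4x4 rigid transform from depth-camera space to the floating
    display's space (hypothetical values; a real unit would be calibrated
    per device)."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def fingertip_to_scene(fingertip_cam_xyz, cam_to_display):
    """Map a fingertip point reported by the depth camera (meters, camera
    space) into the coordinate frame of the floating 3D scene."""
    p = np.append(fingertip_cam_xyz, 1.0)      # homogeneous coordinates
    return (cam_to_display @ p)[:3]

# Example: identity rotation, camera mounted 5 cm below and 10 cm behind
# the center of the floating image plane (made-up numbers).
cam_to_display = make_calibration(np.eye(3), np.array([0.0, -0.05, -0.10]))
print(fingertip_to_scene(np.array([0.02, 0.01, 0.30]), cam_to_display))
```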

The multiple views are made with a slanted lenticular on a 2048 × 1536 px screen (a brilliantly simple trick invented by a couple of researchers at Philips 21 years ago), with content generated in real time by a new Unity SDK that Kyle of Valley Racer fame forged into existence last month. Pulling the main image plane into free space with an optical circuit, a retroreflecting array plus a few reflective polarizers and quarter-wave plate films, unlocks interaction with and around the floating 3D content. And because the 3D scene is generated in real time, that interaction can directly manipulate the scene itself.
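To make the slant-and-interleave idea a bit more concrete, here's a rough sketch of how the rendered views might be woven into a single panel image under a slanted lenticular. The pitch, slant, and view count below are placeholders rather than the actual panel parameters, and the real SDK surely does this on the GPU rather than in a Python loop.

```python
import numpy as np

def interleave_views(views, pitch=5.5, slant=0.18, center=0.0):
    """Interleave N rendered views into one panel image for a slanted
    lenticular. views: array of shape (N, H, W, 3); returns (H, W, 3).
    pitch = subpixels per lenticule, slant = horizontal shift per row,
    center = phase offset -- all illustrative numbers, not calibrated."""
    n, h, w, _ = views.shape
    out = np.zeros((h, w, 3), dtype=views.dtype)
    for y in range(h):
        for x in range(w):
            for c in range(3):                      # R, G, B subpixels
                sub = 3 * x + c                     # subpixel column index
                phase = (sub + y * slant * 3 + center) / pitch
                view = int((phase - np.floor(phase)) * n)   # owning view
                out[y, x, c] = views[view, y, x, c]
    return out
```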

This is going to sound super nerdy, but the thing I’m really stoked about is that this interface’s coordinate system is both directly interactive AND viewer independent. If I touch the back thruster of the floating X-wing for instance, someone sitting next to me sees my finger coincident with that thruster’s same coordinates in real meat-space, just like I do. We achieved that viewer-independent coordinate system in Volume by actually scattering points of light off of physical surfaces, which has the unfortunate byproduct of requiring those physical scattering surfaces (which tend to get in the way of meaty fingers). This system sidesteps that problem.

I’d actually read about advanced lightfield interfaces somewhat like this being shown around conferences and in university labs for a couple decades, but they were always cloaked in mystery – they were super expensive to make, with little content and no content creation tools, with limited or mostly no interaction – and never available for sale as a kit or system.

Boo. Time to change that! If you’re interested in getting one of these first handmade interactive lightfield dev kits, write me at We’re going to be selling the first few of these prototypes at $500 (that’s basically our cost right now + a pizza) and that’ll include a built-in Realsense SR300 for full 3D interactive fun. And yes indeed, we’ll post a GitHub link to a beta version of the Unity SDK and Realsense SDK by Kyle and Dez that handle all of the image and interaction processing here shortly.

(For the deeply curious: here’s one of the calibration scenes that we’re using on these first prototype systems. Notice how you can glance around that test cylinder and occlude the small white cube).

full disclosure: while the core fundamentals of this system were pioneered decades ago and are out of patent, Looking Glass is patenting some improvements we’ve made over the past few months that improve contrast and resolution, add good 3D interaction, increase the number of views/viewing axes and widen the view cone, and ultimately shrink the entire system down. I’m hoping this allows this class of system to finally escape the confines of the lab and venture into the wider world. On a related note re: the SDKs we’re writing for this system, we’re debating now whether it’s a good idea to open-source those tools, and we welcome debate on the subject here. If you have any questions about what we’re patenting and why (and whether that’s a good idea for the community or not), please email me or post your thoughts in the forum!

Experiments Designing Interactive Physical HolObjects

A short video we made of the earliest dev kit prototypes being shown at Maker Faire Bay Area just two weeks ago:


Friday afternoon in Brooklyn has us like :astonished:
Check out 2 new apps Oliver whipped up that are super depthy (esp IRL), interactive, & a little wacky.

…or Psychedelic MeatBallz (name TBD)


Well, what do you think?!


These are so cool! Can’t wait to check out the system in a couple weeks.


We had Toshi and Liza from IFTF at the Hong Kong lab yesterday, which also happened to be the day we got the RealSense interaction working with the latest system. Credit goes to visual wizard Oliver for the app itself (still unnamed, though the working title is “You’ve Made a Sombrero!”)

Doesn’t this remind you a little of Dr Seuss’s Oh, the Places You’ll Go!?

Here’s another quick pan!


Hello! Ben here. I’m sharing GIFs of apps and games in development for LG technology, which were painstakingly made, on the artistic front as well as in general. Each GIF shows, side by side, a phone capture of the HoloPlayer in the best lighting available and the way the game appears (for the most part) on a PC screen during development.

NOTE: I am splitting this across multiple posts since the forum says new users can only post one image at a time.

“Marshall’s Theory”, A Survival Horror Game, 3D Visuals Sculpted, Rigged, Animated, Coded


“Prayer Cog”, A 2.5-D Platformer Sketch, 3D Visuals Sculpted, Rigged, Animated, Coded


“HoloDancer”, A Visual Showcase for HoloPlayer One, 3D Visuals Sculpted, Rigged, Animated, Coded

These works, and additional ones beyond, will continue through hardware development, kinks getting sorted out, show/meetup/festival appearances, and/or other hi-jinks.



Just saw these sitting in a pile under a bench in the lab today – a small sample of the dozens of physical prototypes made over the past few years leading the way to this new dev kit. All hail the power of the laser cutter!


Some experiments with filming this crazy thing. Added an etched grid pattern on top of some drawings made with the 3D painting app by @oliver.


I loved TiltBrush, but I love @oliver’s glorious 3D Draw app more. I lost an hour playing around with it last week.


HoloBrush + Nintendo Switch Joy-Con hack?


But maybe you gotta see it to believe it so come find us through the rabbit hole this weekend at MakerFaire NY! Will update with exact location after we move in tomorrow :slight_smile:


Hi, we are interested in your products. Can you tell us about the patents on this light field product, or where we can see a paper about it?


Hi Steve,

HoloPlayer One and related interactive lightfield displays made in the Looking Glass lab have a number of patents granted and pending. That said, we’re very open to anyone else chasing the dream of non-headgear holograms coming by our labs in Brooklyn or Hong Kong. So, send me a note and come on by!

Also should say, this isn’t just R&D - we make and ship these things. The first run of HoloPlayer One dev kits are available now for a few more days at

I’m curious about what applications you have in mind. Send me a note or post here, and I can let you know more relevant info re: the Holoplayer itself, or related technologies that are in the works!


OK, thank you sir. One more question: are the basic optics of the light field display in your products based on multiple focal planes, or on multiple views (like super multi-view)?


HoloPlayer One is based on a 32-view superstereoscopic display we make in the Looking Glass lab (using a custom lenticular or fly-eye lens laminated onto a 2K or 4K LCD or OLED screen), then re-imaged (aka made to float) above the device using a retroreflector + beamsplitter optical circuit, and then overlaid with an interaction layer using the Realsense SR300. All tied together by the HoloplaySDK.
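(To illustrate the multi-view approach described above: each view is essentially a render of the same scene from a slightly different horizontal position across the view cone, and those renders are then interleaved behind the lens and re-imaged. Below is a rough sketch of how the camera positions for such a sweep could be laid out; the view count and cone width are taken from the posts above, the focal distance is a made-up number, and the render call itself is omitted.)

```python
import numpy as np

NUM_VIEWS = 32        # views across the cone (from the answer above)
CONE_DEG = 50.0       # total horizontal view cone quoted for the prototype
FOCAL_DIST = 0.35     # camera rig to focal plane, meters (illustrative only)

def view_angles(num_views=NUM_VIEWS, cone_deg=CONE_DEG):
    """Evenly spaced horizontal angles spanning the view cone."""
    half = cone_deg / 2.0
    return np.linspace(-half, half, num_views)

def camera_offsets(angles_deg, focal_dist=FOCAL_DIST):
    """Horizontal camera offsets for a sheared-frustum sweep: each view is
    rendered from a camera slid sideways but still converging on the same
    focal plane, so all views agree on where the floating image sits."""
    return focal_dist * np.tan(np.radians(angles_deg))

offsets = camera_offsets(view_angles())
# Each offset would drive one render pass (e.g. in the Unity SDK), and the
# resulting images are then interleaved behind the lenticular or fly-eye lens.
print(offsets[:4], "...", offsets[-1])
```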