Tool for Making Holograms from Video Pans (that work on a Raspberry Pi!)


Hi folks,

I just got my Looking Glass yesterday and was looking to make a holographic photo album to put in our entryway. Naturally, I needed to capture holograms before I could show them, and I’d like to drive the display off a Raspberry Pi Zero tucked behind my Looking Glass. So that poses two questions:

  1. Can I make a tool to create a “Quilt” from a video pan I captured on a phone (or other rig)?
  2. Can I precompute what the fancy shaders would otherwise do and save that as an image so the Raspberry Pi doesn’t need to do any work other than show a PNG?

Well, the answer to both is yes! May I present my janky, hacked-together “oven” for baking video pans into precomputed holograms:

Here’s a demo created from walking across my living room:


Whoops, forgot the video, and the forum isn’t letting me edit things because “the body is too similar” :neutral_face:



(Also, I’ve yet to actually load these PNGs onto a RPi, but that’ll come tomorrow)


Sweet! Nice work! It will be interesting to see if the Pi Zero can drive it. There was some discussion about whether it could (one comment said it could, but only with the shaders disabled, which doesn’t matter for your use case!). Please update with your results.


A Pi Zero (W) can indeed drive it! Setup:

  • Raspbian Stretch Lite (no desktop)
  • Use the config.txt settings from here
  • Set up wpa_supplicant.conf for headless administration.
  • Turn things on, SSH in (ssh pi@raspberrypi if you have only one RPi on your network).
  • Unplug HDMI and plug it back in (I needed this for things to detect properly).
  • Get the fbi utility to load an image directly to the framebuffer via sudo apt-get update && sudo apt-get install fbi
  • Run sudo fbi -T 1 [baked_image].png and wait a few seconds.
  • Enjoy your hologram!

Next up – when I get to it – will likely be some sort of python script to cycle through a few images in the framebuffer and perhaps use the controls on the bottom to shift through things.
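In case it helps anyone, here's roughly what I have in mind for that script: a thin wrapper that rotates the baked PNGs through fbi. The directory layout and delay are placeholders, and the fbi invocation is the same one from the steps above (plus `-noverbose` to hide the status bar):

```python
import itertools
import subprocess
import time
from pathlib import Path

def find_quilts(directory):
    """Return the baked hologram PNGs in a directory, in name order."""
    return sorted(Path(directory).glob("*.png"))

def show(path):
    """Push one image straight to the framebuffer, same as the manual step."""
    subprocess.run(
        ["sudo", "fbi", "-T", "1", "-noverbose", str(path)],
        check=True,
    )

def slideshow(directory, delay=10):
    """Loop over every baked PNG forever, holding each for `delay` seconds."""
    for path in itertools.cycle(find_quilts(directory)):
        show(path)
        time.sleep(delay)
```

(fbi can also cycle through multiple images on its own with its `-t` timeout flag, so a script like this only really earns its keep once button handling gets added.)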

Things seem to run reasonably cool as well; the output of /opt/vc/bin/vcgencmd measure_temp sits around 50 °C.
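For anyone who wants to script temperature checks, vcgencmd prints a single line like `temp=50.1'C`, so a tiny parser is enough (function names here are my own):

```python
import re
import subprocess

def parse_temp(vcgencmd_output):
    """Pull the Celsius reading out of `vcgencmd measure_temp` output,
    which looks like "temp=50.1'C"."""
    match = re.search(r"temp=([0-9.]+)", vcgencmd_output)
    if match is None:
        raise ValueError("unexpected vcgencmd output: %r" % vcgencmd_output)
    return float(match.group(1))

def read_soc_temp():
    """Ask the firmware for the SoC temperature (Raspberry Pi only)."""
    out = subprocess.check_output(
        ["/opt/vc/bin/vcgencmd", "measure_temp"],
        universal_newlines=True,
    )
    return parse_temp(out)
```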


Very nice, and thanks for the details! Are you using the small or large LG display? Is it true the GPU is disabled at resolutions above HD? It would be nice to play videos and have the pixel shader run to turn device-independent quilts into device-specific bitmaps.


I’m using a small LG. It’s not super obvious to me if GPU acceleration is enabled or disabled (particularly since I’m not running X at the moment). Now that I’ve built a Linux holoplay.js client, I suppose I should try booting a full desktop and seeing if that stuff works.


Alright, tried with a full Raspbian Desktop.

Good news: WebGL works! Albeit unbearably slowly


Bad news: I tried a Holoplay.js demo with my Linux Holoplay.js server (which works), and things would work, except the Holoplay.js SDK expects to fetch its config data within 800 ms, and the Pi takes much longer than that. I could fork a modified version of Holoplay.js without the timeout, but I’ve done enough fiddling for today :slight_smile:

My intuition says that if we did straight EGL we could probably render a basic scene with the appropriate shaders at a modest framerate.