Creating Looking Glass video with Triaxes Depthgate, then StereoPhoto Maker




My Avengers Endgame 3d FULL trailer 1 Russian The Looking Glass Hologram

Used Triaxes Depthgate with HSBS input to produce 2D + depth map output, then converted the 2D+Z image sequence to a 45-view quilt image sequence using StereoPhoto Maker.
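For anyone curious what that 2D+Z → quilt step involves, here is a rough sketch of depth-image-based rendering in Python: each of the 45 views shifts pixels horizontally in proportion to depth, and the views are tiled into a 9x5 quilt. The shift formula and the deviation scaling are my assumptions, not StereoPhoto Maker's actual code.

```python
# Hedged sketch: build a 45-view quilt from a 2D frame plus a depth map.
# Roughly what a 2D+Z -> quilt converter does; not StereoPhoto Maker's code.
import numpy as np

def make_quilt(image, depth, cols=9, rows=5, deviation=200):
    """image: (H, W, 3) uint8, depth: (H, W) float in [0, 1].
    Returns a (rows*H, cols*W, 3) quilt, views tiled left-to-right,
    bottom-to-top (the usual Looking Glass quilt order)."""
    h, w, _ = image.shape
    n = cols * rows
    quilt = np.zeros((rows * h, cols * w, 3), dtype=image.dtype)
    xs = np.arange(w)
    for view in range(n):
        # Per-view horizontal parallax: -deviation/2 .. +deviation/2 pixels.
        shift = deviation * (view / (n - 1) - 0.5)
        out = np.empty_like(image)
        for y in range(h):
            # Shift each pixel by depth-weighted parallax (nearest-neighbour,
            # no hole filling -- a real converter would inpaint the gaps).
            src = np.clip((xs + shift * depth[y]).astype(int), 0, w - 1)
            out[y] = image[y, src]
        # Quilt tiles count from the bottom-left corner.
        row, col = divmod(view, cols)
        y0 = (rows - 1 - row) * h
        quilt[y0:y0 + h, col * w:(col + 1) * w] = out
    return quilt
```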



Is the deviation too high? I can redo it with a lower deviation.


My James Cameron discusses Avatar The Looking Glass hologram
My Endgame sample above used the maximum deviation of 300 in StereoPhoto Maker; this one uses a deviation of 200.


Another test: my Avengers Endgame 3D Trailer 2 The Looking Glass hologram
The HSBS trailer was converted to a 2D+Z depth map by Triaxes Depthgate, then the 2D+Z image sequence was converted to a 45-view 9x5 quilt image sequence by StereoPhoto Maker using a Looking Glass deviation of 200.


Triaxes Depthgate support told me they’ll discuss with their Looking Glass software development team whether to modify Depthgate so it can convert side-by-side 3D video directly into quilt videos in the future, at better quality than going through a depth map.


There is currently a serious bug in all Looking Glass quilt video players

I converted an IMAX trailer from Depthgate to a 45-tile quilt and found that the Looking Glass quilt video players don’t like the first second of a video being a totally black screen: for some reason the player then assumes it’s not a quilt video. Maybe you can make your own Looking Glass player, or help them improve theirs?
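A minimal sketch of the fix being suggested, assuming the player classifies a video by inspecting early frames: probe several frames across the timeline and skip near-black ones before running the quilt heuristic. `pick_probe_frames` and the brightness threshold are hypothetical, not any player's real code.

```python
# Hedged sketch: skip near-black frames before quilt detection, so an
# all-black opening second can't fool the player into rejecting a quilt.
import numpy as np

def pick_probe_frames(frames, wanted=3, black_thresh=8.0):
    """frames: iterable of (H, W, 3) uint8 arrays.
    Returns up to `wanted` frames whose mean brightness exceeds the
    threshold; the quilt heuristic would then run on these instead of
    blindly using the first frame."""
    picked = []
    for f in frames:
        if f.mean() > black_thresh:
            picked.append(f)
            if len(picked) == wanted:
                break
    return picked
```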


Hmm, that’s certainly an issue. We have a few solutions for video playing. One is our Quilt Viewing app, which you can receive by signing up for our closed beta release. It’s still in development so we don’t want to make it fully public yet, but we’ll send it your way if you fill out the form!

The other option is a really wonderful tool put together by Masuji from our community. You can check that out here!

Please let us know if you have any issues with these solutions!


I tried Masuji’s player; it isn’t playing my long quilt trailer videos, and it played one of them incorrectly.


Yes it seems to be assuming that a 4K quilt will always be 5x9 in terms of views and 2K will always be 4x8. One of yours breaks that convention, which is totally fine but that player can’t account for it. The Quilt Viewer will handle it fine though, we’re sending out that build tomorrow if you signed up on the form!
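That convention can be expressed as a simple lookup. A sketch, assuming square 4K/2K quilt resolutions; the exact pixel sizes and the (columns, rows) ordering are my assumptions (the thread itself writes both "5x9" and "9x5"), not the community player's real code.

```python
# Hedged sketch of the convention described above: the player assumes a
# tile layout purely from video resolution, with no metadata to override it.
def guess_quilt_layout(width, height):
    """Map video resolution to an assumed (columns, rows) tile layout."""
    convention = {
        (4096, 4096): (5, 9),  # "4K" quilt
        (2048, 2048): (4, 8),  # "2K" quilt
    }
    layout = convention.get((width, height))
    if layout is None:
        raise ValueError(f"no known quilt convention for {width}x{height}")
    return layout
```

A quilt that breaks the convention (like one of the videos above) would need explicit layout metadata, which is what the Quilt Viewer presumably handles.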


My Star Wars The Rise of Skywalker (2019) Eng Dub 3d trailer Looking Glass Hologram Quilt 9x5


I don’t understand why everybody posts a “spritesheet-video” instead of the final processed video.

It’s much better for size (in bytes), quality (you can provide the best quality possible), and performance (the client just needs a video player; no huge computation is involved), and it can be played directly in a video player.


If I’m understanding you correctly and you’re referring to a video that already has the subpixel scattering algorithm applied, that wouldn’t be visible on other people’s Looking Glasses. Each display has slight variations that make big differences in the end visuals, so trying to play a video processed for someone else’s Looking Glass will not work.


What @alxdncn said – also, even if calibration were not an issue, the raw pixel values would not compress well with H.264 at all, and any compression artifacts would look terrible on the display. The quilt is much better in both regards (until such time as a lightfield standard appears that makes things better).


Hello!

@Dithermaster: your explanation is relevant, but I wanted to be sure about it, so I produced some videos.
I first tried “lossless” quality; 11 seconds of video produced a 2 GB file: it works perfectly.

Then I tried “high quality”; 11 seconds of video produced a 90 MB file: it works perfectly (a little bit blurry, but it still looks great).

You can try it here (it needs to be in fullscreen) :

Then I tried “normal quality”; 11 seconds of video produced a 36 MB file: the image is blurry, but the hologram works perfectly.

You can try it here :

(I used my TypeScript implementation of Holoplay.js, and my shader is a bit different from yours, but I suppose it also works with your shader.)
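For context, the core of a Holoplay-style quilt shader is just a tile lookup: map a view index plus per-view texture coordinates to coordinates in the quilt texture. Here is that step transcribed to Python for illustration; the bottom-to-top tile order is the common quilt convention, and actual shaders do this in GLSL, so treat this as a sketch rather than anyone's real shader.

```python
# Hedged sketch of the quilt-sampling step a Holoplay-style shader performs.
def quilt_uv(view, u, v, cols=9, rows=5):
    """Map per-view (u, v) in [0,1] to quilt-texture (u, v) in [0,1].
    Tiles run left-to-right, bottom-to-top: view 0 is the bottom-left tile."""
    col = view % cols
    row = view // cols          # row 0 is the bottom row of the quilt
    qu = (col + u) / cols
    qv = (row + v) / rows
    return qu, qv
```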

(I used OBS Studio, free software that can capture video of a webpage.)

EDIT: could some people test these videos and tell me whether they work for you, please?

Is it possible to submit a video instead of a demo for “the library”?

“The quilt is much better in both regards”

Not if you want to show it to your friend with a regular laptop (without a powerful graphics card).

I read in another post that Looking Glass Factory should release a “case” that plugs in behind the LG. The case could contain just a hard drive, a video player, and a very cheap configuration.

I think the fact that it’s playable as a classic video file is a key feature of the LG; it makes things much easier to use, and it costs much less to host a 2560x1600 video on a web server than an 8192x8192 one. The first will run smoothly at 60 fps on almost every computer, while the big video may not work very well on some machines…
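The bandwidth argument checks out on raw pixel counts alone (the two resolutions are taken from the post above):

```python
# Pixel-count comparison: pre-processed native frame vs. large quilt frame.
small = 2560 * 1600          # native pre-processed frame: 4,096,000 pixels
large = 8192 * 8192          # 8K x 8K quilt frame: 67,108,864 pixels
ratio = large / small        # roughly 16x more pixels to decode per frame
```

Of course, as noted earlier in the thread, the pre-processed frame is tied to one display's calibration and compresses poorly, so the comparison is about playback cost, not portability.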

EDIT: next weekend, I will create a proof of concept with a kind of 2D video game where every image is a PNG containing the holographic image. Can you imagine how many resources I would save by doing that? Everything becomes possible, because the LG converts a 2D picture into a 3D picture, so every 2D game could be updated to 3D just by updating the textures, and that’s all…

This solution is infinitely more flexible than the quilt, in my opinion.

@GilesGoat : it may interest you :wink:


Well, I made some tests and I’m a bit disappointed…

It works almost well with static images (I’m speaking about a composition made from different processed images; it’s perfect with a single one).

You can try it here (click to go fullscreen)

But the results are funny and unexpected when you try to move it (click to go fullscreen)

It may be useful in some particular cases, but it’s not as easy to use as I thought… :’(

EDIT: I should mention that I modified the viewCone in order to get a very wide parallax for each image (the viewCone is not the same for each image)


Nice. For that specific use case (single device playback, no high-end GPU) it makes sense, as long as the codec doesn’t break things. PNG sequences would be perfect.


@Dithermaster: how do you explain that my “holographic-particles PNGs” are read as a movie if I try to move them?

It’s very weird, because even if I set the x or y step to 0.01 pixels per frame, it still “considers” my PNG a movie; it just “rotates” much more slowly.

EDIT: concerning scale, I observed that I can take a static image and rescale it without deformation only if the scale is a “simple ratio”, for example

1/2, 1/3, 1/4, 1/5, 1/6, 1/7, 1/8

These work almost great (not “perfectly perfect”, but almost perfect), but it doesn’t work with other values.
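A small helper to express that observation, assuming “simple ratio” means a unit fraction 1/n with n up to 8; why only those scales survive (presumably something about the subpixel pattern the display relies on) is my guess, not established in the thread.

```python
# Hedged helper: test whether a scale factor is one of the "simple ratios"
# (1/1 .. 1/8) that the post reports as rescaling without deformation.
from fractions import Fraction

def is_clean_scale(scale, max_denominator=8, tol=1e-9):
    """True if `scale` is (within tolerance) 1/n for n = 1..max_denominator."""
    f = Fraction(scale).limit_denominator(max_denominator)
    return f.numerator == 1 and abs(float(f) - scale) < tol
```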