What makes it 'switch' into 3D mode?


This is something I don’t quite understand …

The display is a 2560 × 1600 panel … ‘as far as I can see’ it’s just a display …

However, even if I create ‘a quilt’ of my own and send it to it … it just shows it ‘as a quilt’, no 3D effect whatsoever.

Does it need some ‘special USB command’ to be told ‘activate 3D mode’?



Just showing a quilt on it won’t work. The image has to be processed to get the 3D effect. That’s done by using an app (like Lightfield Photo or Quilt Viewer), Unity with the HoloPlay SDK, or three.js.


I think I got that by looking at what the output of the Holoplay is on a standard display.

Look, I’m not pulling your leg here, but I get the impression that, for a reason I still don’t know, HoloPlayAPI.dll fails precisely in the loading/creation of the shaders if I call these functions separately, one by one:


When I call hp_loadLightfieldShaders() I get an error with a NULL pointer (0x0000000000000000); it’s as if, for whatever reason, it’s not loading the shaders.

I tried to debug the DLL with no success. I got to the point where I can see the shaders you are using, and I was wondering if I could ‘extrapolate’ (take them out) and recompile them separately in my environment to see if I can make them work.

I’m using Windows 10 64-bit and the correct libs/DLLs, but so far I’m getting nowhere: void hp_initialize() seems ‘not to do much’, everything else seems not to do anything at all, and there’s no way to know if an error occurred.

I repeat: it needs some basic, “all worked out” C++ sample source code that one can compile and see working; as it is, for whatever reason, I can’t make it work yet.


Just to let you know, in despair I am studying your .JS code, because that’s the only thing I can see that “does something in some way”, and it’s at least somewhat comprehensible.

I’ll see if I can make a working .CPP code from that.


Hey there! Sorry we weren’t clear enough in the pinned forum post, but those “Quilts” that you see being shared are actually viewable in one of our closed beta Quilt Viewer applications. You can sign up for that closed beta here, and someone from our dev team will get you a beta build within 1-2 days. If not, leave a message here and we’ll get it sent over!


Maybe I have not been very clear: I am integrating your Looking Glass (we actually own two now) into our “Neon engine”, which is our proprietary C++-based game engine that we’ve been using for the last 10 years or so.

I do not want/need an app/tool to view quilts; I have to be able to generate my own quilts and send them to the Looking Glass device. Or, to put it another way, I need to be able to write my own “Quilt Viewer”.

Now, my understanding after various analyses of the code is that the quilt is a sort of ‘intermediate representation’ you use to feed a certain fragment shader, which you also fill with some “calibration parameters” that you somehow (via TCP/IP on localhost port 11222??) attempt to read from the Looking Glass, or, failing that, fall back to some “factory-defined standards”.
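Just so we’re talking about the same thing, here’s how I currently map a view index into the quilt in my own test code. I’m assuming a grid of cols × rows tiles stored left-to-right, bottom-to-top (e.g. 45 views in a 5 × 9 grid); that layout is my guess, so correct me if the tile order is different:

```cpp
#include <cassert>  // for the self-checks below
#include <cmath>

// UV coordinates into the full quilt texture.
struct QuiltUV { float u, v; };

// Map a view index plus an in-tile (u,v) into quilt-texture UVs.
// Assumed layout: tiles run left-to-right, bottom-to-top (view 0 = bottom-left).
QuiltUV quiltLookup(int view, int cols, int rows, float u, float v) {
    int col = view % cols;
    int row = view / cols;        // row 0 is the bottom strip of the quilt
    QuiltUV out;
    out.u = (col + u) / cols;     // shift into the tile, then normalize
    out.v = (row + v) / rows;
    return out;
}
```

If the tile ordering or the grid dimensions are wrong on my side, that alone would explain a scrambled image even with a correct shader.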

What I would need is more like “a complete C source of your Quilt Viewer” and/or C++ sample code that shows precisely the steps needed; your documentation at the moment is VERY MINIMAL and not sufficient, IMHO, to make your .DLL API work.

It does not say, for example, “in what format/resolution it wants the textures”; it omits quite a few critical pieces of information needed to write working code.

That said, it “sounds a bit odd to me” that “you have a DLL but you don’t (yet) have sample C++ code to go with it”.

Now the thing is, to make it all work I am literally reverse engineering your Looking Glass, simply in order to say “OK, let’s see what it takes to make it work”. That is “bad”, because I really expect all this to come from you, not from me trying to work out “what you have done” by myself.

Unfortunately, at the moment there is neither enough documentation nor a single workable sample, so “I am going with what I have”, hoping that we’ll soon see better documentation coming out.

That said, given that SOMEONE must have written a working viewer, SOMEONE must know/have the info to make it work, so I hope you are going to share this with developers; at the very least, let me know if you have any NDA in mind.

I don’t want “new tools”, I just want the knowledge “to make my own tools”. This “Looking Glass” I am going to integrate into “a bit of a bigger world”, and I know “it should not be talking too much” for it to work properly.

So if you tell me “there’s somewhere I can join to get more info about all this”, I’d be happy to, but what I need is information/sources/working code samples, not a new “black box tool” to play with; I don’t need that.



Correct, quilts are just a device-independent multi-view representation. The actual pixels rendered require the device calibration values and a full-screen pixel shader that maps RGB subpixels to a view. I’ve written one in C++, and later discovered there is a working shader right in the SDK. Check it out. The calibration parameters are required; there is no “factory default” as every device is slightly different. The calibration values are read over USB HID, but I understand Looking Glass has a TCP localhost server that you can use to get them.
Your best bet is to study the SDK or better yet get the beta OpenGL SDK and you’ll learn everything you need to render native images for the display (that’s what my stand-alone C++ program does, although not at full framerate). Note that the two displays are different resolutions, but the shader can be resolution independent.
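To give a concrete idea of what “maps RGB subpixels to a view” means in practice, here is a rough C++ sketch of the kind of per-subpixel selection such a shader performs. The names pitch, tilt, and center are my own labels for the calibration values, and the exact formula in the factory shader may differ; treat this as an illustration, not the SDK’s math:

```cpp
#include <cassert>  // for the self-checks below
#include <cmath>

// Per-subpixel view selection of the kind a lenticular shader performs.
// pitch, tilt and center stand in for the per-device calibration values
// (names and formula are illustrative, not the factory shader's exact math).
int viewForSubpixel(float x, float y, int channel,
                    float pitch, float tilt, float center,
                    int numViews, int screenWidth) {
    // R, G and B subpixels sit a third of a pixel apart horizontally.
    float subpixelX = x + (channel / 3.0f) / screenWidth;
    float phase = (subpixelX + y * tilt) * pitch - center;
    phase -= std::floor(phase);               // wrap into [0, 1)
    return static_cast<int>(phase * numViews) % numViews;
}
```

A full-screen pass then runs this per subpixel, fetches that view’s tile from the quilt, and takes only the matching color channel, which is why each physical pixel ends up blending three different views.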


I tried to use the supplied DLL, but the OpenGL SDK doesn’t really have much in it (or am I missing something here?). All I have is the HoloPlayApi-0.0.5 DLL, the license, and a .h file, nothing else; there’s not a single code sample of anything, and all the docs I have start here: http://docs.lookingglassfactory.com/HoloPlayCAPI/ and there’s not much else.

It does not show you exactly how to use the DLL properly. I did try, and I kind of get what you are supposed to do, but ‘something does not work’ and I can’t figure out what, given that there’s not a single bit of code showing how to use the DLL.

What “SDK to study” are you talking about? Because there’s really not much else other than that link supplied as “documentation”; the best I can find is the .js source of HoloPlay.

How did you manage to make your own shader and such, given that there’s no info about how to do that?

Care to share some of that code/info? Or at least, can you tell me what I am missing, something to study or look at that I cannot see anywhere?

When I joined the “OpenGL Beta”, all I got was that link to the “docs” and the DLL to download, and nothing else; that’s not much.



Sorry, I was working from memory. Looking at the machine I use with the Looking Glass, I can give more detail: use the Looking Glass Unity SDK to create a project, then look in that project folder for “Lenticular.shader”. You’ll see how RGB subpixels get turned into a view number and looked up in the quilt. After I had reverse-engineered it myself, I was happy to see the factory shader was remarkably close to my own. The harder part will be getting the calibration data. I reverse-engineered mine, but the correct way is to get it from the device, directly or through a localhost server (I don’t know much about either).


You see, that’s the point: we are both reverse engineering things for lack of information from Looking Glass.

That should not be the case, given this is a “developer’s program”; this information should be supplied so we can work things out by ourselves. If they really want to support “OpenGL developers”, and not just Unity, they should be a bit more forthcoming with information and support.

I am used to working with companies where “you would have signed an NDA/something and been given proper info”; this so far seems not to be the case, and I can’t say I am very happy about it.

I can understand that maybe Looking Glass does not want to make some information public, but this is a “developer’s program”; we are not here to steal the Looking Glass tech, we are here to make it work and create content for it.

I would expect “a bit more support” from Looking Glass, rather than having to figure things out on my own “because someone is not supplying proper info yet”; this is “not so good”, if I may say so.

I understand that, at the end of the day, “it’s not rocket science”, BUT I can’t see why I can’t properly be supplied the info I need. I am not very happy so far, to be honest, but I’ll keep working on it because I really love this device and I can see a lot of potential in it.

But I do hope someone at Looking Glass will start giving a bit more consideration to the people who actually write their own code and just want to make things work properly; I have to say “I want a bit more from them”.

Anyway, thanks for your help. I may try that, but again, “this is not the best way of doing it”; they could have included a bit more explanation, and the shader listing as well.


To be fair, I reverse-engineered it for fun, not out of necessity. I wanted to understand how the optics worked, so I did that by writing code until I had a lightfield image. Looking Glass is working on SDKs and integrations. The Unity SDK is out, there’s a Three.js library, and the OpenGL SDK and quilt viewer are in beta. You’re asking for “a bit more support” after posting on a Friday and getting some lively discussion from users. Give LG a chance to respond.


Sure, I did not say NOW, but I am saying it “needs to be a bit more supported”. The fact is, “I get a bit tetchy” when I get the feeling that “Unity is always the thing that gets it first” (and maybe even last), as if “it’s the only thing that needs to be supported”. Anyway, this stuff has been out for some months; I can wait longer, I am not in a super hurry. What I am saying is that I just hope they won’t forget “there is not only Unity in the world”, even if, to be fair, I am not sure how many “non-Unity users” there are.
It gets a bit on my nerves that they put out this “OpenGL thing” that is not sufficiently explained; at this point I would have preferred something “later” but “better presented”, because as it is, it’s “guesswork, hit and miss”. To be honest, I can’t make any use of the Unity thing except, as you did, to study it and work something out; that’s the only use I can possibly make of it. Likewise, that Three.js stuff I can only use as a way to figure something out; it’s of no other practical use to me.
Anyway, I’m looking forward to what LG will reply; meanwhile I’ll try to work something out on my own and/or, for the moment, move on to something else.


Hi Giles -

Totally understand (and share, tbh) your frustration. Most of us in the lab are coming from a Unity-based graphics programming background, and that’s part of the reason why we’re running the C API as a closed beta program: to gather feedback on what is necessary to fully support professional users who want to do 3D development at a lower level.
We’ve started dogfooding the C API and have an example scene that’s all “worked out”, built and set up; we will hopefully be pushing it out into the world in the next few days.
I am not sure what is causing the crash, and I totally agree that it’s not good that our users have to reverse engineer our own tools. Until we can bring the documentation of our other libraries to parity with the Unity library, here are a couple of quick bullet points that I hope will help clarify some of your technical questions:

  • The Looking Glass enumerates as an HDMI monitor and a USB joystick. The calibration file is loaded over a quasi-covert HID channel: the Unity library uses HIDAPI to read off a JSON string containing a couple of floats that correct for micron-level physical variations in display lamination.
  • In order to read USB packets from within sandboxed browser permissions, I wrote a driver that serves up the calibration over a websocket. To view the entire calibration string once the driver is installed, you can use this test page.
  • The most relevant code to poke around in if you’re interested in how the system works is Lenticular.shader (which is the actual shader code that converts the quilt image into a lenticularized display), Quilt.cs, and Capture.cs in the Unity SDK. But - our C++ example scene will be released soon and hopefully that will help to clarify and debug your issue, both on our end and yours.
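As a rough illustration of consuming that calibration JSON string from C++ once you have it, something like the sketch below works for a flat string. The field names here are placeholders, not our actual schema, and if the real string turns out to be nested you should reach for a proper JSON parser instead:

```cpp
#include <cassert>  // for the self-checks below
#include <cmath>
#include <cstdlib>
#include <string>

// Pull one numeric field out of a flat JSON-ish calibration string.
// NOTE: field names and layout are guesses for illustration only; a
// nested calibration string needs a real JSON parser, not this.
bool extractField(const std::string& json, const std::string& key, float& out) {
    std::size_t k = json.find("\"" + key + "\"");
    if (k == std::string::npos) return false;          // key not present
    std::size_t colon = json.find(':', k);
    if (colon == std::string::npos) return false;      // malformed entry
    out = std::strtof(json.c_str() + colon + 1, nullptr);
    return true;
}
```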

Again, thanks for participating in our beta program and I’m sorry that the available resources have come up short, but please know that we are listening and that your feedback is immensely helpful to us as we flesh out the features and documentation we’re making available. Feel free to ping me if you have any more questions (though I’m on a flight for the next 16 hours) and I hope none of these frustrations stop Llamasoft games from ending up in the Looking Glass! :slight_smile:


I could say, like in that movie, “I don’t like the Looking Glass, I LOVE it!” … you are talking to the person who, sometime in the 199x’s, saw an article in some IEEE publication about a Texas Instruments thing called “voxtron” and nearly tried to hook up a spinning motor and some LEDs to a ZX Spectrum … but it did not go that far …

For the first time in my life I have on my desk a device “that does what it says on the tin, and does it well”. I have to make it work … I HAVE TO MAKE IT WORK!! Full stop! That’s it; you’ve put “that thing” in my brain: “there’s no peace until I make it work”!

I have so many ideas of use and they are just “some code and shader away”.

Anyway … I am on a flight too for the next few hours, going back home for a week; I’ll try to do something “remotely”, at least see if I can make that shader work.

Your DLL, I think, “has a number of problems”, but there’s no time to discuss them here now.

Have a safe trip, talk to you later.

Besides, I’ve almost “convinced another few people” into Looking Glass :smiley: … but again, it’s really imperative that we get this OpenGL working.

Unity is a blessing for some and a curse for others; you can never fully win with that :slight_smile: