Linux Support


#1

Will there ever be such a thing?


#2

Hi! We’re hoping to eventually roll out Linux support, but we have yet to scope out what kind of resources we would need to do so. May I ask what in particular you would like to use on Linux? The Unity SDK or the apps themselves via the Library?


#3

I’m running elementary OS Juno (elementary dot io; I’m a new user and can’t add more than two links). I’m trying to follow https://forum.unity.com/threads/unity-hub-release-candidate-0-20-1-is-now-available.546315/ and https://forum.unity.com/threads/unity-on-linux-release-notes-and-known-issues.350256/page-2#post-3625345 to get Unity up and running. I have the Hub installed and it looks like it should (I’m coming to Linux from macOS), but after I create a new project, Unity crashes. Still trying to figure out how to get it running without problems. I’m running all of this on a late-2016 Razer Blade with an NVIDIA GTX 1060.


#4

I’d be far more interested in seeing Quiltr and the support library for parsing the lightfield calibration descriptors. Something basic like hanging a Raspberry Pi off the back of the Looking Glass could be great.

AFAICT the basic information is just what angle each subpixel is diverted to in a focal plane. The tricky part comes in rendering for that sort of perspective; a common angle means you’d want a vertical perspective divide but orthographic horizontal… I’ve barely started thinking about it. This thing is neat.


#5

Hi - I’m new to this technology. I’m an artist, scientist, and programmer, but I don’t run Windows or macOS. I wouldn’t need Unity or any apps; I simply want to know whether I’ll be able to write my own programs that send signals to the device.


#6

Hi all, in terms of getting the HoloPlay SDK for Unity onto Linux, we don’t yet have plans to do this, as Unity itself isn’t particularly well supported on Linux.

However, we will be releasing a closed beta of an OpenGL DLL in January along with documentation. This could allow you to create programs outside the context of Unity. Unfortunately, we don’t have a roadmap for making this functional on Linux - it will still be Mac and Windows only. If there’s sufficient interest from the community, we will do our best to make Linux support happen!


#7

Thanks for the response! So it’s not as simple as a smart programmer putting the right pixels in the right places and sending the right codes to the device via USB? I’m familiar with off-axis frustum projections, socket programming, shader programming, etc., and it would be a shame if I have to ignore this hardware too (along with many other VR devices whose creators chose to ignore Linux creatives).


#8

It is as “simple” as that. The panel already works as a display, and the subpixels are optically rerouted; knowing where each one is rerouted to is the trick. While we could develop calibration procedures ourselves, each Looking Glass already holds calibration data. I haven’t looked at the format yet, but it’s bound to be a HID device, and we could dump the readout procedure using e.g. Wireshark and USBPcap.
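
For anyone who wants to poke at this before reaching for a USB capture, a minimal sketch (assuming the hidapi Python package, module name hid; the match strings are guesses, not confirmed device names) that just lists candidate HID interfaces:

```python
# List HID interfaces and flag anything that looks like the Looking Glass.
# The match strings below are guesses; adjust to whatever lsusb reports.
import hid

for dev in hid.enumerate():
    name = f"{dev['manufacturer_string']} {dev['product_string']}"
    if "looking" in name.lower() or "holoplay" in name.lower():
        print(f"candidate HID device: {name}")
        print(f"  vendor=0x{dev['vendor_id']:04x} product=0x{dev['product_id']:04x}")
        print(f"  path={dev['path']}")
```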


#9

Linux support would be great, please. I don’t fancy having to reverse-engineer the calibration data from scratch. XD


#10

Making slight progress. I have a Python script that reads the EEPROM calibration data. The next step is reproducing the subpixel mapping… The existing shader looks rather inefficient, mainly because it performs operations in the fragment shader (including division) that should only be done once. Also, fmod(x-floor(x),1) is normally spelled frac(x).
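
For the curious, a rough numpy sketch of the kind of mapping I mean; pitch, tilt and center are stand-ins for whatever the calibration JSON actually holds (the numbers below are placeholders, not real calibration), and the panel is assumed to be 2560x1600 with 45 views:

```python
# Rough sketch of a per-subpixel view lookup; pitch, tilt, center are
# stand-ins for the real calibration values, and frac() is the
# x - floor(x) idiom mentioned above.
import numpy as np

def frac(x):
    return x - np.floor(x)

def view_index(width, height, pitch, tilt, center, num_views):
    # three horizontal subpixels (R, G, B) per pixel
    x = (np.arange(width * 3) / (width * 3.0))[None, :]
    y = (np.arange(height) / float(height))[:, None]
    # the slanted lenticular maps each subpixel to a fractional position
    # along the view cone
    v = frac((x + y * tilt) * pitch - center)
    return np.floor(v * num_views).astype(int)

views = view_index(2560, 1600, pitch=47.6, tilt=-0.12, center=0.0, num_views=45)
print(views.shape, views.min(), views.max())
```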

Side note: Unity itself seems to be completely malfunctioning in my Windows installation. It just never displays anything (not even a window) when I try to open a project, and that’s before even adding HoloPlay. They also emailed me my password as if it were my name.


#11

Tiny updates. Today I verified that the buttons work with lsinput and input-events, and I’ve learned how to make Blender produce multiview renders (preferably in a single EXR file). I messed about with image loading libraries, and I think my next step might involve making Blender show something interesting; it should be possible to autogenerate alternate view projections using offscreen buffers and from there build a 3D viewport for the Looking Glass.
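
In case it helps anyone, the Blender side of the multiview/EXR part boils down to a few render settings. A minimal sketch, assuming cameras with matching name suffixes already exist in the scene (the view count and suffix scheme are placeholders):

```python
# Render all views into one multilayer EXR; run from Blender's Python console.
# Assumes cameras named with matching suffixes (e.g. Camera_00, Camera_01, ...).
import bpy

render = bpy.context.scene.render
render.use_multiview = True
render.views_format = 'MULTIVIEW'                            # not just stereo pairs
render.image_settings.file_format = 'OPEN_EXR_MULTILAYER'   # all views in one file

for i in range(45):                     # 45 views as an example count
    suffix = f"_{i:02d}"
    view = render.views.get(suffix) or render.views.new(suffix)
    view.camera_suffix = suffix
```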

Simple recipe to make a fullscreen viewport: Ctrl-Alt-W to make a new window, Alt-F10 to make the viewport fill that window, Alt-F11 to make the window fullscreen. I think I’ll identify the viewport by making a screen named LookingGlass.

Early experiments with making a camera array didn’t yield a convenient method to name and place them all, though I was able to add automatic lens shift by using a driver driven by the camera’s local X coordinate.
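
The driver setup is roughly the following (the camera name and the scale factor are placeholders; the factor depends on camera spacing and the focal distance):

```python
# Add a driver so the camera's horizontal lens shift follows its local X
# position; camera name and the -0.5 scale are placeholders.
import bpy

cam_obj = bpy.data.objects["Camera_00"]
fcurve = cam_obj.data.driver_add("shift_x")
driver = fcurve.driver
driver.type = 'SCRIPTED'

var = driver.variables.new()
var.name = "locx"
var.type = 'TRANSFORMS'
var.targets[0].id = cam_obj
var.targets[0].transform_type = 'LOC_X'
var.targets[0].transform_space = 'LOCAL_SPACE'

# shift the lens opposite to the camera offset so all views share a focal plane
driver.expression = "locx * -0.5"
```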

It’s possible the code at https://github.com/lonetech/LookingGlass might interest someone. It’s extremely slow, but at the moment it does demonstrate two distinct views. You’ll have to mess with it to transfer the JSON calibration data from one part to the other. These are still rough proof-of-concept hacks.

I got the gpu.offscreen sample overlay running, which does one offscreen view. Extending it to multiple views and covering the whole area should be reasonably easy. The tricky steps will be modifying the matrices for distinct views, mapping all the views for a shader to use, and writing and connecting a shader to do the subpixel mapping (the proof of concept is ridiculously slow Python at the moment).
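
For the matrix step, this is the sort of thing I have in mind: offset each camera along X across the view cone and shear its projection so the focal plane stays put. A sketch with placeholder numbers, not real calibration values:

```python
# Per-view matrices: slide the camera sideways within the view cone and shear
# the projection so the plane at focal_dist projects identically in every view.
import numpy as np

def view_matrices(num_views=45, cone_deg=40.0, focal_dist=2.0,
                  fov_deg=14.0, aspect=2560 / 1600, near=0.1, far=100.0):
    f = 1.0 / np.tan(np.radians(fov_deg) / 2.0)
    out = []
    for i in range(num_views):
        angle = np.radians((i / (num_views - 1) - 0.5) * cone_deg)
        offset = focal_dist * np.tan(angle)       # camera offset for this view

        view = np.eye(4)
        view[0, 3] = -offset                      # slide the camera along X

        proj = np.zeros((4, 4))
        proj[0, 0] = f / aspect
        proj[1, 1] = f
        proj[2, 2] = (far + near) / (near - far)
        proj[2, 3] = 2 * far * near / (near - far)
        proj[3, 2] = -1.0
        proj[0, 2] = -(f / aspect) * offset / focal_dist   # keep the focal plane fixed
        out.append((view, proj))
    return out

# The focal-plane centre lands on the same clip X (zero) in every view:
for view, proj in view_matrices(num_views=5):
    print((proj @ view @ np.array([0.0, 0.0, -2.0, 1.0]))[:2])
```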

Another project would be to use mpv’s shader support to add a quilt remapping. I’ve fetched the 5x9 quilt videos from Vimeo for testing that.


#12

I got the mpv shader basically working. It should be combined with a metadata extraction script to find the calibration for the display and the tiling for the video, and probably to scale the view angles. Most of the Vimeo videos were 5x9, the others 4x8, and I don’t know yet how to detect which. The shader might possibly work on e.g. a Raspberry Pi, but I haven’t tested it; there’s certainly nothing OS-specific in it. Source pushed to GitHub.
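
The wrapper I’m imagining would look something like this (file names, JSON keys and the placeholder tokens are all hypothetical; --glsl-shaders is the real mpv option):

```python
# Fill a shader template with this display's calibration and the video's
# tiling, then hand it to mpv. File names, JSON keys and the PITCH/TILT/...
# tokens in the template are hypothetical.
import json
import subprocess
import sys
import tempfile

calib = json.load(open("calibration.json"))      # dumped from the device
template = open("quilt_template.glsl").read()    # shader with PITCH etc. tokens

shader = (template.replace("PITCH", str(calib["pitch"]))
                  .replace("TILT", str(calib["tilt"]))
                  .replace("CENTER", str(calib["center"]))
                  .replace("TILES_X", "5")       # 5x9 quilt assumed; 4x8 also exists
                  .replace("TILES_Y", "9"))

with tempfile.NamedTemporaryFile("w", suffix=".glsl", delete=False) as f:
    f.write(shader)

subprocess.run(["mpv", f"--glsl-shaders={f.name}", sys.argv[1]])
```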


#13

I’d be interested in seeing Quiltr support on Linux. I want to take a pre-rendered scene and get it onto the Looking Glass. I’m not interested in plugins for specific third-party programs.


#14

Has anyone been able to get their Looking Glass working with Red Hat? lonetech, I tried your scripts and config data, but they seem to need Ubuntu. Currently my workstation doesn’t recognize the device.


#15

There’s nothing distribution-specific in the scripts. In particular, I haven’t tested on Ubuntu; the package names I mentioned are from Debian. As for recognizing the device, you can check with lsusb and xrandr whether it is accessible as a USB device and as a monitor, respectively.
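
Something like this works as a quick check (the match strings are guesses; adjust them to whatever your unit actually reports):

```python
# Run lsusb and xrandr and print any lines that look relevant.
import subprocess

for cmd in (["lsusb"], ["xrandr", "--query"]):
    out = subprocess.run(cmd, capture_output=True, text=True).stdout
    hits = [line for line in out.splitlines()
            if "looking" in line.lower() or " connected" in line.lower()]
    print(cmd[0], "->", hits or "nothing matching")
```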


#16

Actually Unity on Linux works great.

What I would like to see is a real-time compositor for KDE Plasma; a plug-in for Blender would be a great place to start.

What pricing would you offer the Looking Glass to someone who can develop code in a Linux environment? I have a NUC Hades Canyon running Ubuntu 18.10, driving a 4K monitor and an Apple IPS monitor that I converted to a photo frame. I also have an NVIDIA GTX 970 eGPU. I would drive the Looking Glass over a Thunderbolt 3/HDMI connection.

For a standalone display, I have a NUC Bean Canyon.

I am currently experimenting with a trapezoidal display and Blender. I am using a 10" IPS display and an acrylic trapezoidal prism. The actual viewing area is 3" x 2".


#17

Hi all, just to update you on this (and thank you for being patient as we figure this out on our end): we are planning to release Linux support via our HoloPlay C API by the end of February/mid-March. This system uses the same calibration loading pipeline as our Unity SDK, so it can serve as a basis for Linux support of that pipeline. However, we won’t be supporting our Unity SDK on Linux, so there may be some issues that come up with it!