First Person Camera Rigging & Far Clipping Plane?


Hey everybody! I guess Kickstarter backers are getting these now, because I just got mine. I'm pretty stoked about the possibilities, but admittedly I don't really understand it yet, so here we go with my post.

I don't have any experience with Unity. My first order of business is to rig the HoloPlay Capture object, which represents the Looking Glass viewport, to the traditional movement controls of a first-person game (mouselook for rotation, and so on). That doesn't seem like it should be too complicated...

However, looking at the example demo scene, I realized that objects past the far plane aren't shown at all.

Which brings me to my main question: what object or other settings can I change to increase the view distance beyond that far plane? [Edit: I just realized the Far Clip Factor does this, but it only goes up to 6, and faraway objects begin to blur. I understand it's probably some innate limitation of the technology, but if somebody could explain the physics or mechanics behind why stereoscopic rendering on the Looking Glass differs in this way, that would be awesome.]

[Also, when I was changing the Size of the Capture, I noticed it only changes in integer amounts? Holding the Ctrl/Alt/Shift modifiers doesn't let me fine-tune the slider. Any Unity-UI-specific advice in this regard? I know a lot of people use scripts and assets for this sort of thing.]

Anyway, if anybody here later has example scenes or pointers to help with this "first-person camera" line of development, it'd be really appreciated. By the way, Voxatron on this is awesome, and the Future Crew demo that was remade for the Looking Glass is phat, yo!

Thanks, friends!


Hi there! First off, the size of the HoloPlay Capture can be adjusted more finely than the integer level; the UI slider only increments in integers because the range is quite large. If you type a more specific value into the text box, that should work just fine.
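If you'd rather drive it from a script, a sketch like this should work. I'm writing the `HoloPlay.Capture` type and its `size` field from memory, so treat those names as placeholders and check them against your SDK version:

```csharp
using UnityEngine;

// Placeholder names: verify the actual namespace, component, and field
// in your HoloPlay SDK version. The point is just that size is a plain
// float, so it accepts non-integer values even though the inspector
// slider snaps to whole numbers.
public class SetCaptureSize : MonoBehaviour
{
    public HoloPlay.Capture capture; // drag the Capture in via the inspector
    public float targetSize = 2.5f;

    void Start()
    {
        capture.size = targetSize;
    }
}
```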

Now, as for your question about the blurring of faraway objects, that is indeed a limitation of the technology. It happens because many images are being captured (typically 45) and their frustums converge at a specific plane, which we call the "focal plane." That plane, indicated by the blue corners, is where images will be sharpest. As content moves away from it, the difference in what each camera "sees" grows, and the image starts to pull apart.
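To put some rough intuition behind that (this is an idealized pinhole model, not the SDK's exact math): suppose one of the view cameras sits a horizontal distance b off the central axis, and all the frustums converge at the focal plane at distance f. A point at depth z then lands on screen shifted by approximately

d(z) = b · (z − f) / z

relative to the center view. At z = f the shift is zero for every view, so content on the focal plane lines up perfectly and looks sharp. As z grows, d(z) approaches the full camera offset b, so a distant object's 45 images spread across the whole camera baseline and it smears apart. The same thing happens (with opposite sign) for content in front of the focal plane.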

Implementing a first-person game in the Looking Glass will be more of a challenge than it first appears. You may be able to leverage some conventional techniques, like linear perspective, to give the sense that the scene continues far into the background. That could be some kind of skybox, or as simple as increasing the FOV of the HoloPlay Capture. I'm not sure how effective that would be, but it's something I've wondered about on several occasions. The other slight challenge will be changing the way rotation works, as the Capture rotates around its center rather than from the front. That can be changed by parenting the Capture under another object and setting that parent object's position to a more ideal spot near the near clip plane.
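To make the pivot idea concrete, here's a minimal, untested sketch using stock Unity input. The idea is to attach this to an empty "rig" object placed where the player's eye should be, then make the HoloPlay Capture a child of that rig, offset backward so the near clip plane sits roughly at the rig's origin; the rig rotates and the Capture follows:

```csharp
using UnityEngine;

// Attach to an empty GameObject at the player's eye position, with the
// HoloPlay Capture parented underneath it (offset so the near clip plane
// sits near this object's origin). Rotating this rig then pivots the view
// from the front of the Capture instead of its center.
public class FirstPersonRig : MonoBehaviour
{
    public float lookSensitivity = 2f;
    public float moveSpeed = 4f;

    float pitch;

    void Update()
    {
        // Mouselook: yaw freely, clamp pitch so the view can't flip over.
        float yaw = Input.GetAxis("Mouse X") * lookSensitivity;
        pitch = Mathf.Clamp(pitch - Input.GetAxis("Mouse Y") * lookSensitivity, -80f, 80f);
        transform.rotation = Quaternion.Euler(pitch, transform.eulerAngles.y + yaw, 0f);

        // WASD movement, flattened onto the horizontal plane.
        Vector3 forward = Vector3.ProjectOnPlane(transform.forward, Vector3.up).normalized;
        Vector3 move = transform.right * Input.GetAxis("Horizontal")
                     + forward * Input.GetAxis("Vertical");
        transform.position += move * moveSpeed * Time.deltaTime;
    }
}
```

All the tuning values here are guesses; you'd want to adjust the sensitivity, speed, and the Capture's offset inside the rig to taste.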

As for example scenes, we don’t have much in the way of that now but we’re working on getting some out there in the coming months!

I hope that’s somewhat helpful, let me know if you have more questions.


There aren’t a ton of examples of first person content for the Looking Glass because of the reasons Alex mentioned, but someone did port the original Doom to Unity if you just want to get a feel for how the clipping planes work in a first person context.