Hello all,

I've done a lot of research on interacting with holograms on our own display. As soon as we saw the Holoplayer we switched over, but a lot of my interaction code is still based on a Leap Motion.

Has anyone tried to use a leap for hand and gesture recognition with the Holoplayer?
Any tips or tricks?
My main problem right now is figuring out the different scales: the translation of my real/physical hands and my virtual hands inside Unity doesn't match.



We used the Leap Motion pretty extensively with a previous-generation volumetric display we developed at Looking Glass, and a bit with early versions of the HoloPlayers. However, since we couldn’t get the full depth map out of the Leap, it was limiting enough that we were forced to move to the Intel RealSense.

I don’t think we have any good projects using Leap any longer. @oliver may have something in his archives.


Hi again!

We made an experimental AR version of the Holoplayer in the lab (where the 32-view superstereoscopic 3D scene is pushed behind a beamsplitter instead of over it) and my kids and I were playing around with 3D interaction at a distance. While I prefer using the RealSense built into the Holoplayers, I found that my kids couldn’t use the standard Holoplayer 3D demos at a distance – it was just too weird flailing around with a foot between their hands and the hologram. But the idea of making something that my kids couldn’t use, right around Christmas no less, was giving me nightmares. Fortunately, I had a Leap Motion buried under papers on my desk, so I dredged up an old Leap demo with good hand visuals made by the folks at Leap and @oliver, and into the HoloPlaySDK and then into the experimental AR Holoplayer it went!

It worked! The hand visualization is key for getting kids (in this video Jane is 7 and Ben is 4 years old) to immediately understand how to interact with a 3D world that they can’t directly touch (as they can with the standard version of the Holoplayer). They just walk up to the system and know how to use it. I still think the Leap Motion itself is too flaky for any 3D interaction outside of controlled tech demos, but the hand visualization piece of this may be essential for certain interaction modes. I think we can build this into a toggle in a future release of the HoloPlay SDK (using the Realsense SR300), for experimental at-a-distance interaction in Holoplayers and their AR cousins. @kyle and @dez WDYT?


As far as matching the scale, I believe you should be able to parent the Leap Motion hand to an empty node and scale it to whatever is needed. However, I haven’t worked with the Leap Motion in some time and don’t remember exactly how it works (or its quirks).
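Something like this minimal Unity sketch is what I have in mind (untested; leapHandRoot and scaleFactor are placeholders for whatever your setup uses):

using UnityEngine;

// Attach to an empty GameObject; parent the Leap hand under it, then scale the node.
public class HandScaler : MonoBehaviour
{
    public Transform leapHandRoot;   // placeholder: whatever your Leap hand object is called
    public float scaleFactor = 0.5f; // tune until the virtual and physical hands line up

    void Start()
    {
        leapHandRoot.SetParent(transform, worldPositionStays: false);
        transform.localScale = Vector3.one * scaleFactor;
    }
}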

One question I have for you: is the appearance of the hand the only important part for your app? Or are the physics (colliders etc.) of the Leap Motion hand important at all?

I ask because (as @smf suggests), in the next version of our SDK, it will be possible to render out the detected 3D view of the depth camera in 3D space with 1:1 scale to the real world. But I don’t know if this would solve what you are trying to do.


Thank you both for the reply.

@smf That is a great implementation, I can’t wait to have my little girl try these things out.
To be honest, my main reason for leaning toward the Leap Motion is that I’m not the best coder and I already have some interaction coded against the Leap Motion and nothing against the RealSense. I have also heard that gesture recognition on the RealSense is not that reliable; what do you think? If I want to detect pinch and point and a virtual sphere that follows the radius of the curvature of the palm, would you go with Leap or Real S?

@dez I later discovered that you can change the scale of the Holoplayer’s viewing box in Unity, and with this I have been able to get closer to physical-world scale.

I have no interest in the appearance of the virtual hands, other than for debugging my alignment. In the end I would want the user to not need to see the virtual hands at all, since they would theoretically sit exactly under the physical hands as you interact with the virtual objects. My interest in Leap is that it already has solutions for coding gestures like pinch and point; I’m sure this can be done with the RS camera, but I would have to invest more time figuring that out than I have spent getting the Leap to work with the Holoplayer.
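For reference, the kind of Leap-side checks I mean are roughly this (a sketch from memory; PinchStrength and IsExtended are Leap C# API names as I recall them, and the thresholds are guesses):

using Leap;

// Rough pinch and point detection against the Leap C# API.
public class GestureCheck
{
    Controller controller = new Controller();

    public void Poll()
    {
        Frame frame = controller.Frame();
        foreach (Hand hand in frame.Hands)
        {
            bool pinching = hand.PinchStrength > 0.8f; // 0..1 thumb-to-fingertip pinch

            int extended = 0;
            foreach (Finger finger in hand.Fingers)
                if (finger.IsExtended) extended++;
            bool pointing = extended == 1;             // exactly one finger out ≈ pointing

            if (pinching) System.Console.WriteLine("pinch");
            if (pointing) System.Console.WriteLine("point");
        }
    }
}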

I like the idea of rendering the detected 3D view 1:1 with the physical world; this is the crux of the thing. Once we have that, the other thing that would help is a virtual representation (in Unity) of the physical volume of the Holoplayer (basically a 3D model of the actual device). This is because I have to position the physical Leap at a point relative to the Holoplayer device, and at the same time place the virtual Leap in Unity at a location relative to the 3D virtual view. (Did that make sense?)
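In other words, something like this hypothetical placement script, where the offset is whatever I measure between the two devices with a ruler (viewOrigin and the offset values are made up):

using UnityEngine;

// Place the virtual Leap origin relative to the virtual view the same way the
// physical Leap sits relative to the physical Holoplayer.
public class LeapPlacement : MonoBehaviour
{
    public Transform viewOrigin; // placeholder: center of the Holoplayer's virtual viewing volume
    public Vector3 measuredOffset = new Vector3(0f, -0.10f, 0.15f); // meters, example values

    void Start()
    {
        transform.rotation = viewOrigin.rotation;
        transform.position = viewOrigin.position + viewOrigin.rotation * measuredOffset;
    }
}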


hey @titoalfaro

here are some of my thoughts on the different SDKs that might help you choose the best direction:

As far as getting everything 1:1: the current HoloPlaySDK does a pretty good job of mapping your detected fingers to camera-relative local positions. Future versions will be even more precise.
The issue now is gestures. If you want to use Leap, it has all the gesture stuff already and it isn’t bad, but it will require you to calibrate everything into the Holoplayer’s transform and scale. I haven’t tried this, so I don’t know how well it will do. You will also have to find the best place to put the Leap (looking down from the top, or on top of the RealSense?).
The RealSense’s native implementation does give gestures, but I think they are ‘meh’. Also, you would have to calibrate to the Holoplayer anyway if you use this.

Our implementation of the RealSense in the HoloPlaySDK already gives you calibrated 1:1 touches, but doesn’t give gestures (point seems straightforward, however, since any finger detected is essentially a point). Pinch is still in progress.
I did a very brutish implementation of pinch, but I don’t think it is really ready for prime time yet. I believe it is this boolean, which you can access statically (from anywhere):
bool handClosed = depthPlugin.instance.closedHand; // should tell you whether the hand is open or closed

but again, this open/close/pinch is a rough implementation. I plan to improve it later.
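A minimal sketch of polling it, assuming you just want the open/close transitions (depthPlugin.instance.closedHand is the HoloPlaySDK part; the rest is plain Unity):

using UnityEngine;

// Watch the rough open/closed flag each frame and react on transitions.
public class GrabWatcher : MonoBehaviour
{
    bool wasClosed;

    void Update()
    {
        bool closed = depthPlugin.instance.closedHand;
        if (closed && !wasClosed) Debug.Log("hand closed: treat as grab/pinch start");
        if (!closed && wasClosed) Debug.Log("hand opened: treat as release");
        wasClosed = closed;
    }
}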

Also, I don’t understand “If I want to detect pinch and point and a virtual sphere that follows the radius of the curvature of the palm, would you go with Leap or Real S?” Can you explain a little bit more and maybe I can think of a path to recommend?


Oh, also note that the RealSense and the Leap may interfere with each other if they are both on at the same time over the same tracking space, as they both emit their own infrared light to function.


Oh wow, this might explain some issues I have seen.

Any way to disable the RealSense?


“Also I don’t understand ‘If I want to detect pinch and point and a virtual sphere that follows the radius of the curvature of the palm, would you go with Leap or Real S?’ Can you explain a little bit more and maybe I can think of a path to recommend.”

In designing the interaction with mid-air virtual objects, one of my objectives is to have a precise way to select either one object among many or a whole group of objects. I want to try to do this by taking advantage of something that Leap offers: the sphere_radius of the hand https://developer.leapmotion.com/documentation/java/api/Leap.Hand.html#javaclasscom_1_1leapmotion_1_1leap_1_1_hand_1a37a58aedfe78c6f083f5253de5a82548

I think this got dropped in later Leap SDK versions, but there is still a way to get that information from the palm.
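In case it helps anyone else, my rough plan for approximating it (Fingers, TipPosition and PalmPosition are Leap C# API names as I remember them):

using Leap;

// Approximate the old sphere_radius by averaging palm-to-fingertip distances.
public static class PalmSphere
{
    public static float EstimateRadius(Hand hand)
    {
        float sum = 0f;
        int count = 0;
        foreach (Finger finger in hand.Fingers)
        {
            sum += finger.TipPosition.DistanceTo(hand.PalmPosition);
            count++;
        }
        return count > 0 ? sum / count : 0f;
    }
}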


As far as turning off the RealSense: simply disable (or never enable) the depth plugin component.
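If it’s already in your scene, something like this should do it (untested, and assuming the singleton is the component itself):

depthPlugin.instance.enabled = false; // stops the component; never adding it to the scene also works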

As far as the volume selection feature: I can’t think of an existing way to do exactly that in our SDK right now. One alternative would be a 3D drag-window select, which could work well with our existing toolset.
Another, slightly different approach could be to process the depth data from the plugin yourself and get hand volume info from that. I could teach you how if you want; I don’t think it would be crazy.
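The core of it might look something like this (purely illustrative; I’m deliberately not naming a specific accessor for pulling the points out of the plugin):

using UnityEngine;

// Given whatever depth points you've extracted for the hand, estimate a
// centroid and a rough radius, similar in spirit to Leap's sphere_radius.
public static class HandVolume
{
    public static float EstimateRadius(Vector3[] handPoints, out Vector3 centroid)
    {
        centroid = Vector3.zero;
        if (handPoints.Length == 0) return 0f;

        foreach (Vector3 p in handPoints) centroid += p;
        centroid /= handPoints.Length;

        float sum = 0f;
        foreach (Vector3 p in handPoints) sum += Vector3.Distance(p, centroid);
        return sum / handPoints.Length; // mean distance from centroid ≈ hand "radius"
    }
}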

Other than that, I guess you are left with the Leap Motion.