I’m having a blast building some of my first holographic apps and hope to be able to share with you soon. Right now, I’m still in the new car smell phase and trying out all of the things I’ve been daydreaming about since I first backed your Kickstarter.
I’ve had a Leap since I first backed them in 2012 (!) and was excited to finally use it for something appropriate. I know it’s possible because I’ve seen Froggo and some of the other demos. However, I’m new to Leap development and actually getting it working with the LKG camera rig is proving more difficult than I’d anticipated. Specifically, my hands show up really tiny and changing the scale just seems to make it glitch out. I lost a few hours to increasingly frustrating experiments but made no real progress.
Part of the problem is that there are too many ways to integrate; they seem to have multiple layers of abstraction between their “core” and interaction system APIs, and different examples make use of entirely separate techniques. Maybe I’m having a grumpy old man moment…
Here’s what would do it for me: a downloadable Unity sample with a scene preconfigured for a standard LKG setup paired with a Leap. Visually displaying the hands is optional and I’d likely turn them off. Ideally, there would be sphere colliders on the fingertips, or an obvious place to attach a component or child object; I don’t need any gestures or object pickup/manipulation. I just want to inject force into the scene with my hands, like in the Blob Pack.
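To make the request concrete, here’s roughly the kind of component I’m imagining. Fair warning: I’m new to the Leap SDK, so treat the class and property names (LeapProvider, CurrentFrame, Hand.Fingers, TipPosition, and the ToVector3() extension) as my best guess from skimming their Unity Core docs, not a verified implementation:

```csharp
using System.Collections.Generic;
using Leap;
using Leap.Unity;
using UnityEngine;

// Hypothetical sketch: spawn an invisible sphere collider per fingertip
// and drive it from Leap tracking data every frame, so fingertips can
// shove Rigidbodies around (Blob Pack-style force injection, no grabbing).
public class FingertipColliders : MonoBehaviour
{
    public LeapProvider provider;   // drag the scene's LeapServiceProvider here
    public float tipRadius = 0.01f; // meters; tune to taste
    private readonly List<GameObject> tips = new List<GameObject>();

    void Update()
    {
        Frame frame = provider.CurrentFrame;
        int needed = 0;
        foreach (Hand hand in frame.Hands)
            needed += hand.Fingers.Count;

        // Pool the fingertip spheres instead of allocating every frame.
        while (tips.Count < needed)
        {
            var go = GameObject.CreatePrimitive(PrimitiveType.Sphere);
            go.transform.localScale = Vector3.one * tipRadius * 2f;
            go.GetComponent<MeshRenderer>().enabled = false; // collider only
            var rb = go.AddComponent<Rigidbody>();
            rb.isKinematic = true; // moved by tracking, not by physics
            tips.Add(go);
        }

        int i = 0;
        foreach (Hand hand in frame.Hands)
            foreach (Finger finger in hand.Fingers)
                tips[i++].transform.position = finger.TipPosition.ToVector3();

        // Park any unused spheres far out of the way.
        for (; i < tips.Count; i++)
            tips[i].transform.position = Vector3.one * 10000f;
    }
}
```

If I understand Unity physics right, kinematic Rigidbodies on the tip spheres will push any dynamic Rigidbodies they touch, which is all the “force injection” I’m after.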
Heck, could you just post the code for Blob Pack or Froggo? @oliver you know you wanna.