Here’s another experiment from our human workflow: retargeting mocap onto our generic human rig, adding facial motion capture, and then running cloth sim. It was a two-day process, and on the third day we had it baked out of Houdini and into the excellent vertex-to-texture animation process for Unity to feed on. We punched this out to HoloLens as a gag (https://www.instagram.com/p/BuqR_2Rl6JR/). Since the HoloLens doesn’t have much power to do anything, it was great to see me/him spinning around, floating in the middle of the studio. It actually runs at a full 30fps in the HoloLens, despite what the video suggests.
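For the curious, the core idea behind vertex-to-texture animation is to bake each frame's vertex positions into the rows of a float texture, which the GPU then samples per vertex instead of evaluating a rig or sim at runtime. The sketch below is a minimal conceptual illustration in Python with NumPy, not the actual Houdini/Unity tooling we used; the function names and the simple frame-interpolation scheme are my own assumptions.

```python
import numpy as np

def bake_vertex_animation(frames):
    """Pack per-frame vertex positions into a float 'texture'.

    frames: list of (num_verts, 3) arrays, one per animation frame.
    Returns a (num_frames, num_verts, 3) float32 array: each row is one
    frame, each column one vertex, and the RGB channels hold XYZ.
    """
    return np.stack(frames, axis=0).astype(np.float32)

def sample_frame(texture, t):
    """Look up vertex positions at normalized time t in [0, 1],
    blending the two nearest frames, which mimics the bilinear
    texture fetch a vertex shader would do."""
    num_frames = texture.shape[0]
    f = t * (num_frames - 1)
    i0 = int(np.floor(f))
    i1 = min(i0 + 1, num_frames - 1)
    w = f - i0
    return (1 - w) * texture[i0] + w * texture[i1]

# Tiny example: one triangle sliding along X over three frames.
tri = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0]], dtype=np.float32)
frames = [tri + np.array([dx, 0, 0], dtype=np.float32)
          for dx in (0.0, 0.5, 1.0)]
tex = bake_vertex_animation(frames)
mid = sample_frame(tex, 0.5)  # halfway through: triangle offset by 0.5 in X
```

On device this is what makes the trick HoloLens-friendly: playback is just a texture read in the vertex shader, so the heavy sim cost stays offline in Houdini.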
This morning I took the same data set, updated it to the latest Looking Glass SDK, and now he’s tumbling around in our displays!
Here’s a link to check it out:
(Apologies again for the branding in there. As I build up a few of these, I’ll make a formal app with no branding and buttons to move between the scenes.)