Mocap Mapping
Here’s a glimpse of my latest project with Motek Entertainment, NuFormer and Creative Technology Holland.
A live interaction between an audience and a video mapped projection of a 3D character.
It was great work! I was really impressed.
I imagine your application consists of the Unity game engine receiving a character’s motion data from MotionBuilder in realtime.
Did you make a custom plugin for MotionBuilder?
What protocol did you use? A proprietary one, or something open source like VRPN?
Yep, you understood it pretty thoroughly; that was pretty close to our setup.
In the future we’ll integrate everything directly into Unity, but for the first show we used a bit of MotionBuilder on the backend so we could build a fully functional prototype quickly.
Indeed, I created a custom MotionBuilder device to get the data out of MotionBuilder and into Unity over a network socket.
I didn’t use VRPN (though I’m going to check it out now that you mention it); I just passed a bunch of numbers over a UDP port.
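The "bunch of numbers over a UDP port" approach can be sketched roughly like this. Note this is a hypothetical illustration, not the actual wire format used in the show: I'm assuming one datagram per bone, carrying a fixed-width bone name plus translation and rotation as little-endian floats, which a Unity-side listener could parse the same way.

```python
import socket
import struct

# Hypothetical wire format: a 16-byte NUL-padded bone name, followed by
# translation (tx, ty, tz) and a rotation quaternion (qx, qy, qz, qw),
# all little-endian 32-bit floats. A MotionBuilder device's per-frame
# evaluation could call send_bone() once for every bone in the skeleton.
BONE_FMT = "<16s7f"

def pack_bone(name, translation, quaternion):
    """Serialize one bone's transform into a fixed-size UDP payload."""
    return struct.pack(BONE_FMT, name.encode()[:16],
                       *translation, *quaternion)

def unpack_bone(payload):
    """Parse a payload back into (name, translation, quaternion)."""
    raw = struct.unpack(BONE_FMT, payload)
    name = raw[0].rstrip(b"\0").decode()
    return name, raw[1:4], raw[4:8]

def send_bone(sock, addr, name, translation, quaternion):
    """Fire one bone's transform at the receiver; UDP, so no handshake."""
    sock.sendto(pack_bone(name, translation, quaternion), addr)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_bone(sock, ("127.0.0.1", 9000), "Hips",
              (0.0, 1.0, 0.0), (0.0, 0.0, 0.0, 1.0))
```

UDP fits this use case well: for live performance you want the freshest pose every frame, and a dropped packet is better replaced by the next frame than retransmitted late.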
Are you using a mocap system like OptiTrack Arena to stream realtime mocap data? Or is the avatar pre-rendered and built into the overall video?
The character was done completely live in the Unity game engine, using an Xsens mocap suit.
Same for the shadow he was casting on the environment and the elevator & door he was interacting with, all in realtime.
The backgrounds were pre-rendered and everything was mixed live in a hardware video mixer.