For the animation of my character I want to use motion capture. The animation will be taken from a dancer, so they need to be able to move freely. I did consider using the exoskeleton motion capture suit here, but it doesn't allow the performer great flexibility or a very accurate recording.

So I’m going to give iPi Soft a go. It is markerless and can be done with web cameras or Kinect cameras.

I intend to have a play with this software before doing the real thing, as I will need to learn a few things. There is a HumanIK preset rig in Maya that I can practise with, and some pre-captured data on the website I can use. The website seems really helpful.

Mocap Test

I did a mocap test with Annabeth, who showed me how to use it, what the result would be, and whether it is cost effective in terms of time: do I use mocap or keyframe? The results I got were good. It's fascinating stuff and a definite interest of mine, and I could see the results developing quickly.

We used one camera and got Andy to jump around a bit so we had some footage to play with. Before that, though, we captured the background plate to tell the software that that isn't the area of interest. We then recorded Andy in a t-pose for a couple of seconds, which is needed later on, did about six seconds of movement, and saved the scene. Once we had that data, we imported the scene we had just recorded into iPi Studio.

The t-pose is important: it is used to align the skin in iPi Studio with the recorded depth-map data. The straighter and more accurate a t-pose the actor can hold, the better and quicker the skin will retarget to the data. You can then track forward through your area of interest, creating keyframes on the skin, which contains a skeleton. We tested it out using a free rigged model from the internet.

It was quite jumpy, but we added jitter removal and it was a lot smoother.
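iPi Studio's actual jitter-removal filter is built in and its internals aren't something I know, but the basic idea of smoothing a noisy animation curve can be sketched with a simple centred moving average. The window size and the sample values below are illustrative assumptions, not real captured data:

```python
# Minimal sketch of jitter removal on one animation channel (e.g. a joint
# rotation, in degrees) using a centred moving average. The real filter in
# iPi Studio is assumed to be more sophisticated than this.

def smooth_curve(values, window=5):
    """Return a smoothed copy of `values`, one entry per frame."""
    half = window // 2
    smoothed = []
    for i in range(len(values)):
        # Clamp the window at the ends of the clip so every frame keeps a value.
        lo = max(0, i - half)
        hi = min(len(values), i + half + 1)
        smoothed.append(sum(values[lo:hi]) / (hi - lo))
    return smoothed

# A noisy channel with one jittery spike at frame 4:
noisy = [10.0, 12.5, 9.8, 11.2, 30.0, 11.0, 10.5]
print(smooth_curve(noisy))
```

The spike gets averaged down against its neighbours, which is roughly the "smoother" result we saw, at the cost of slightly softening the genuine motion too.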

It worked quite well, but the one thing I really took from this was the importance of naming conventions. In order to apply the right data to the right bones, you need to name your bones clearly and relevantly. The model we imported had quite bad naming conventions, so we weren't able to match them up correctly, but we made do as it was a test. Once that was done, we tested it in Maya, exporting the animation as an FBX from iPi Studio and importing that into Maya.
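The fix for a badly named rig is basically a lookup table from the bad names to clear, retarget-friendly ones. Here's a plain-Python sketch of that idea; the bone names on both sides are made-up examples (not iPi's or any particular rig's real identifiers), and inside Maya the actual rename would be done per joint with `maya.cmds.rename` rather than on a plain list:

```python
# Sketch of cleaning up bone names before retargeting mocap data.
# All names here are hypothetical examples.

BONE_MAP = {
    # badly named rig bone -> clear, retarget-friendly name
    "joint12": "LeftUpLeg",
    "joint13": "LeftLeg",
    "bone_arm_L_01": "LeftArm",
    "bone_arm_L_02": "LeftForeArm",
}

def rename_bones(bones, bone_map):
    """Return the bone list with any mapped names replaced; unmapped names pass through."""
    return [bone_map.get(b, b) for b in bones]

rig = ["root", "joint12", "joint13", "bone_arm_L_01"]
print(rename_bones(rig, BONE_MAP))
```

With a consistent map like this the mocap data lines up with the right bones automatically, instead of having to be matched up by hand like we did in the test.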

I was expecting it to be quite buggy and glitchy, but it worked fine and I was quite excited about the result.

The only drawback was the tracking time, when it is calculating the data onto the iPi skin: it was going to take a long time. But I still think that mocap is the best option for me. Keyframe rotomation will possibly take me longer and I won't get as good an effect; I'm not a great keyframe animator, and I really want to explore mocap more.