As an independent film director, storyteller, and animator, I often find myself needing tools that can help me achieve multiple animation goals on a short timeline and within the confines of a limited budget. Motion Live for Reallusion's iClone has been that kind of solution for me and my team since I began using iClone in 2015. Reallusion has created a flexible solution that combines the best hardware on the market with their 3D animation software.
In my recent work, I have used the Perception Neuron 2.0 motion capture suit together with the iPhone X. The Perception Neuron suit is a wireless solution, and coupled with an iPhone X mounted on a head-mounted camera rig, you have a setup that frees you to act out animations without being tethered to a chair or confined to the length of a USB cable attached to your computer.
Reallusion offers both the Perception Neuron plug-in and the Live Face plug-in, which make it possible to stream data from the mocap suit and an iPhone X. With this kind of freedom, I am able to act out the different parts for my characters and create banks of animations that I can reuse for the diverse characters in my animated children’s series.
At first, I thought the quality of the data would need a lot of cleanup, but I was pleasantly surprised when, after a preliminary run-through, I could see the characters doing exactly what I was acting out, in real time, inside iClone. I could also see, before recording the final animation, which areas I needed to refine.
What you see in the above video is raw, unedited data, which means that Motion Live gets you much closer to the end result even with the default output. You will still need to spend some time in the iClone curve editor to refine the animation.
Reallusion's solution doesn't depend on the iPhone X. If you don't own one or prefer a different technology, you can substitute the Faceware Realtime for iClone plug-in to capture facial performances.
Something to look out for: if you are capturing just the head, make sure you enable Smooth Head in the Live Face dialog box; if you are capturing the body and face together, choose Perception Neuron as the source of body and head movement in the Motion Live panel.
Motion Live will not in itself be the final solution for all your animation needs: you will still need a good animator to add keyframe animation wherever it is needed, and I encourage you to invest in performance actors who understand how to play the roles in your story. A great mocap actor will save you many hours of animation time.
The Perception Neuron suit accurately captures whatever the actor is doing, so if the actor gives a sub-par performance, that is what the mocap data will end up looking like. The same applies to facial performance capture. Encourage the actor to be as expressive as possible with the face, and record the session in an environment with diffuse lighting, away from harsh lighting that casts stark shadows. The depth-sensing camera on the iPhone X works best in a diffuse light setup.
Motion Live and iClone are a solution that all my fellow storytellers out there should seriously consider. Reallusion is at the forefront of products that are democratizing mocap, with a flexible offering meant to appeal to a broad swath of users.
The bottom line: anyone who has faced aggressive deadlines will appreciate being able to choose the capture setup that fits the job on the fly, and the price point is sure to appeal to studios both large and small.
Solomon W. Jagwe (www.sowl.com) is a Ugandan 3D artist and animator based in the USA. He is the creator and director of The Adventures of Nkoza and Nankya (www.nkozaandnankya.com), a children's animated series and book series celebrating Ugandan and African culture and languages through storytelling.