iPi Soft Animation Software, MakeHuman, and clothes help

PostPosted: Sun Jan 25, 2015 4:17 pm
by devrich
Hi everyone,

If I may: as I am new to creating my own character models, I have a couple of questions about the process before I begin, so that I get things right sooner rather than later :)

1: When modeling clothes and gadgets from scratch, is there a generally accepted method for attaching them to a character's rig so that the character's underlying skin doesn't pierce through during animation or kinematic changes? (If it helps to visualize: imagine a small team where Artist_A does the animation, Artist_B creates the character models in MH, and Artist_C is responsible for combining everything so that the motion capture specialists can bring them to life.)

2: If memory serves, iPi uses a specific rig for their animation software. Does anybody have experience bringing that TRS animation data over and applying it to a MakeHuman rig? (This would be my first time, and I'm willing to wager that the two rigs are very different in design.)

3: Facial animation: I truly have never done this before in my career, and I would love to get started. If someone could point me to a tutorial or forum topic, I would much appreciate it.

Re: iPi Soft Animation Software, MakeHuman, and clothes help

PostPosted: Sun Jan 25, 2015 4:47 pm
by duststorm
1: The MakeHuman clothes proxies are rigged too, so if you create clothes for MakeHuman using MakeClothes, the resulting clothes items will be rigged automatically (with the option of defining a custom rigging -- all of this is quite new and, as yet, undocumented and unreleased in a stable version).
What I prefer to do is hide the skin geometry that sits under the clothes, since that geometry is the main cause of intersections during animation. This is what MakeHuman does when exporting clothes if the "Hide geometry under clothes" checkbox in the clothes tab is selected (it is enabled by default). In fact, it completely removes the hidden geometry from the exported file, making the file lighter and the mesh more performant for realtime rendering.
If you require watertight meshes, you could try a Poisson remesh to close the holes between the body and the clothes again; that might work.
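To make the "remove hidden geometry" idea concrete, here is a minimal sketch in plain Python. It assumes you already know which body vertices lie under a clothing item (in MakeHuman that comes from the clothes proxy data; here it is just an example input) and drops every face whose vertices are all covered:

```python
def remove_hidden_faces(faces, covered_vertices):
    """Keep only faces that have at least one visible (uncovered) vertex."""
    covered = set(covered_vertices)
    return [face for face in faces if not all(v in covered for v in face)]

# Toy body mesh: three triangles, with vertices 2, 3 and 4 under the clothes.
body_faces = [(0, 1, 2), (2, 3, 4), (4, 5, 6)]
hidden = [2, 3, 4]
visible = remove_hidden_faces(body_faces, hidden)
print(visible)  # only (2, 3, 4) is fully covered, so it is removed
```

The real exporter works on the full mesh data, but the principle is the same: fully occluded faces never reach the exported file, so they can never poke through the clothes.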

2: I don't think we have created a rig for it, but if you have an example iPi Soft rig (for example, as a BVH file), feel free to create a feature request on the bug tracker and attach the file as an example.
Alternatively, the MakeWalk mocap retargeting plugin for Blender could be extended with an iPi Soft source rig definition file, so that iPi Soft mocap data can be transferred to MH rigs. Perhaps there is already a source rig that works; I have not yet tested this (we already have source definitions in place for CMU, ACCAD, and some other common mocap rig formats).
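The core of such a source rig definition is a mapping from source bone names to target bone names. As a hedged illustration (the bone names below are made up for the example and are not the actual iPi or MakeHuman rig names), the retargeting step can be sketched like this:

```python
# Hypothetical name mapping from an iPi-style skeleton to a MH-style rig.
IPI_TO_MH = {
    "Hip": "root",
    "Spine": "spine_01",
    "LeftUpLeg": "thigh.L",
    "RightUpLeg": "thigh.R",
}

def retarget_channels(animation):
    """Rename per-bone animation channels to the target rig's bone names.

    Bones without a known mapping are dropped; a real retargeter would
    also convert rest poses and rotation orders, not just names.
    """
    return {IPI_TO_MH[bone]: curves
            for bone, curves in animation.items()
            if bone in IPI_TO_MH}

source = {"Hip": [(0.0, 0.0, 0.0)], "LeftUpLeg": [(10.0, 0.0, 0.0)], "Prop": []}
print(retarget_channels(source))  # "Prop" has no mapping and is dropped
```

A source rig definition file is essentially this table plus per-bone corrections (rest-pose offsets, axis conventions), which is why adding iPi Soft support would mostly be a matter of writing one such file.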

3: Facial animation is not that easy to do by hand. We are working on a new controller rig that will let you blend different predefined expressions together; it will be released in the future.
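Blending predefined expressions usually means a blend-shape setup: each expression is stored as per-vertex offsets from a neutral face, and the result is the neutral face plus a weighted sum of those offsets. A minimal sketch of that idea (this is the general technique, not MakeHuman's actual file format):

```python
def blend_expressions(neutral, expressions, weights):
    """Return neutral + sum(weight * offset) per vertex coordinate."""
    result = list(neutral)
    for name, weight in weights.items():
        for i, offset in enumerate(expressions[name]):
            result[i] += weight * offset
    return result

# Toy example: three "vertex coordinates" and two expression offsets.
neutral = [0.0, 0.0, 0.0]
expressions = {"smile": [1.0, 0.0, 0.0], "blink": [0.0, 2.0, 0.0]}
blended = blend_expressions(neutral, expressions, {"smile": 0.5, "blink": 1.0})
print(blended)  # [0.5, 2.0, 0.0]
```

A controller rig then just exposes the weights as sliders or bones, so the animator never edits vertices directly.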
Another way is to use facial mocap. There are a few methods, such as using a simple webcam with face markers. I believe Blender implements this approach, but I have not tested it yet.
I have experimented with the FaceShift software, which works with a PrimeSense camera (such as a Kinect v1 -- perhaps v2 as well -- an Asus Xtion, or a PrimeSense Carmine). It works pretty well. On our blog you can find a demo video where we attached the MH face rig to it.