Hi there,
I'm trying to make a MakeHuman model talk in Unity.
For lip-sync control I want to use Annosoft lipsync. It works by recording the bone locations on 17 frames (each containing a different mouth shape like o, a, e, ...) and then blending those shapes with a script.
More info and the Annosoft Unity package with a working FBX model here: http://forum.unity3d.com/threads/lipsyn ... er.123701/
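As I understand it, the blending the Annosoft script does is basically linear interpolation between two recorded viseme poses. A rough plain-Python sketch of the idea (the bone names and positions here are made up for illustration):

```python
# Sketch: blend two viseme poses by linearly interpolating per-bone positions.
# Each pose maps a bone name to an (x, y, z) position tuple.

def blend_visemes(pose_a, pose_b, t):
    """Interpolate between two poses; t=0 gives pose_a, t=1 gives pose_b."""
    blended = {}
    for bone in pose_a:
        a, b = pose_a[bone], pose_b[bone]
        blended[bone] = tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))
    return blended

# Hypothetical example: jaw bone halfway between "rest" and the open "AI" shape
rest = {"jaw": (0.0, 0.0, 0.0)}
ai   = {"jaw": (0.0, -0.2, 0.1)}
print(blend_visemes(rest, ai, 0.5))  # {'jaw': (0.0, -0.1, 0.05)}
```

So each of the 17 frames just needs the bones posed in the right mouth shape; the script handles the in-between frames at runtime.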
So I was hoping I could use the MHX2 facial drivers to create 17 frames with mouth shapes for talking (like this: http://www.annosoft.com/docs/Visemes17.html).
I used the sliders (with drivers checked on import) and right-clicked the values to set keyframes. They show up in the timeline and in the curves panel under 'subAction', but when I import this model into Unity (as a .blend or .fbx) it doesn't show any animation... Is the keyframe info being lost because it targets the MHX runtime?
Are those drivers actual bones, or are they procedurally controlled somehow? I'm still fairly new to MakeHuman and 3D in general, so I'm a bit lost as to what to do next.
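One thing I was thinking of trying is baking the driver-driven animation down to plain bone keyframes in Blender before exporting, since I've read that FBX export doesn't carry drivers over. Something like this in the Blender Python console, with the face rig selected in Pose Mode (I haven't verified this fixes the Unity import, so treat it as a guess):

```python
# Bake the posed animation into plain FCurves on the bones themselves,
# so the result no longer depends on the MHX drivers at export time.
import bpy

bpy.ops.nla.bake(
    frame_start=1,
    frame_end=17,          # my 17 viseme frames
    only_selected=False,
    visual_keying=True,    # capture the effect of drivers/constraints
    bake_types={'POSE'},   # bake pose bones, not object transforms
)
```

Would that be the right approach, or is there a cleaner way to get the mouth shapes onto the actual bones?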