http://youtu.be/3x7cvZp3CtE
First test with facial mocap. I should have deleted the stray keyframes on the left eye, but I didn't want to spend another hour re-rendering.
The workflow essentially follows Sebastian König's Vimeo tutorial "How to use face tracking data in Blender" (http://vimeo.com/24016505), although most of it has been scripted. The scripts can be found in the sandbox in MakeHuman SVN (utils/tools/sandbox).
The mocap data is sample file 6 from Facemotion (http://www.easycapstudio.com/?q=support/sample-files). A drawback of the method is that there is a one-to-one correspondence between markers and bones in the face rig. If we want to use data recorded with a different marker setup, we need a different face rig.
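The one-to-one constraint could be sketched roughly like this. The marker and bone names below are hypothetical, and the actual scripts in utils/tools/sandbox do this inside Blender through bpy; this is just a plain-Python illustration of why mismatched marker setups fail:

```python
# Sketch of the one-to-one marker-to-bone mapping constraint.
# Names are hypothetical examples, not the actual rig's bone names.

def map_markers_to_bones(marker_names, bone_names):
    """Pair each tracked marker with exactly one face-rig bone.

    Raises ValueError if the two sets do not match one-to-one,
    which is why data recorded with a different marker setup
    needs a different face rig.
    """
    if set(marker_names) != set(bone_names):
        mismatch = sorted(set(marker_names) ^ set(bone_names))
        raise ValueError("marker/bone mismatch: %s" % mismatch)
    return {name: name for name in marker_names}

# A matching setup maps cleanly:
mapping = map_markers_to_bones(
    ["brow_L", "brow_R", "jaw"],
    ["jaw", "brow_R", "brow_L"],
)
```

With a different marker setup (say, an extra cheek marker), the same call would raise, so the rig itself has to change rather than the mapping.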