Look who's talking


Look who's talking

Postby ThomasL » Tue Dec 31, 2013 10:32 am

http://youtu.be/3x7cvZp3CtE

First test with facial mocap. I should have deleted stray keyframes on the left eye, but I didn't want to spend another hour rerendering.

The workflow essentially follows Sebastian König's Vimeo tutorial "How to use face tracking data in Blender" (http://vimeo.com/24016505), although most of it has been scripted. The scripts can be found in the sandbox in the MakeHuman svn (utils/tools/sandbox).

The mocap data is sample file 6 from Facemotion (http://www.easycapstudio.com/?q=support/sample-files). A drawback of the method is that there is a one-to-one correspondence between markers and bones in the face rig. If we want to use data recorded with a different marker setup, we need a different face rig.
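The one-to-one retargeting described above can be sketched in plain Python. This is only an illustration of the idea, not the actual sandbox script: the `MARKER_TO_BONE` table, the function name `retarget`, and the scale factor are all illustrative assumptions. Each tracked marker drives exactly one bone, so per frame the bone offset is simply the marker's displacement from its rest position, scaled into rig space; in Blender these offsets would then be keyframed onto the pose bones.

```python
# Sketch of one-to-one marker-to-bone retargeting (illustrative only;
# names and structure are assumptions, not the MakeHuman sandbox API).

MARKER_TO_BONE = {
    "jaw_marker": "jaw",
    "brow_l_marker": "brow.L",
}

def retarget(frames, rest, scale=1.0):
    """Convert tracked 2D marker positions into per-bone offsets.

    frames: list of {marker_name: (x, y)} dicts, one per frame
    rest:   {marker_name: (x, y)} rest-pose positions
    Returns a list of {bone_name: (dx, dy)} offset dicts.
    """
    out = []
    for frame in frames:
        offsets = {}
        for marker, bone in MARKER_TO_BONE.items():
            if marker not in frame:  # stray or lost marker: skip this frame
                continue
            rx, ry = rest[marker]
            x, y = frame[marker]
            # One marker drives exactly one bone, hence the rigid
            # dependence on the marker setup noted above.
            offsets[bone] = ((x - rx) * scale, (y - ry) * scale)
        out.append(offsets)
    return out
```

This also shows why a different marker setup forces a different rig: the mapping table hard-wires one bone per marker, with no blending or redistribution of motion.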

Re: Look who's talking

Postby Rhynedahll » Tue Dec 31, 2013 8:56 pm

Very interesting.

Do you plan to develop this into another tool?

Re: Look who's talking

Postby ThomasL » Wed Jan 01, 2014 5:36 pm

There should be some kind of face rig in alpha 9, but we don't know what it will look like. This is just an experiment.

Re: Look who's talking

Postby duststorm » Fri Jan 03, 2014 1:18 am

In any case, the result of this test looks promising.
I hope we can make it work with different types of faces, and with good visual quality.

