
Facial Mocap AI Audio2Face Example

PostPosted: Fri Dec 02, 2022 7:29 am
by DrHas

This link will take you to the video.

Ok, I realized I was replying to my own post a few weeks ago lol, but as promised, here is the first animation I created using facial mocap, MakeHuman, Faceit and NVIDIA's Audio2Face. Of course, Blender too!

The work on the facial mocap needs refinement...I mean, I am using MakeHuman in alpha form and NVIDIA's Audio2Face in beta form, and my learning style is to make it up as I go, to learn or reverse engineer shit, but you can see all that if you watch said works.

The animation uses digital characters I create with MakeHuman. I believe this practice is quite advanced now. I like to think of it as ethereal sculpting. I have four degrees, including a Master of Philosophy in Fine Farts. Don't take me too seriously lol. My work, though, I am serious about. Like a painter, sculptor and/or illustrator, I take front-profile and side-profile pics and whittle down the MakeHuman mesh.

I've been working on skin, and on better all-round texturing in Blender too. It doesn't matter if I use the premade MakeHuman assets, most of which are fucking awesome, or if I create the costumes using Simply Cloth Pro, which is a Blender addon. An example is the devil costume: rather than use the Principled BSDF shader, I like the Velvet shader. The skin shader, too, is a bastard version of one purchased on Blendermarket, combined with MakeHuman's skin shader setup. It looks like spaghetti, mind you lol.

I also use the ACES colour system. I haven't moved to Blender's new ACES colour support however, because it seems to lack the extended settings you got from the Frankenstein way you had to set up ACES before. I might be missing something though. I colour grade in Linear sRGB. It seems to work, if you like bright, fairly realistic colour and light. You might want another aesthetic!! More art wankery!!

The animation is a music visualizer; I created the composition myself. I sing, play and produce the whole track.

Everything is at the highest standard I can get from my machine...this is 4096x2160 at 30fps, although the loop section was filmed at 100fps, off the top of my head. That was for the slow-motion effect. I've also employed a lot of my 'real' world cinematography knowledge: massive sensors on the virtual camera system...IMAX, and some almost fisheye lens stuff for exaggeration here and there.
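For anyone curious about the slow-motion bit: footage captured at a high frame rate and played back frame-for-frame on a slower timeline stretches time by the ratio of the two rates. A quick sketch of that arithmetic (the 100fps and 30fps figures are from above; the helper function name is just mine for illustration):

```python
def slow_motion_factor(capture_fps: float, playback_fps: float) -> float:
    """How many times slower motion appears when footage shot at
    capture_fps is played back frame-for-frame at playback_fps."""
    return capture_fps / playback_fps

# A loop shot at 100fps dropped onto a 30fps timeline:
print(round(slow_motion_factor(100, 30), 2))  # → 3.33
```

So the 100fps loop plays back at roughly one-third speed on the 30fps timeline.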

Make sure you like and share the vid if you watch, and maybe subscribe to my Youfool channel. I will finish the how-to facial mocap stuff I've been working on in a few days, and I am constantly revising these ideas and techniques. I have got FBX motion capture apps to work with Rigify and MakeHuman too; I just need to refine this a little more before I employ that motion capture technique. I know I need to work on the fur a bit more too.

Cheers Ben