Hi, I'm working on turning the Kinect code into open source.
For that I needed a graphics pipeline that would let me render different poses for different characters.
I use MakeHuman to create the different characters (I love it).
I import the characters into Blender (in MHX format) and then use the loadRetargetSimplify option to apply selected poses from BVH files to my characters.
The issue I'm facing is that I need a mesh with each vertex labelled with one of the 31 body-part labels I defined. So I used the mesh (group) names from the OBJ format to label every vertex with one of the 31 labels.
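The labelling step can be sketched roughly like this (a minimal sketch only: the function and file layout are my own, and real exporters may emit `o` or `g` lines depending on settings, so check what your OBJ actually contains):

```python
def label_vertices_from_obj(obj_lines):
    """Map each 1-based vertex index to the OBJ group/object name it belongs to.

    OBJ files list vertices ('v') globally and reference them from faces ('f')
    inside named 'g'/'o' sections, so a vertex is labelled with the group of
    the first face that uses it.
    """
    labels = {}
    current = None
    for line in obj_lines:
        parts = line.split()
        if not parts:
            continue
        if parts[0] in ("g", "o"):
            current = parts[1] if len(parts) > 1 else None
        elif parts[0] == "f" and current is not None:
            for vert in parts[1:]:
                # face entries look like "v", "v/vt" or "v/vt/vn"
                idx = int(vert.split("/")[0])
                labels.setdefault(idx, current)
    return labels
```

With my 31 body-part groups this gives a vertex-index-to-label dictionary that I can carry over to any other export of the same mesh, as long as the vertex order is preserved.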
The real problem is that the coordinates I export from Blender using the MHX rig are very different from the ones I label initially in MakeHuman.
I have corrected for the difference in orientation and made the characters lie in exactly the same position, but the coordinates are still visibly different.
I'm guessing the MHX import applies some kind of smoothing to the coordinates that the OBJ export doesn't. Can anyone tell me how to get rid of this smoothing, so that I can write a fast algorithm for matching the coordinates between the two files?
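For what it's worth, the matching step I have in mind looks roughly like this (a sketch only, assuming both exports can be read as plain N×3 vertex arrays; it centres both clouds on their centroids so a constant translation between the exports doesn't matter, then pairs each vertex with its nearest neighbour):

```python
import numpy as np

def match_vertices(src, dst):
    """For each vertex in src, return the index of the nearest vertex in dst.

    src and dst are (N, 3) and (M, 3) arrays. Both clouds are centred on
    their centroids first, then matched by brute-force nearest neighbour.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src = src - src.mean(axis=0)
    dst = dst - dst.mean(axis=0)
    # pairwise squared distances via broadcasting: (N, 1, 3) - (1, M, 3)
    d2 = ((src[:, None, :] - dst[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1)
```

The broadcasting here is O(N·M) in memory, which is fine for quick tests but won't scale to a full character mesh; for that I'd switch to something like `scipy.spatial.cKDTree`. Either way, the matching only works well if the smoothing is removed first, hence the question.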
Thanks, guys.
I'll be posting the entire mesh-labelling pipeline in the repository on my project page.
Do let me know if you find the code I've written so far useful.