(1) I cannot find anywhere a list of libraries providing human avatars that can be animated. Of course, I am aware of MakeHuman. This project was exactly what I was looking for, but development of that line was interrupted and left incomplete a few years ago, after version 0.9.1. After that, the developers shifted their focus slightly and started developing a new version (1.0) from scratch, in Python. It seems that it is no longer possible to animate the avatar directly with the software provided by the MakeHuman team. Since I need to animate the avatar myself inside my application, Blender is not a solution for me. I am not sure whether using the old version (0.9.1) of your project (even if it does almost exactly what I need) is a good idea, since it is not supported anymore. So, is there any alternative? The “only” thing I need is to provide the pose parameters and the morphological parameters, and to recover the avatar's geometry (e.g. a mesh) in order to perform computations on it.
(2) I am also wondering whether there exist libraries to apply a pose to an avatar. On the one hand, the pose parameters of MakeHuman 0.9 are angles, some of them expressed in degrees but not all of them (I also noticed that, for some joints, the unit used for positive bending is different from the one used for negative bending). On the other hand, the pose parameters provided by pose-recovery methods (for example the one shipped with Microsoft's Kinect) are the 3D Cartesian coordinates of each joint, and these joints are different from those of MakeHuman. Another difficulty is that the pose parameters derived from the Kinect are not always anatomically possible (strong asymmetry and variability of bone lengths, joint angles that are not anatomically feasible, etc.). I spent some time writing code to convert the Kinect pose parameters into MakeHuman pose parameters. My code works, but some imperfections remain, due to assumptions I made that do not completely hold in practice. It turns out that the mapping between these two sets of pose parameters is an interesting but quite complicated problem. So, I am wondering if specialized libraries exist to do that.
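To give an idea of what my converter does, here is a simplified sketch of the kind of geometric computation involved (the Vec3 type and the function are placeholders of my own, not part of any Kinect or MakeHuman API). It only recovers the magnitude of the bending at a joint from three Kinect-style joint positions; the sign of the bend and the twist around the bone are exactly where my assumptions break down.
Code:
#include <algorithm>
#include <cmath>

// Kinect-style skeletons give one 3D point per joint.
struct Vec3 { double x, y, z; };

static Vec3   sub (const Vec3 &a, const Vec3 &b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static double dot (const Vec3 &a, const Vec3 &b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static double norm(const Vec3 &v)                { return std::sqrt(dot(v, v)); }

// Bending angle at a joint (e.g. the elbow), in degrees, computed from the
// Cartesian positions of the parent joint (shoulder), the joint itself (elbow)
// and the child joint (wrist). This only gives the magnitude of the flexion:
// the sign of the bend and the axial twist are not recovered here.
double jointBendDegrees(const Vec3 &parent, const Vec3 &joint, const Vec3 &child)
{
    const Vec3 toParent = sub(parent, joint);
    const Vec3 toChild  = sub(child,  joint);
    double c = dot(toParent, toChild) / (norm(toParent) * norm(toChild));
    c = std::max(-1.0, std::min(1.0, c));   // guard against rounding noise
    return std::acos(c) * 180.0 / 3.14159265358979323846;
}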
(3) Last but not least, is there any library that can compute the motion of a human, with arbitrary morphological characteristics, walking along a given path? I saw a very nice demo at http://www.biomotionlab.ca/Demos/BMLwalker.html, but there is nothing there I can reuse. I also discovered the nice work of Joëlle Tilmanne (http://tcts.fpms.ac.be/~tilmanne), but I found nothing I can use out of the box on her webpage. I would like to get the motion for any shape of path, and to tune parameters such as the gender, the age, the weight, the height, etc. If possible, I would also like to be able to simulate gait pathologies. The more realistic the gait, the better.
To summarize, I would like to be able to do something like:
Code:
Avatar avatar;
avatar.setAge(...); avatar.setHeight(...); avatar.setGender(...);
avatar.setPoseFromMocap(...);
Mesh mesh = avatar.getMesh();
and also:
Code:
Avatar avatar;
avatar.setAge(...); avatar.setHeight(...); avatar.setGender(...);
avatar.setWalkingPath(...);
for (unsigned int frame_id = 0; frame_id < ...; ++frame_id) {
    Mesh mesh = avatar.getMesh(frame_id);
    // ... computations on the mesh for this frame ...
}
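Just to make concrete what I mean by “performing computations” on the recovered geometry: something as simple as the following would already be useful (the Mesh type here is purely hypothetical; I only need read access to the vertex positions).
Code:
#include <vector>

struct Vec3 { double x, y, z; };

// Hypothetical mesh type: whatever the library returns, I only need to be
// able to iterate over the vertex positions.
struct Mesh { std::vector<Vec3> vertices; };

// Example of the per-frame computation I have in mind: the centroid of the
// avatar's vertices (a crude proxy for the centre of mass).
Vec3 centroid(const Mesh &mesh)
{
    Vec3 c { 0.0, 0.0, 0.0 };
    for (const Vec3 &v : mesh.vertices) { c.x += v.x; c.y += v.y; c.z += v.z; }
    if (!mesh.vertices.empty()) {
        const double n = static_cast<double>(mesh.vertices.size());
        c.x /= n; c.y /= n; c.z /= n;
    }
    return c;
}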