I searched the forum for Kinect info and found none,
so I would like to bring the community's attention to one interesting direction.
As you may know, Kinect can now be used on a PC, thanks to free drivers.
There is an example, NiUserTracker, which tracks a skeleton:
https://github.com/OpenNI/OpenNI/tree/m ... serTracker — see the demo on YouTube, http://www.youtube.com/watch?v=vI7iLmLDdoA, for how it looks (note that Kinect has not only a depth camera but also a true video camera).
Now, what comes to mind?
There is an approach for fitting a mesh to video:
http://openmesh.org/uploads/media/Hornung_TR10.pdf
But with the depth information that Kinect provides, plus the already existing library of human features in MakeHuman, motion tracking in MakeHuman/Blender might become a real-time desktop solution:
using the depth camera, it should be possible to roughly fit a MakeHuman character to the human in the scene quite easily; then, once the mesh is fitted, to track it using both the video and the depth information.
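Just to illustrate the rough-fitting step: the classic way to rigidly align a template point set (the MakeHuman vertices) to a depth point cloud is an ICP-style loop. This is only a NumPy sketch under my own assumptions, not MakeHuman or Blender API code, and a real solution would also need non-rigid deformation of the mesh; but it shows the basic idea of iterating nearest-neighbour matching and a best rigid fit.

```python
import numpy as np

def rigid_fit(src, dst):
    """Best rotation R and translation t mapping paired points src onto dst
    (both shape (N, 3)), via the standard SVD (Kabsch) solution."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)          # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T)) # avoid a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

def icp(template, cloud, iters=30):
    """Iteratively align template vertices to a depth point cloud.
    Brute-force nearest neighbours: fine for a sketch, far too slow
    for real-time use on a full Kinect frame."""
    src = template.copy()
    for _ in range(iters):
        # nearest cloud point for every template vertex
        d2 = ((src[:, None, :] - cloud[None, :, :]) ** 2).sum(axis=-1)
        matched = cloud[d2.argmin(axis=1)]
        R, t = rigid_fit(src, matched)
        src = src @ R.T + t
    return src
```

In a real pipeline the depth image would first be back-projected to a 3D point cloud using the camera intrinsics, and the nearest-neighbour search would use a k-d tree instead of the brute-force distance matrix.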
Of course, I understand that such a task could take quite a bit of time to implement. Still, once implemented, it would open a motion-capture-for-the-masses workflow: currently, as you can see in the YouTube video, Kinect's skeleton tracking is not precise, but with mesh fitting plus video hints it might output quite good motion capture data.