Yes, this is a frequent discussion. The basic summary is: it's not that easy.
First, there is no "statistical" model in the sense that it would be based on a stochastic distribution of values. There is a 3D model which has been hand-designed to look the way an artist thinks a human looks. That baseline is "base.obj" here:
https://github.com/makehumancommunity/m ... ata/3dobjs
Second, all transforms are completely deterministic. If you have a target that moves the vertex on the nose tip 0.1 along the Y axis, then it will always do exactly that. When that target is applied at 100%, the vertex will be shifted 0.1 compared to a state where it is not applied. At 50% it will be shifted 0.05. This, and exactly this, will always happen, independently of which other targets have already been applied.
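As a toy sketch of that behavior (illustrative only, not MakeHuman's actual code): a target stores a fixed offset per affected vertex, which is scaled by the target's weight and added to wherever the vertex currently is.

```python
def apply_target(vertices, target_offsets, weight):
    """Shift each affected vertex by weight * offset, regardless of
    which other targets were applied before."""
    result = dict(vertices)
    for vid, (dx, dy, dz) in target_offsets.items():
        x, y, z = result[vid]
        result[vid] = (x + weight * dx, y + weight * dy, z + weight * dz)
    return result

# A target that moves a nose-tip vertex 0.1 along Y
# (vertex id 42 is made up for the example):
nose_target = {42: (0.0, 0.1, 0.0)}
base = {42: (0.0, 5.0, 1.0)}

full = apply_target(base, nose_target, 1.0)  # y shifted by 0.10
half = apply_target(base, nose_target, 0.5)  # y shifted by 0.05
```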
The last part is important, and herein lies the problem for you. A target only cares about its own transforms and does not take into account which other transforms have already been made.
This is what makes "metric" values very, very tricky. Sure, you can make a target that expands the waist circumference by 20 centimeters on the baseline model by moving each relevant vertex around the waist some distance outward in the XY plane. However, if a model has already had another target applied, such as the weight macro target or the pregnant belly, then that modified shape is the baseline instead. The vertices will still shift exactly in the manner you specified, but the end result will probably be different from what you saw when you designed the target.
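A toy 2D illustration of that drift (again not MakeHuman code): model the "waist" as a ring of points, and tune fixed offset vectors so that the baseline ring's circumference grows by a known amount. Applied after another target has already reshaped the ring, the very same offsets produce a different circumference change.

```python
import math

N = 128  # number of vertices around the ring

def ring(radius):
    return [(radius * math.cos(2 * math.pi * i / N),
             radius * math.sin(2 * math.pi * i / N)) for i in range(N)]

def circumference(points):
    return sum(math.dist(points[i], points[(i + 1) % N]) for i in range(N))

def apply(points, offs):
    return [(x + dx, y + dy) for (x, y), (dx, dy) in zip(points, offs)]

baseline = ring(1.0)
# Target offsets: push every vertex 0.5 outward, radially *for the baseline*.
offsets = [(0.5 * math.cos(2 * math.pi * i / N),
            0.5 * math.sin(2 * math.pi * i / N)) for i in range(N)]

# On the baseline, the circumference grows by ~2*pi*0.5 ~= 3.14 units.
delta_baseline = circumference(apply(baseline, offsets)) - circumference(baseline)

# Pretend a "weight" target already stretched the waist along X first:
stretched = [(3 * x, y) for x, y in baseline]
delta_stretched = circumference(apply(stretched, offsets)) - circumference(stretched)

# Same offsets, same weight, but the circumference gain is smaller now.
print(delta_baseline, delta_stretched)
```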
So, that's describing the problem. Is there a solution?
I think you're toast if you try to design a system that both needs to be completely predictable and takes arbitrary targets into account for its baseline. Maybe you could limit the scope to one or a few exact baselines instead and work from there? Maybe one male and one female? If you know what the baseline is, then a comparison between that baseline and your x-ray images should be reasonably easy to convert into targets.
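A hedged sketch of that last step: with one known, fixed baseline mesh, a target is simply the per-vertex difference between the measured shape and that baseline. Function and variable names here are illustrative, not part of MakeHuman's API.

```python
def build_target(baseline, measured, epsilon=1e-6):
    """Return {vertex_id: (dx, dy, dz)} for vertices that moved."""
    offsets = {}
    for vid, (bx, by, bz) in baseline.items():
        mx, my, mz = measured[vid]
        delta = (mx - bx, my - by, mz - bz)
        if any(abs(c) > epsilon for c in delta):
            offsets[vid] = delta
    return offsets

# Example with two made-up vertices: only the vertex that actually
# moved ends up in the target.
baseline = {1: (0.0, 0.0, 0.0), 2: (1.0, 0.0, 0.0)}
measured = {1: (0.0, 0.0, 0.0), 2: (1.0, 0.2, 0.0)}
target = build_target(baseline, measured)  # {2: (0.0, 0.2, 0.0)}
```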