modeling parameters, dependencies and scale

Postby PinkLara333 » Mon Oct 02, 2017 3:39 pm

Hello, we are trying to use MakeHuman in some research into body representation, and I have a few questions I could not find answers to in the documentation or by searching the forum. Many apologies if I have missed something, as I may be using the wrong search terms. I am a complete beginner, so sorry for any silly questions.

First, I have written a Python script that changes various body part measurements systematically according to values imported from a .csv file, and saves a .png file for each body. I can see a list of potential parameters to change by calling MHScript.getModelingParameters(). But is there any documentation describing which are independent and which are dependent, i.e. their hierarchical structure? I gather the ones labelled macrodetails obviously change many of the lower-order parameters, but do the ones beginning with 'measure' (e.g. 'measure/measure-thigh-circ-decr|incr') alter other body parts too, or are they relatively independent? I am aiming to systematically vary a number of basic measurements (e.g. leg length, hip circumference, waist circumference) as independently as possible, but I am unsure which parameters to use out of the full list.
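(For reference, here is a stripped-down sketch of what my script does, run from the Scripting tab. Only getModelingParameters() is something I have actually mentioned above; the CSV path and the setter call shown here are simplified placeholders rather than my exact code, so they would need to match whatever the scripting API really provides.)

Code:
import csv

CSV_PATH = "bodies.csv"  # made-up path: one row per body, one column per modeling parameter

with open(CSV_PATH) as f:
    for body_number, row in enumerate(csv.DictReader(f)):
        for name, value in row.items():
            # MHScript is the object the Scripting tab exposes;
            # updateModelingParameter is a placeholder setter name
            MHScript.updateModelingParameter(name, float(value))
        # ...then save this body, e.g. as "body_%04d.png" % body_number, using
        # the same screenshot call the script already relies on (omitted here)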

My other question regards the scale/range of the measurements. First, for each modeling parameter, is there a document telling me which range of values it accepts, e.g. 0 to 1 or -1 to 1? Second, what determines the two end-points (what are the extremes based on), and what does the midpoint reflect (i.e. for a 0-to-1 scale, what does 0.5 represent)? I am guessing that if I set a value to 0.5 on a 0-to-1 scale, there should be no change to the model that I have loaded? Or does it adjust the model to some pre-defined 'average' value? What decides how 'extreme' the ends of the scales are? I assume the scales were originally based on some population sample of body measurements, but it would be great to know whether the extreme ends of the 'measure' parameters reflect, for example, +-4 standard deviations from the mean of whatever sample was used.

I hope my questions are clear and any help would be greatly appreciated.
PinkLara333
 
Posts: 7
Joined: Mon Oct 02, 2017 3:32 pm

Re: modeling parameters, dependencies and scale

Postby blindsaypatten » Mon Oct 02, 2017 4:56 pm

I'm sure you will get a more authoritative answer, but here is my understanding purely as a user:

Outside of the parameters on the Macro tab, there are two types of sliders: ones that have their default value in the middle, and ones that have their default value on the left. The head shape sliders are the only ones I can think of that have the default on the left. Where the default is in the middle, there are two targets, with neither applied when the slider is in the middle (default) position; when the slider is moved to the left of center one target is applied, and when it is moved to the right of center the other is applied. Center has a value of 0, full left has a value of -1, and full right has a value of 1.

The sliders are all independent in the sense that they each have their own target (or pair of targets); outside of the Macro tab, none of the sliders operate by affecting the values of other sliders.

The Macro tab is a special case, where which targets are used depends on the race, gender, and age ranges. I won't go into details, as it doesn't sound like you will be working with those.

Outside of the Macro tab, head shape is the only set of sliders I can think of that affects multiple things at once, and that is still restricted to head shape. The Measure tab sliders seem to be quite independent of one another.

All of the sliders go either from -1 to 0 to 1, or just from 0 to 1. In Blender you can apply arbitrary values, and you can likely do the same with the MakeHuman API.

I haven't seen anything to indicate that the ranges are rigorously based on empirical data, but who knows.
blindsaypatten
 
Posts: 586
Joined: Tue Mar 14, 2017 11:16 pm

Re: modeling parameters, dependencies and scale

Postby joepal » Mon Oct 02, 2017 8:31 pm

First: drop all parallels and associations between targets and real-world measurements. There are no such connections, other than largely accidental ones. Thinking along the lines of "population samples" and "standard deviations" will therefore likely lead you astray.

The range is from "morph not applied at all" to "morph fully applied". For a longer discussion of what a "morph" is here, see the documentation on targets, http://www.makehumancommunity.org/wiki/ ... _target%3F and http://www.makehumancommunity.org/wiki/ ... MakeTarget

There are two different slider scales: -1 to +1, and 0 to +1. In the first case there are two opposing targets (usually named "...-decr" and "...-incr"), one which is applied when going below zero, and one which is applied when going above zero. In the second case there is only one target, which is applied when going above zero.

A slider value of 0.5 (or -0.5) always means that a target is 50% applied, i.e. the vertices have moved 50% of the distance to their target destination.
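To make the arithmetic concrete, here is a schematic sketch of that mechanism. It is not MakeHuman's internal code, just the idea: a target is a set of per-vertex offsets, and the slider value scales those offsets (a -1 to +1 slider simply chooses between the "...-decr" and the "...-incr" offset sets).

Code:
import numpy as np

def apply_target(base_coords, target_offsets, weight):
    # base_coords and target_offsets are (n, 3) numpy arrays.
    # weight 0.0 = target not applied, 1.0 = fully applied,
    # 0.5 = vertices have travelled 50% of the way to their target positions.
    return base_coords + weight * target_offsets

def apply_slider(base_coords, decr_offsets, incr_offsets, value):
    # A -1..+1 slider applies one of two opposing targets by |value|;
    # a 0..+1 slider is the same thing with only an "incr" target.
    if value >= 0:
        return apply_target(base_coords, incr_offsets, value)
    return apply_target(base_coords, decr_offsets, -value)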

Which targets are applied by a slider is defined in the JSON files found here: https://bitbucket.org/MakeHuman/makehum ... at=default

The targets themselves are available here: https://bitbucket.org/MakeHuman/makehum ... at=default
Joel Palmius (LinkedIn)
MakeHuman Infrastructure Manager
http://www.palmius.com/joel
joepal
 
Posts: 4469
Joined: Wed Jun 04, 2008 11:20 am

Re: modeling parameters, dependencies and scale

Postby PinkLara333 » Tue Oct 03, 2017 4:01 pm

Thank you both, this is exactly the information that I needed. I have one final question. I now understand that there is no clear, direct link between the MakeHuman modifications and real life, but I think I can at least get some feeling for how the modifications map onto real-life measurements and then, using an anthropometric database, get a rough idea of how 'extreme' the ends of the scales are. To do this I will be looking at the info in the Measure tab to get dimensions in cm. Is there any way of getting widths rather than circumferences? My anthropometric data are all 2D widths, unfortunately! Many thanks again.
PinkLara333
 
Posts: 7
Joined: Mon Oct 02, 2017 3:32 pm

Re: modeling parameters, dependencies and scale

Postby joepal » Tue Oct 03, 2017 4:43 pm


My suggestion would be to write a MakeHuman plugin for calculating the distance between two given vertex positions. Since all vertex positions are available in a simple two-dimensional array (vertex number / xyz coordinate), you can get the coordinates and then calculate the distance once you have the pertinent vertex numbers.

If you use MHAPI (see https://github.com/makehumancommunity/c ... gins-mhapi), you can get the array of vertex coordinates by:

Code:
from core import G
api = G.app.mhapi  # MHAPI entry point hanging off the running MakeHuman application
# one row per vertex, holding that vertex's x, y, z coordinates
allVertexCoordinatesInOneBigArray = api.mesh.getVertexCoordinates()


To figure out the number of a particular vertex, open "base.obj" (found here https://bitbucket.org/MakeHuman/makehum ... at=default) in Blender (it is a Wavefront .obj, so you "import" it). Then enter Edit Mode, select the desired vertex, and write down its xyz coordinates. Search for the same coordinates in the base.obj file (Wavefront files are plain text files), take note of the line number where you find them, and subtract the number of comment lines at the beginning of the file. The remaining number is the vertex number.
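If you want to avoid searching the file by hand, a small helper along these lines can do the counting for you (plain Python, nothing MakeHuman-specific; note that it counts zero-based, as a Python array would, so watch out for an off-by-one relative to the manual line-counting method):

Code:
def find_vertex_number(obj_path, x, y, z, tol=1e-4):
    # Return the zero-based index of the vertex whose coordinates match
    # (x, y, z) within tol, or None if nothing matches.
    index = 0
    with open(obj_path) as f:
        for line in f:
            if line.startswith("v "):  # only vertex lines count; comments and faces are skipped
                vx, vy, vz = (float(part) for part in line.split()[1:4])
                if abs(vx - x) < tol and abs(vy - y) < tol and abs(vz - z) < tol:
                    return index
                index += 1
    return None

# e.g. find_vertex_number("base.obj", 0.1234, 7.6543, 0.9876)  -- made-up coordinates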

The distance will obviously be in internal units, but the conversion to centimeters is basically a matter of which scale you assign.
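Putting the pieces together, the calculation itself is just a Euclidean distance. In the sketch below the vertex numbers are placeholders to be replaced by ones you have looked up, and the scale factor is something you calibrate yourself, for instance against a measurement from the Measure tab that you trust:

Code:
import math

LEFT_HIP = 1234    # placeholder vertex numbers -- replace with looked-up values
RIGHT_HIP = 5678
CM_PER_INTERNAL_UNIT = 1.0   # placeholder scale factor -- calibrate against a known measurement

def vertex_distance(coords, i, j):
    # Euclidean distance between vertices i and j, in internal units
    x1, y1, z1 = coords[i]
    x2, y2, z2 = coords[j]
    return math.sqrt((x1 - x2) ** 2 + (y1 - y2) ** 2 + (z1 - z2) ** 2)

hip_width_cm = CM_PER_INTERNAL_UNIT * vertex_distance(
    allVertexCoordinatesInOneBigArray, LEFT_HIP, RIGHT_HIP)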
Joel Palmius (LinkedIn)
MakeHuman Infrastructure Manager
http://www.palmius.com/joel
joepal
 
Posts: 4469
Joined: Wed Jun 04, 2008 11:20 am

Re: modeling parameters, dependencies and scale

Postby blindsaypatten » Tue Oct 03, 2017 8:31 pm

In Blender you can select a vertex (or set of vertices), press Space, and type "print vertex numbers".
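If that operator isn't available in a given Blender version, the same information can be printed from Blender's Python console using standard bpy/bmesh calls (the mesh has to be in Edit Mode):

Code:
import bpy
import bmesh

obj = bpy.context.edit_object                  # the mesh object currently being edited
bm = bmesh.from_edit_mesh(obj.data)            # bmesh view of its edit-mode data
print([v.index for v in bm.verts if v.select])  # indices of the selected vertices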
blindsaypatten
 
Posts: 586
Joined: Tue Mar 14, 2017 11:16 pm

Re: modeling parameters, dependencies and scale

Postby joepal » Wed Oct 04, 2017 8:08 am

Thanks, didn't know that. That should be a lot easier.
Joel Palmius (LinkedIn)
MakeHuman Infrastructure Manager
http://www.palmius.com/joel
joepal
 
Posts: 4469
Joined: Wed Jun 04, 2008 11:20 am

Re: modeling parameters, dependencies and scale

Postby PinkLara333 » Mon Oct 09, 2017 11:40 am

Thank you all for your help! I will try to work the distances out in Blender as you advise. Thanks again!
PinkLara333
 
Posts: 7
Joined: Mon Oct 02, 2017 3:32 pm

