Hi,
I'm trying to use MakeHuman to produce experimental stimuli for studying person perception. I'm hoping to run a perceptual adaptation experiment in which body size can be adjusted from very skinny to very obese.
For this, I'm basically just making various identities and saving multiple copies: the first has every slider relating to body fat turned all the way down, and each subsequent copy has these sliders gradually increased, until there are around 20 versions of the body. I'd then do a screen-grab of each body on a white background and use these images to build the experiment.
I'm wondering if there's a more objective/precise way to go about this than just eyeballing the smallest slider movement I can manage for each new body-size step.
Some sliders will need to move from minimum to maximum, while others will only move from 0% to 50%, etc.
It's not a huge issue if there's some variability in the intervals between the bodies, but it may limit what I can conclude experimentally to some extent.
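One idea I've been toying with, in case it's useful to anyone: rather than moving sliders by hand, edit the saved .mhm files directly, since they appear to be plain text with lines of the form "modifier <name> <value>". That would let me interpolate each fat-related modifier linearly between its start and end value, so all 20 steps are exactly evenly spaced. A rough sketch in Python — the modifier names and value ranges below are just placeholders I made up; the real names would need to be copied from an actual saved .mhm file:

```python
# Sketch: generate evenly spaced body-size steps by rewriting the
# "modifier" lines of a saved MakeHuman .mhm file, instead of
# eyeballing slider movements in the GUI.
# ASSUMPTION: .mhm files store lines like "modifier <name> <value>";
# the modifier names/ranges below are illustrative placeholders.

N_STEPS = 20

# (modifier name, value at skinniest step, value at most obese step)
FAT_MODIFIERS = [
    ("macrodetails-universal/Weight", 0.0, 1.0),  # full range, min to max
    ("stomach/StomachPointiness",     0.0, 0.5),  # partial range, 0-50%
]

def body_at_step(i, n_steps=N_STEPS):
    """Return {modifier: value} for step i, linearly interpolated."""
    t = i / (n_steps - 1)  # 0.0 at the first step, 1.0 at the last
    return {name: lo + t * (hi - lo) for name, lo, hi in FAT_MODIFIERS}

def rewrite_mhm(template_lines, values):
    """Replace matching 'modifier' lines in an .mhm template."""
    out = []
    for line in template_lines:
        parts = line.split()
        if len(parts) == 3 and parts[0] == "modifier" and parts[1] in values:
            out.append(f"modifier {parts[1]} {values[parts[1]]:.4f}")
        else:
            out.append(line)
    return out

# Usage: read the skinniest saved identity once, write 20 stepped copies.
# with open("identity01_base.mhm") as f:
#     template = f.read().splitlines()
# for i in range(N_STEPS):
#     with open(f"identity01_step{i:02d}.mhm", "w") as f:
#         f.write("\n".join(rewrite_mhm(template, body_at_step(i))) + "\n")
```

The nice property is that every interval between adjacent bodies is identical for each modifier, so the step size is defined numerically rather than by eye.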
Any input would be greatly appreciated.