=== GUI languages and translations ===
 
Our GUI is available in many languages, but translations are not yet complete.
 
Contributing a new language is now very easy, since the MakeHuman project is available for translation on !LINK!http://www.transifex.com/organization/makehuman/ -- Transifex!/LINK!.
 
!IMAGE!Pictures/languages.png!/IMAGE!
 
Transifex is a web application that makes localization easy and agile.
 
!IMAGE!Pictures/transifex.png!/IMAGE!
 
If you want to help the MakeHuman project by translating the GUI into your language, you first need to create an account on Transifex. When logged in, go to !LINK!https://www.transifex.com/projects/p/makehuman/ -- the MakeHuman Transifex page!/LINK! and click the language you want to translate into, then click the "Join team" button. Now you can click the current release name (for example "Alpha 8") and click "Translate now".
 
If you want to make a translation into a language that is not yet listed, click "request language" on the !LINK!https://www.transifex.com/projects/p/makehuman/ -- MakeHuman Transifex page!/LINK!. We will make sure to accept it as fast as possible, so you can start translating.
 
Translating is quick and easy. Select the "Untranslated strings" filter to show only the entries left to translate. Click the first entry on the left, and type your translation in the input box in the center of the screen. When done typing, simply press the TAB key on your keyboard, and you automatically move to the next entry. Repeat this process until everything is translated.
 
 
 
You can leave entries you are uncertain about open and return to them later; perhaps others know a good solution. You can interrupt your work at any time and continue later, or leave it for others to finish.
 
If you already have a translated file on your hard disk (for example, you made modifications to an existing MakeHuman language .ini file, or you have filled in a .missing language file), you can upload it as a translation and Transifex will automatically include it.
 
You will also be able to download the json file of your language, in order to put it in makehuman/data/languages. On restarting MakeHuman, the new language will be available as an option under Settings.
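As far as the file format goes, the language file is a flat JSON dictionary mapping English source strings to their translations. A hypothetical two-entry example (the actual keys depend on the release):

    {
        "Gender": "Geschlecht",
        "Settings": "Einstellungen"
    }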
 
 
 
 
 
=== MHBlenderTools: MakeWalk basic workflow ===
 
MakeWalk is a Blender add-on for retargeting mocap data (.bvh files) to a given armature.
 
 
==== Retargeting: how it works ====
 
 
The goal of retargeting is to transfer a motion from a source armature (e.g. from a BVH file) to a given target armature (e.g. the MHX rig). However, it is not straightforward to assign the source action to the target rig, even if the bones have identical names. The motion of each bone is specified in local coordinates, relative to the parent and the bone's own rest pose. If the rest poses of the source and target armatures differ, the source F-curves cannot be used directly by the target armature.
 
!IMAGE!Pictures/makewalk-1.png!/IMAGE!
 
The picture above shows a transformation in the local coordinate system. Since the parent's local Y points along its axis and its local Z points up, the child bone is rotated around the local X axis. This is not very useful if the target armature has a different rest pose. To retarget the pose, we therefore re-express the transformation in the global coordinate system, as shown below. The local X rotation corresponds to a global Y rotation, and by a different angle. Once the global transformation matrix is known, we can re-express it in the target bone's local coordinate system.
 
 
 
The retargeting process thus consists of making two coordinate transformations:
 
Source local => Global => Target local.
 
This will ensure that the source bone and the target bone will have the same global orientation.
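Schematically, and ignoring parent transforms (a simplified sketch of the idea, not MakeWalk's actual code path): if R_src and R_tgt denote the global rest-pose matrices of the source and target bone, and M_src is the source bone's local pose matrix, then

    M_global = R_src · M_src · R_src^-1

    M_tgt = R_tgt^-1 · M_global · R_tgt

gives a target local pose matrix M_tgt with the same global orientation as the source.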
 
!IMAGE!Pictures/makewalk_2.png!/IMAGE!
 
Unfortunately, things are a little more complicated. We do not always want the source and target bones to have identical orientation. In particular, the root or pelvis bone may point in entirely different directions in different armatures. E.g. in the CMU armature rest pose the pelvis points forward-down, and in the MHX rig it points straight up.
 
!IMAGE!Pictures/makewalk-3.png!/IMAGE!
 
If we insisted that the root bone in the MHX rig point in the same direction as in the CMU rig, the retargeting would not be very successful, as shown in the figure above. If we instead keep the rotation offset from the rest poses, the target pose becomes much better, as shown below.
 
To calibrate the source and target armature against each other, MakeWalk introduces extra keyframes at frame 0, where both armatures are posed in T-pose.
 
!IMAGE!Pictures/mcp-ret-060-calibrate.png!/IMAGE!
 
In the rest of the animation, bones in the target armature copy the global rotations of the source armature, apart from differences present in the T-poses. In this way we can transfer animations from CMU to the MHX rig, despite the fact that the rest poses are very different.
 
 
 
==== Basic Workflow ====
 
 
=== Retargeting ===
 
The MakeWalk panels appear in the tool shelf whenever an armature is the active object. Select the armature and press the Load And Retarget button. In the file selector, select the .bvh file. We choose the file 90_04.bvh from the CMU database. It is a cartwheel animation.
 
!IMAGE!Pictures/makewalk-4.png!/IMAGE!
 
After a short wait, the armature is doing gymnastics.
 
!IMAGE!Pictures/makewalk-5.png!/IMAGE!
 
At frame 0 of the animation the armature has been placed in T-pose. This is not part of the original .bvh file, but is inserted by MakeWalk to calibrate the source armature (defined by the bvh file) and the target armature (the selected armature in the viewport) against each other.
 
=== Supported armatures ===
 
MakeWalk works with most straightforward biped rigs with FK arms and legs, such as the Rigify meta-rig. There is also built-in support for some more complex rigs: the advanced MHX rig from MakeHuman alpha 8, the MHX rig from MakeHuman alpha 7, and Rigify.
 
!IMAGE!Pictures/makewalk-6.png!/IMAGE!
 
It is often possible to use MakeWalk with other complex rigs, but in that case the automatic bone identification may fail. If so, a bone map must be defined manually; see !LINK!http://makehuman.org/doc/node/defining_the_target_rig_manually.html -- Defining a Target Rig Manually!/LINK!.
 
=== Troubleshooting ===
 
Retargeting is a rather involved subject, and it can sometimes result in poor motion. The process may even fail completely, usually because MakeWalk failed to automatically identify the bones of a complex rig. If this should happen, see !LINK!http://www.makehuman.org/doc/node/makewalk_errors_and_corrective_actions.html -- Errors and Corrective Actions!/LINK!.
 
 
==== Where to find BVH files ====
 
 
There are several different formats that mocap files can be stored in. MakeHuman's mocap tool can only deal with files in the Biovision BVH format. BVH files can be bought from many commercial sources, but a large range of mocap files is also available for free download.
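For orientation, a BVH file is a plain-text file with a HIERARCHY section that defines the skeleton and a MOTION section with one line of channel values per frame. A heavily truncated sketch:

    HIERARCHY
    ROOT Hips
    {
        OFFSET 0.00 0.00 0.00
        CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
        JOINT Spine
        {
            ...
        }
    }
    MOTION
    Frames: 120
    Frame Time: 0.033333
    0.00 37.25 0.00 0.0 0.0 0.0 ...

Here are some sites I found useful.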
 
* CMU Graphics Lab Motion Capture Database: Hosted at Carnegie-Mellon University, this is a huge library of mocap files which can be downloaded for free. The web address is !LINK!http://mocap.cs.cmu.edu/ -- http://mocap.cs.cmu.edu!/LINK!. CMU hosts mocap files in three formats: tvd, c3d and amc. However, the mocap tool can only read BVH files, so none of these files can be used directly. Fortunately, B. Hahne at !LINK!http://www.cgspeed.com/ -- www.cgspeed.com!/LINK! has converted the CMU files to BVH. The converted files are located at !LINK!http://sites.google.com/a/cgspeed.com/cgspeed/motion-capture -- http://sites.google.com/a/cgspeed.com/cgspeed/motion-capture!/LINK!.
 
 
 
* Advanced Computing Center for the Arts and Design (ACCAD): Hosted at the Ohio State University, this is another great source of free mocap files. BVH files can be downloaded from !LINK!http://accad.osu.edu/research/mocap/mocap_data.htm -- http://accad.osu.edu/research/mocap/mocap_data.htm!/LINK!.
 
* Eyes Japan (mocapdata.com): This is a Japanese company that sells mocap data commercially, but they also offer a huge number of motions for free. According to their homepage, mocapdata.com provides 744 premium motion files and 4197 free motion files. The only catch is that downloading requires registration. Not surprisingly, the homepage of mocapdata.com has the address !LINK!http://www.mocapdata.com/ -- http://www.mocapdata.com/!/LINK!.
 
 
 
* The Trailer's Park: Free mocap data can also be found at the Trailer's Park, !LINK!http://www.thetrailerspark.com/ -- http://www.thetrailerspark.com!/LINK!. This site does not offer original data, but offers repacks of mocap data from other free sites for download. Free downloads are limited to about five packs per day, so some patience is required here.
 
 
 
* Hochschule der Medien, Universität Bonn (HDM): !LINK!http://www.mpi-inf.mpg.de/resources/HDM05/ -- http://www.mpi-inf.mpg.de/resources/HDM05!/LINK!
 
* The Perfume global site project #001: !LINK!http://perfume-dev.github.com/ -- http://perfume-dev.github.com/!/LINK!
 
 
 
=== MHBlenderTools: MakeWalk user interface ===
 
 
==== The user interface ====
 
 
The user interface of MakeWalk is located under the Armature tab, and becomes visible when an armature is selected. It consists of six panels; the first one is open by default and the others are closed.
 
!IMAGE!Pictures/makewalk-7_0.png!/IMAGE!
 
* Load And Retarget: Select a BVH file and retarget it to the active armature.
 
* Start Frame: The first frame in the BVH file to be considered.
 
* Last Frame: The last frame to be considered, unless the animation stops earlier. The difference last_frame - first_frame is the maximal number of frames after retargeting. The number of frames in the BVH file may be larger if some frames are skipped due to subsampling.
 
* Detailed steps: When this option is selected, further buttons are shown below.
 
* Load BVH File (.bvh). Load a BVH file, and create an animated armature from it.
 
* Rename And Rescale BVH Rig. With the BVH armature active, and a target armature selected, rename and rescale the bones of the active armature to fit the target.
 
* Load And Rename BVH File (.bvh). A combination of the previous two buttons. With a target armature active, load a BVH file, and create an animated armature with renamed and rescaled bones.
 
* Retarget Selected To Active. Retarget the animation from a renamed and rescaled BVH armature to the active armature.
 
* Simplify FCurves. Simplify the F-curves of the active armature.
 
* Rescale FCurves. Rescale F-curves of the active armature.

What if retargeting fails?
 
MakeWalk is designed to retarget animations to a given armature with a minimum of user intervention. However, retargeting is a complex process, and entirely automatic retargeting may fail or result in suboptimal motion. Information about how to identify and correct problems is found in !LINK!/doc/node/blendertools_makewalk_troubleshooting.html -- Errors and Corrective Actions!/LINK!.
 
A common problem is that automatic identification of bones in the target armature fails. A bone map can then be assigned manually, cf. !LINK!/doc/node/blendertools_makewalk_troubleshooting.html -- Defining the Target Rig Manually!/LINK!.
 
 
==== Options panel ====
 
 
!IMAGE!Pictures/makewalk-8.png!/IMAGE!
 
* Use Default Subsample. Blender normally plays the animation at 24 fps or 25 fps, but the animation in the BVH file may be recorded at a different speed. In particular, the BVH files from CMU were filmed at 120 fps. Enable this option to have the animation play at its natural speed, irrespective of the frame rate in the BVH file. Other subsample options are described below.
 
* Auto scale. Set the scale automatically based on the size of the left thigh. This choice has two motivations:
 
* Almost all characters have a left leg.
 
* The leg size is crucial for making walk cycles look good.
 
 
* Scale. The default MakeHuman scale is decimeters: 1 unit = 1 decimeter. Translations in a BVH file are expressed in different units; often the base unit is inches, meters or centimeters, but more obscure units can also occur, e.g. in BVH files from CMU. For example, a file whose translations are in inches needs a scale factor of 0.254 to match MakeHuman decimeters (1 in = 2.54 cm = 0.254 dm). If the scale is set incorrectly, rotations will still be correctly retargeted, but the character will appear to take giant leaps or minuscule steps.
 
* Use Limits: If this option is enabled, MakeWalk honors any Limit Rotation constraints, and will not allow excessive rotations. If the animation in the .bvh file exceeds some rotation limits, this makes the retargeted animation less faithful. On the other hand, the rig may not be built for excessive rotations, so unchecking this option can lead to other problems.
 
* Unlock Rotation: If this option is disabled, MakeWalk honors any rotation locks. If the animation in the .bvh file bends around locked axes, this makes the retargeted animation less faithful. If Unlock Rotation is enabled, any X or Z rotation locks are disabled. Y rotation locks (bone twisting) are never disabled; the reason is that in the MHX and Rigify rigs, forearm rotation is handled by deform bones controlled by hand twisting.
 
* Auto source rig. The source rig (i.e. the armature defined by the BVH file) is specified in the !LINK!/doc/node/blendertools_makewalk_source_and_target_armature.html -- Source Armature panel!/LINK!. Enable this option if the mocap tool should attempt to automatically identify the source rig, based on the structure of the bone hierarchy.
 
* Auto target rig. The target rig (i.e. the armature in the blend file) is specified in the !LINK!/doc/node/blendertools_makewalk_source_and_target_armature.html -- Target Armature panel!/LINK!. Enable this option if the mocap tool should attempt to automatically identify the target rig, based on the structure of the bone hierarchy.
 
* Ignore Hidden Layers. Ignore bones on hidden layers when identifying the target rig.
 
=== Subsample and Rescale ===
 
If the Use Default Subsample option is set, the mocap tool will rescale the animation to fit the current frame rate. However, there are at least two reasons why you may want to load an animation at a different frame rate:
 
* To obtain a slow-motion or rapid-motion effect.
 
* To quickly load an animation to see if the gross features will work out.
 
If the Use Default Subsample option is disabled, the SubSample section becomes visible.
 
* Subsample. Enable subsampling.
 
* Subsample Factor. If the value of this property is n, only every n-th frame of the BVH animation is loaded.
 
* Rescale. Enable rescaling.
 
* Rescale Factor. If the value of this property is n, the time distance between keyframes is changed to n.
 
* Rescale FCurves. Apply the settings above to existing F-curves rather than to the loaded animation.
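As a worked example, using the frame rates stated above: loading a 120 fps CMU file into a 24 fps Blender scene corresponds to a subsample factor of 120/24 = 5, so only every fifth BVH frame is kept and the motion plays at its natural speed. A rescale factor of 2 then doubles the time distance between the remaining keyframes, slowing the motion to half speed.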
 
Rescaling differs from simply scaling F-curves in the F-curve editor, because jumps in the F-curves are handled correctly (cf. Rescale FCurves in the Global Edit section below).
 
=== Simplification ===
 
* Simplify FCurves. Remove unnecessary keyframes.
 
* Max Loc Error. The maximal allowed error for location keyframes, in Blender units. A larger error results in fewer keyframes but less accuracy.
 
* Max Rot Error. The maximal allowed error for rotation keyframes, in degrees. A larger error results in fewer keyframes but less accuracy.
 
* Only Visible. Simplification only affects F-curves visible in the Graph editor.
 
* Only Between Markers. Simplification only affects F-curves between the two outermost selected markers. The timeline must have at least two selected markers.
 
 
==== Edit panel ====
 
 
Loading and retargeting is normally only the first step in the creation of an animation from mocap data. There are many reasons why a loaded animation may not behave exactly the way you want: artifacts in the mocap data, differences in armature structure not compensated for correctly by the retargeting process, differences in body stature between the mocap actor and the target character, or simply that the filmed sequence does not do exactly what you intend. It is of course possible to edit the action directly in the graph editor, but this is impractical due to the amount of mocap data. The mocap tool offers several possibilities to edit an action at a higher level. These tools are collected in the Edit Action panel, which is located just below the Options panel.
 
!IMAGE!Pictures/edit-action.png!/IMAGE!
 
=== Inverse Kinematics ===
 
* Transfer FK => IK: The load and retarget steps transfer an animation from a bvh file to the target character. However, only the FK bones are animated. Press this button to transfer the FK animation to the IK bones. This only works for the advanced MHX armature. If two markers are selected, only the animation between the markers is transferred.
 
* Transfer IK => FK: Transfer the animation back from the IK bones to the FK bones. Useful if the IK animation has been edited.
 
* Clear IK Animation: Remove all keyframes from all IK bones (arms and legs).
 
* Clear FK Animation: Remove all keyframes from all FK bones (arms and legs).
 
=== Global Edit ===
 
* Shift Animation. Shift the keys for the selected bones at all keyframes. If two markers are selected, only the keyframes between the markers are shifted.
 
* X, Y, Z: The location channels affected by the next button.
 
* Fixate Bone Locations: Replace all location keys by their average. Only selected bones and keyframes between selected markers are affected.
 
* Rescale Factor: The factor used by the next button.

* Rescale FCurves: Rescale all F-curves by the factor above. This is similar to scaling F-curves in the curve editor, but jumps are treated correctly. E.g., rotations of +180 degrees and -180 degrees represent the same pose, but naively scaling an F-curve by a factor of two would give an intermediate keyframe the average rotation of 0 degrees. The Rescale FCurves button handles this case correctly.
 
=== Local Edit ===
 
This section could be called "poor man's animation layers". A loaded mocap animation usually has imperfections that must be edited, without changing the overall feel of the motion. The Start Edit button creates a new animation layer where differences from the original motion are stored as keys, called delta keys (delta commonly denotes a difference).
 
* Start Edit: Start editing F-curves.
 
* Undo Edit: Quit F-curve editing, without modifying the original F-curves.
 
* Loc: Set a location delta keyframe.
 
* Rot: Set a rotation delta keyframe.
 
* LocRot: Set a location and rotation delta keyframe.
 
* Delete: Remove all delta keyframes at the current time.
 
* |<: Move to first delta keyframe.
 
* <: Move to previous delta keyframe.
 
* >: Move to next delta keyframe.
 
* >|: Move to last delta keyframe.
 
* Confirm Edit: Modify the original F-curves and quit F-curve editing.
 
The delta keys are represented by markers in the timeline.
 
!IMAGE!Pictures/mwe-315-local-keys.png!/IMAGE!
 
A delta key can be added with the Loc, Rot and LocRot buttons, and removed with Delete. There is no way to view the delta keys directly; in the viewport and the curve editor, the final pose is shown, which is the sum of the original pose and the delta key.
 
A common use for delta keys is to correct for intersection with other objects or the character herself. The typical workflow is as follows:
 
* Start Edit.
 
* Set a delta key at a good frame just before the intersection.
 
* Set a delta key at a good frame just after the intersection.
 
* Edit the pose at the frame(s) where the intersection occurs.
 
* If the intersection has been removed, press Confirm Edit. If not, set new delta keys until it has, or press Undo Edit to remove the delta layer.
 
=== Feet ===
 
* Left: Affect the left foot.
 
* Right: Affect the right foot.
 
* Hips: Affect the character's hip (COM) bone.
 
* Offset Toes: Ensure that the toe is below the ball of the foot at all keyframes. Primarily useful for rigs with a reverse foot setup as explained below.
 
* Keep Feet Above Floor: If a mesh object (typically a plane) is selected, shift the keyframes to keep the affected feet above the plane. The plane does not necessarily lie in the XY plane; if it is tilted, the feet are kept on the plane's upper side. If no plane is selected, the feet are kept above the XY plane (z = 0). The IK feet are affected if the rig has and uses IK legs; otherwise the FK feet are kept above the floor. If two markers are selected, only the keyframes in between are shifted.
 
In a rig with a reverse foot setup, such as the MHX rig, the foot can rotate around the toe, ball, and heel. The reverse foot and toe bones are completely fixed by the corresponding FK bones, but the IK effector can be placed arbitrarily, as long as it ends at the toe tip. The transfer tool uses this freedom to make the IK effector perfectly horizontal, provided that the toe is below the ball and heel.
 
!IMAGE!Pictures/refoot.png!/IMAGE!
 
 
 
To use this feature we must ensure that the toe is below the ball of the foot, which is done by the Offset Toes button.
 
=== Loop And Repeat ===
 
Loop Animation
 
Create a loop of the action between two selected time markers, by blending the keyframes in the beginning and end of the loop. This is useful e.g. to create walk and run cycles for games. For good results, the poses at the beginning and end of the selected region should be similar.
 
* Blend Range: The number of keyframes used for blending.
 
* Loop in place: Remove the X and Y components of the root bone's location.
 
* Loop F-curves: Loop the animation.
 
Repeat Animation
 
 
 
Repeat the action between two selected time markers. The action should preferably be looped before it is repeated, to make the beginning and end match seamlessly.
 
* Repeat Number: The number of repetitions.
 
* Repeat F-curves: Repeat the animation.
 
=== Stitching ===
 
Create a new action by stitching two actions together seamlessly.
 
* Update Action List: Update the first and second action drop-down lists.
 
* First Action: The name of the first action.
 
* First End Frame: Last frame of the first action.
 
* Set Current Action: Set the first action as the current action.
 
* Second Action: The name of the second action.
 
* Second Start Frame: First frame of the second action.
 
* Set Current Action: Set the second action as the current action.
 
* Action Target: Choose between creating a new action and prepending the second action.
 
* Blend Range: The number of keyframes used for blending. Same parameter as in Loop Animations section.
 
* Output Action Name: The name of the stitched action.
 
* Stitch Actions: Stitch the actions together.
 
 
 
=== MHBlenderTools: MakeWalk armatures ===
 
 
==== Source Armature panel ====
 
 
MakeWalk transfers an animation from a source armature, defined in a bvh file, to a given target armature. It uses an intermediate standard rig, described in !LINK!http://makehuman.org/doc/node/defining_the_target_rig_manually.html -- Defining the Target Rig Manually!/LINK!. The bone map from the source armature to the target armature hence consists of two parts:
 
* A map from the source rig to the standard rig. It is defined in the MakeWalk: Source Armature panel.
 
* A map from the target rig to the standard rig. It is defined in the !LINK!http://makehuman.org/doc/node/makewalk_target_armature_panel.html -- MakeWalk: Target Armature panel!/LINK!.
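Composed, the two maps give the full correspondence:

    source bone name => standard bone name => target bone name

where the second step applies the target rig's map in reverse.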
 
!IMAGE!Pictures/mws-010-panel.png!/IMAGE!
 
When a new scene is opened, the panel consists of the single button Initialize Source Panel. Once this button has been pressed, the following content is available:
 
!IMAGE!Pictures/mws-020-auto.png!/IMAGE!
 
* Reinit Source Panel: Reinitialization.
 
* Auto Source Rig: If this option is enabled, MakeWalk will try to identify the source rig automatically. It may happen that MakeWalk fails to identify the source rig automatically, but this is very unusual. If it should nevertheless happen, it is possible to define the bone map manually in analogy with !LINK!http://makehuman.org/doc/node/defining_the_target_rig_manually.html -- how it is done for target rigs!/LINK!.
 
* Source rig. A list of bvh rigs recognized by the mocap tool. This either defines the expected source rig (if Auto Source Rig is disabled), or is set to Automatic (if it is enabled).
 
* Bones in the active source rig.
 
 
==== Target Armature panel ====
 
 
The second part of the mapping from source to target armatures is defined by the panel labelled MH Mocap: Target armature. It is the top-most of the mocap tool panels, and is closed by default.
 
When a new scene is opened, the panel consists of the single button Initialize Target Panel. Once this button has been pressed, the following content is available:
 
!IMAGE!Pictures/mwt-011-panel.png!/IMAGE!
 
* Reinit Target Panel. Reinitialization.
 
* Target rig. A list of target rigs recognized by the mocap tool. This either defines the expected target rig (if Auto Target Rig is disabled), or is set to a matching rig (if automatic target rig identification is enabled).
 
* Auto Target Rig. If this option is enabled, MakeWalk will try to identify the target rig automatically. However, automatic rig identification is not trivial for complex rigs, and it may fail. If so, the bone map may be specified manually, cf. !LINK!http://makehuman.org/doc/node/defining_the_target_rig_manually.html -- Defining the Target Rig Manually!/LINK!. The target rigs available by default correspond mostly to the rig presets that can be exported from MakeHuman:
 
* MHX. An advanced rig from MakeHuman alpha 8.
 
* MH Alpha 7. The MHX rig from MakeHuman alpha 7.
 
* Rigify.
 
 
* Ignore Hidden Layers: Ignore bones on hidden layers during automatic rig identification.
 
* Reverse Hip. Select this option if the armature has a reverse hip, which is rather common. In a normal hip setup, the armature root is the hip or pelvis bone, and the thighs and the rest of the spine are children of this bone. In a reverse hip setup, the first bone in the spine has been reversed. There is a separate root bone, and the two lowest bones in the spine are both children of this root, whereas the thighs are children of the reversed hip.
 
* Identify Target Rig: Identify the target rig, i.e. find out how bone names in the active armature correspond to the internal names. This step is performed automatically during retargeting, but the identification can also be done separately for debugging purposes. The bone map appears in the area called FK bones below.
 
* Set T-pose. Pose the active armature in T-pose.
 
* Save T-pose. Option used by the next button.
 
* Save Target File. Save the current bone map as a .trg file. If the Save T-pose option is set, also save a json file defining the T-pose.
 
* FK bones. The bone map.
 
The picture below shows automatic rig identification of the Rigify meta-rig (Add > Armature > Advanced Human).
 
!IMAGE!Pictures/mwt-020-metarig.png!/IMAGE!
 
 
 
 
=== MHBlenderTools: MakeWalk troubleshooting ===
 
 
==== What if retargeting fails? ====
 
 
=== Errors and Corrective Actions ===
 
This section describes common errors and corrective actions.
 
It may happen that MakeWalk fails to retarget an animation to a given armature. In that case an error message is displayed.
 
!IMAGE!Pictures/mwa-100-error.png!/IMAGE!
 
The error message consists of the following parts:

* Mocap error

* Category

* Detailed error message

* A link to this page
 
MakeWalk errors are grouped into the following categories:
 
* Load Bvh File
 
* Rename And Rescale
 
* Identify Target Rig
 
* Automatic Target Rig
 
* Manual Target Rig
 
* Identify Source Rig
 
* Retarget
 
* General Error
 
=== Load Bvh File ===

=== Rename And Rescale ===

=== Identify Target Rig ===
 
=== Automatic Target Rig ===
 
The most common problem is probably that MakeWalk fails to identify the target rig automatically. There are several possible reasons for this:
 
* The character is not oriented correctly. In the rest pose, the character should be standing up along the positive Z axis, facing the negative Y direction.
 
* The armature is complex with extra bones not corresponding to a standard biped rig.
 
* The armature only has IK arms or legs. MakeWalk retargets animations to the FK limbs, so if no such bones exist, the program will not work.
 
!IMAGE!Pictures/mwa-110-reverse-hip.png!/IMAGE!
 
It is rather common that an armature has a reverse hip. In a normal hip setup, the armature root is the hip or pelvis bone, and the thighs and the rest of the spine are children of this bone. In a reverse hip setup, the first bone in the spine has been reversed. There is a separate root bone, and the two lowest bones in the spine are both children of this root, whereas the thighs are children of the reversed hip.
 
!IMAGE!Pictures/mwa-120-reversehip.png!/IMAGE!
 
The advantage of such a setup is that the upper and lower body can be posed independently. However, MakeWalk fails to identify the bones unless the Reverse Hip option has been enabled.
 
If automatic bone identification still fails, bone mapping has to be made manually. How this is done is described in the next section.
 
=== Defining the Target Rig Manually ===
 
Internally, MakeWalk retargets animations to an armature with a fixed internal bone hierarchy. The picture below illustrates this hierarchy:
 
!IMAGE!Pictures/mwa-010-armature.png!/IMAGE!
 
In order to retarget to an armature with different bone names, we must define a map between the given bones and the internal names. By default, MakeWalk attempts to do this automatically. However, automatic bone mapping may easily be confused by non-trivial rigs. If this happens, one can define the bone map manually.
 
A bone map for a target armature is defined by a .trg file located in the target_rigs folder under the makewalk directory. The folder already contains three files, for retargeting the advanced MHX rig, the MakeHuman alpha 7 rig, and Rigify. These rigs are too complicated for automatic bone map identification, so MakeWalk recognizes them and uses the predefined bone maps.
 
Create a .trg file using an existing file as a template. E.g., a .trg file could look like this:
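(The example file itself is shown in the picture below. As a purely hypothetical sketch, assuming the whitespace-delimited name pairs described in the note that follows, the body pairs each bone name in your armature with the corresponding internal name; the column order here is illustrative, so consult one of the bundled .trg files for the authoritative syntax:

    my_hips     hips
    my_spine    spine
    my_head     head
    ...
)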
 
Note that it is not necessary to define maps for all bones. Bone names must not contain spaces, since whitespace is used as a delimiter in the .trg file. If the bones in your armature do contain spaces, replace them with underscores ( _ ). MakeWalk treats space and underscore as equivalent, so this is not a problem, except for very strange naming conventions.
 
!IMAGE!Pictures/mwa-020-myrig.png!/IMAGE!
 
Save the .trg file with the name my_rig.trg in the target_rigs folder and press the Reinit Target Panel button. My_Rig should now appear in the Target rig list. Select it. In the FK bones section, the My_Rig bone names are now listed. Make sure that the Auto Target Rig option is deselected, to override automatic bone mapping. Finally, go to the main panel and press Load And Retarget. The animation should now be loaded.
 
 
 
=== MHBlenderTools: MakeWalk utilities ===
 
 
==== Utilities panel ====
 
 
This panel contains material that does not naturally fit into the other panels.
 
!IMAGE!Pictures/mwu-010-panel.png!/IMAGE!
 
=== Default Settings ===
 
* Save Defaults: Save current settings as the default settings.
 
* Load Defaults: Load the default settings from file.
 
=== Manage Actions ===
 
* Actions: A list of all actions in the scene, at the time when the Update Action List button was last pressed.
 
* Filter: If selected, only actions belonging to the active character are included in the action list. When the mocap tool creates an action, the first four letters in the action name are taken from the rig name.
 
* Update Action List: Update the list of actions shown above.
 
* Set Current Action: Set the action selected in the Actions list as the active action.
 
* Delete Action: Permanently delete the action selected in the Actions list. The action must have zero users. It is quite cumbersome to permanently delete actions in Blender. The reason is that creating an action with hand animation takes much work, which should not be lost accidentally. The situation is different with mocap, where it is easy to fill up a blend file with many irrelevant actions. This button makes it easier to clean out such junk motions.
 
* Delete Temporary Actions: Some tools create temporary actions, whose names start with a hash sign (#). This button deletes all such actions.
 
=== Temporary properties ===
 
* Delete Temporary Properties. MakeWalk creates some properties for relevant posebones during retargeting. Pressing this button removes these properties. However, be aware that some of the tools in the Edit panel may fail if the temporary properties are deleted.
 
The temporary properties for the active posebone can be inspected in the N-panel.
 
!IMAGE!Pictures/mwu-030-temp-props.png!/IMAGE!
 
* McpBone: The name of this bone in the internal hierarchy.
 
* McpParent: The parent of this bone in the internal hierarchy.
 
* McpQuatW, McpQuatX, McpQuatY, McpQuatZ: The rotation of this bone in T-pose, represented as a quaternion.
 
=== T-pose ===
 
* Set T-pose: Set the current pose to T-pose.
 
* Clear T-pose: Set the current pose to the default pose.
 
* Load T-pose: Load a T-pose from a .json file to the active armature.
 
* Save T-pose: Save the current pose as a .json file.
 
=== Rest Pose ===
 
* Current Pose => Rest Pose: Make the current pose the new rest pose.
 
 
 
=== MHBlenderTools: MHX importer basic usage ===
 
MHX (MakeHuman eXchange format) is a custom format used to transfer a rigged character from MakeHuman to Blender. Due to its tight integration with Blender's Python API, it supports advanced features like constraints, drivers and properties that general-purpose formats like Collada or FBX do not support.
 
 
==== Export From MakeHuman ====
 
 
!IMAGE!Pictures/mx010-export.png!/IMAGE!
 
After the character has been designed in MakeHuman, go to the Files > Export tab and select Blender exchange (mhx) as the mesh format. Type the character's name in the box at the top and press Export. An mhx file is now exported.
 
 
==== Import Into Blender ====
 
 
Once the mhx importer has been enabled, we can import the mhx file. Select File > Import > MakeHuman (.mhx).
 
!IMAGE!Pictures/mx070-import.png!/IMAGE!
 
In the file selector, navigate to the file that we just exported from MakeHuman, i.e. John Doe.mhx in the export directory. After a short while, the character appears in the viewport, ready for posing.
 
!IMAGE!Pictures/mx080-import.png!/IMAGE!
 
 
 
=== MHBlenderTools: MHX default rigging ===
 
If Pose/Animate > Skeleton is set to 'None' when the character is exported from MakeHuman in mhx format, the character is equipped with the default MHX armature. This is a rather complex rig with quite a few features.
 
The bones are grouped into bone layers. Bone layers can be turned on and off in the MHX Main panel, which appears in the user interface pane (N-pane) to the right of the viewport. N-pane visibility can be toggled with the N key.
 
The Root Layer
 
!IMAGE!Pictures/mx210-root.png!/IMAGE!
 
There are two bones on the Root layer:
 
* master: the big bone located at foot level. It is the ultimate parent of every other bone in the rig, and can be used to move the entire character, including IK effectors.

* root: the wiggly bone located at the character's center of mass. It also moves the entire character, but IK effectors remain at a fixed position.
 
The Spine Layer
 
!IMAGE!Pictures/mx220-spine.png!/IMAGE!
 
The spine is an immediate child of the root bone.
 
* spine

* spine-1

* chest

* chest-1

* neck

* head
 
On this layer the two clavicle bones are also found. They are children of chest-1 and also appear on the arm layers.
 
The Head Layer
 
!IMAGE!Pictures/mx230-head.png!/IMAGE!
 
This layer contains the bones of the head (not, however, the neck and head bones, which are on the Spine layer).
 
* jaw: Jaw bone
 
* tongue_base: Inner part of tongue.
 
* tongue_mid: Middle part of tongue.
 
* tongue_tip: Outer part of tongue.
 
* eye.L: Left eye.
 
* uplid.L: Left upper eyelid.
 
* lolid.L: Left lower eyelid.
 
* eye.R: Right eye.
 
* uplid.R: Right upper eyelid.
 
* lolid.R: Right lower eyelid.
 
* gaze: Target that the left and right eyes are tracking.
 
The FK Arm Layers
 
!IMAGE!Pictures/mx240-fkarm.png!/IMAGE!
 
The left forward kinematics (FK) arm layer consists of:
 
* clavicle.L: Left clavicle.
 
* deltoid.L: Left deltoid muscle.
 
* upper_arm.fk.L: Left FK upper arm (humerus).
 
* forearm.fk.L: Left FK forearm. Y rotation is locked on this bone. To twist the forearm, rotate the hand bone instead.
 
* hand.fk.L: Left FK hand.
 
The bones on the right FK arm layer are analogous.
 
Note that there are two shoulder bones: clavicle and deltoid. It is not trivial to pose the shoulder, and with two shoulder bones better deformation is possible. The drawback is that an additional bone must be posed. However, in the common case that the arm is well below shoulder level, the deltoid bone can often be ignored.
 
The FK Leg Layers
 
!IMAGE!Pictures/mx250-fkleg.png!/IMAGE!
 
The left FK leg layer consists of:
 
* thigh.fk.L: Left FK thigh.
 
* shin.fk.L: Left FK shin.
 
* foot.fk.L: Left FK foot.
 
* toe.fk.L: Left FK toes.
 
The right FK leg layer is analogous.
 
The Finger Layers
 
!IMAGE!Pictures/mx260-fingers.png!/IMAGE!
 
The long fingers on the Finger layers give a quick way to pose the fingers; the poses can be fine-tuned with the individual finger links on the Links layer.  The bones on the left Finger layer are:
 
* thumb.L: Left long thumb.
 
* index.L: Left long index finger.
 
* middle.L: Left long middle finger.
 
* ring.L: Left long ring finger.
 
* pinky.L: Left long pinky.
 
The bones on the right Finger layer are analogous.
 
Rotating a long finger around the local X will cause all finger links to curl (rotate around local X). Rotation around local Z only affects the first finger link.
 
The Finger Link Layers
 
!IMAGE!Pictures/mx270-links.png!/IMAGE!
 
Each of the Finger Links layers contains fourteen bones; two for the thumb and three for each other finger. They are used to fine-tune the finger pose, once a rough pose is achieved with the long finger bones.
 
The Palm Layers
 
!IMAGE!Pictures/mx280-palm.png!/IMAGE!
 
The left Palm layer contains the metacarpal bones:
 
* thumb.01.L: Parent of the left thumb bones.
 
* palm_index.L: Parent of the left index finger bones.
 
* palm_middle.L: Parent of the left middle finger bones.
 
* palm_ring.L: Parent of the left ring finger bones.
 
* palm_pinky.L: Parent of the left pinky bones.
 
The bones on the right Palm layer are analogous.
 
The IK Arm Layers
 
!IMAGE!Pictures/mx300-ikarm.png!/IMAGE!
 
The arms are controlled by FK by default, but inverse kinematics (IK) can also be used. To quickly switch between FK and IK, press the buttons labelled FK or IK in the MHX FK/IK Switch panel, which appears in the N-panel if an MHX armature is the active object.
 
* clavicle.L: Left clavicle. Same bone as on FK layer.
 
* deltoid.L: Left deltoid. Same bone as on FK layer.
 
* hand.ik.L: Left hand IK effector. Positions and rotates the hand. Also controls forearm twist.
 
* elbow.pt.ik.L: Left elbow pole target. Controls the location of the elbow, which lies in the plane spanned by the arm socket (head of upper arm), wrist (tail of forearm), and elbow pole target.
 
* elbow.link.L: A rubber band that connects the elbow to the elbow pole target. This bone is merely a visual cue; it has no effect and cannot be selected.
 
The bones on the right IK arm layer are analogous.
 
The IK Leg Layers
 
!IMAGE!Pictures/mx310-ikleg.png!/IMAGE!
 
The legs are controlled by FK by default, but IK can also be used. To quickly switch between FK and IK, press the buttons labelled FK or IK in the MHX FK/IK Switch panel, which appears in the N-panel if an MHX armature is the active object.
 
* foot.ik.L: Left foot IK effector. Positions and rotates the foot.
 
* toe.rev.L: Left reverse toe.
 
* foot.rev.L: Left reverse foot.
 
* knee.pt.ik.L: Left knee pole target. Controls the location of the knee, which lies in the plane spanned by the leg socket (head of thigh), ankle (tail of shin), and knee pole target.
 
* knee.link.L: A rubber band that connects the knee to the knee pole target. This bone is merely a visual cue; it has no effect and cannot be selected.
 
* shin.ik.L: The foot bones and the pole target control the thigh and shin rotation, except for the twist (local Y rotation) of the shin, which is handled by this bone instead. The IK shin bone does not affect the bending of the shin, only the twisting.
 
The bones on the right IK leg layer are analogous.
 
!IMAGE!Pictures/mx320-revfoot2.png!/IMAGE!
 
The IK leg has a reverse foot setup, which allows the foot to rotate around the toe, ball, and heel.
 
* foot.ik.L: Rotate around heel.
 
* toe.rev.L: Rotate around toe tip.
 
* foot.rev.L: Rotate around ball.
 
FK-IK Snapping
 
In a rig with both FK and IK, it is useful to be able to switch between the two seamlessly. To this end, the MHX FK/IK Switch panel contains buttons to snap the IK to FK and vice versa, in such a way that the character's pose remains unchanged.
 
!IMAGE!Pictures/mx330-fkik2.png!/IMAGE!
 
To snap the IK bones to the FK pose, press the buttons
 
* Snap L IK Arm
 
* Snap R IK Arm
 
* Snap L IK Leg
 
* Snap R IK Leg
 
respectively. Note that the FK buttons at the top of the panel change to IK.
 
!IMAGE!Pictures/mx340-ikfk2.png!/IMAGE!
 
To snap the FK bones to the IK pose, press the buttons
 
* Snap L FK Arm
 
* Snap R FK Arm
 
* Snap L FK Leg
 
* Snap R FK Leg
 
respectively. Note that the IK buttons at the top of the panel change to FK.
 
The Tweak layer
 
!IMAGE!Pictures/mx360-tweak2.png!/IMAGE!
 
This layer contains some rarely used bones that are sometimes useful for tweaking a pose.
 
* breast.L: Controls rotation of left breast. Useful mainly for female characters.
 
* arm.socket.L: The left arm socket. Allows the left upper arm (FK and IK) to be moved away from the deltoid bone.
 
* leg.socket.L: The left leg socket. Allows the left thigh (FK and IK) to be moved away from the hip.
 
The right-side bones (with suffix .R) are defined analogously.
 
!IMAGE!Pictures/mx370-markers2.png!/IMAGE!
 
The Tweak layer also contains marker bones. They are used by MakeWalk to determine the locations of the toe, ball and heel, so that the feet can be kept above floor level. Move these bones in Edit mode to fine-tune their locations. The marker bones on the left side are:
 
* toe.marker.L: Left toe marker.
 
* ball.marker.L: Left ball marker.
 
* heel.marker.L: Left heel marker.
 
The marker bones on the right side are analogous.
 
The MHX Control Panel
 
The default behaviour of the MHX rig can be modified by changing some properties in this panel.
 
!IMAGE!Pictures/mx410-gaze2.png!/IMAGE!
 
* GazeFollowsHead: If 1.0, the gaze bone is parented to the head. If 0.0, the gaze bone remains fixed in space (it is parented to the master bone).
 
* RotationLimits: If 0.0, all Limit Rotation constraints are ignored. If 1.0, they are fully honored. Default = 0.8.
 
!IMAGE!Pictures/mx420-hinge2.png!/IMAGE!
 
* ArmHinge, Left and Right: Determines whether the arm rotates when the shoulder does.
 
* FingerControl, Left and Right: Determines whether the finger links are controlled by the long fingers (the default), or whether the long fingers are ignored.
 
!IMAGE!Pictures/mx430-leghinge2.png!/IMAGE!
 
* LegHinge, Left and Right: Determines whether the leg rotates when the hip does.
 
!IMAGE!Pictures/mx440-legik2ankle2.png!/IMAGE!
 
* LegIkToAnkle, Left and Right: If turned on, the reverse foot setup is disabled, and the ankle bones on the Extra layers become the effectors for the IK legs.
 
 
 
=== MHBlenderTools: MHX other rigging systems ===
 
By default (i.e., if Pose/Animate > Skeleton is set to 'None'), a character exported as an mhx file is rigged with the MHX rig, which is a rather complex rig with many options. Alternatively, the character can be rigged with a simpler and lighter armature by choosing an alternative rig preset under Pose/Animate > Skeleton.
 
!IMAGE!Pictures/mx710-other-rig2.png!/IMAGE!
 
To this end, select one of the rigs in the Pose/Animate > Skeleton tab; a skeleton is drawn in the viewport. Then export the character from MakeHuman as an mhx file.
 
To reselect the default mhx rig, press None in the Pose/Animate > Skeleton tab.
 
!IMAGE!Pictures/mx720-other-blender2.png!/IMAGE!
 
Import the mhx file into Blender. Note that there are fewer bone layers available in the MHX Main panel.
 
 
==== Using the Rigify rig ====
 
 
As an alternative to the mhx rig, MakeHuman characters can also be rigged with the popular Rigify add-on by Nathan Vegdahl. Links to documentation about Rigify in general:
 
!LINK!http://wiki.blender.org/index.php/Extensions:2.6/Py/Scripts/Rigging/Rigify -- Official Rigify documentation!/LINK!
 
!LINK!http://blenderartists.org/forum/showthread.php?200371-Rigify-Auto-rigging-system-new-and-improved -- Blenderartist thread!/LINK!
 
!IMAGE!Pictures/mx810-rigify-export2.png!/IMAGE!
 
To use Rigify, the character must first be prepared in MakeHuman. Select the Export for Rigify checkbox in the Options group and export the file. This checkbox overrides any other rig selected under the Skeleton tab.
 
!IMAGE!Pictures/mx820-rigify-enable2.png!/IMAGE!
 
In Blender, enable the Rigify add-on. It is found in the Rigging category. Press Save User Settings to remember the choice.
 
!IMAGE!Pictures/mx830-rigified2.png!/IMAGE!
 
Import the mhx file as usual. John Doe is now rigged with Rigify and ready for use.
 
!IMAGE!Pictures/mx840-norigify2.png!/IMAGE!
 
If the Rigify add-on is not enabled, an intermediate rig is imported and an error message appears. If this happens, open a new Blender file, enable Rigify, and try again. Alternatively, you may select the armature and press the Rigify MHX Rig button that appears in the N-panel (after you have enabled the Rigify add-on, of course).
 
Disclaimer: MakeHuman has no control over the Rigify add-on. Rigify support was tested with Blender 2.69. If the specification of Rigify changes in the future, mhx files exported for Rigify may cease to work.
 
 
 
 
=== MHBlenderTools: MHX Layers and masks ===
 
The mhx importer creates objects on the first four layers.
 
!IMAGE!Pictures/mx900-layers2.png!/IMAGE!
 
* This layer contains the body, including hair, eyes, etc.
 
* This layer contains the armature.
 
* This layer contains any clothes. Empty if the character is nude.
 
* This layer contains any proxy meshes. Empty if no proxy mesh was exported.
 
Note that a proxy mesh does not replace the original human mesh, but is added as a separate object. If you only want to use the proxy mesh, the body can be deleted. However, the proxy mesh (perhaps slightly edited) has several uses apart from simply replacing the body, e.g. as a deflection surface for cloth simulation, or to speed up viewport performance when animating. It is therefore up to the user to decide which meshes to delete.
 
!IMAGE!Pictures/mx920-unmasked2.png!/IMAGE!
 
Vertices beneath clothes are hidden, to avoid ugly penetration issues. The picture above illustrates what may happen if skin under clothes is not hidden.
 
!IMAGE!Pictures/mx910-mask2.png!/IMAGE!
 
However, skin beneath clothes is not removed, only hidden by masks. This becomes evident in edit mode, where the entire body is visible. To undress the character, deselect the clothes layer (layer 3), and disable all mask modifiers. Don't forget to disable the mask modifiers for rendering as well.
 
With the mask modifiers, it is possible to export a character from MakeHuman with several different outfits, and select the outfit and hidden body vertices in Blender.
 
If you wish to delete the hidden vertices permanently, e.g. to improve performance or for export to some other application, apply the relevant mask modifiers.
 
 
 
=== MHBlenderTools: MakeClothes rigid fitting ===
 
The standard clothes fitting algorithm is suitable for flexible clothes, especially clothes that follow the body closely. However, it does not work well for rigid objects like shoes. Therefore, MakeClothes has a variant suitable for rigid objects. Rigid fitting is applied on a per-vertex-group basis, so rigid and flexible fitting can be mixed in the same piece of clothing.
 
Rigid fitting is used for every vertex group whose name starts with a '*'. Each such vertex group in the human must contain exactly three vertices, which determine the triangle used for fitting in this group.
 
!IMAGE!Pictures/rf-100-orig.png!/IMAGE!
 
The use of rigid vertex groups is best explained by an example. Consider a pair of shoes, which are rigid objects. The original shoe, fitted to an adult human with the standard MakeClothes settings, is shown to the left. Then the mhclo file was loaded onto a baby, with the result shown to the right. The shoe is recognizable, but it has been deformed in an undesirable way. Even though the quality of this fitting is poor, make sure to save the mhclo file, because we will need it at the end of this tutorial.
 
!IMAGE!Pictures/rf-110-projection.png!/IMAGE!
 
Each clothing vertex v is associated with a triangle t in the human mesh, i.e. three human vertices v1, v2, v3. Let r denote the location of v, and r1, r2, r3 the locations of the corners of the triangle. Further, let d be the perpendicular offset of the clothes vertex from the plane spanned by the triangle. MakeClothes assigns three weights w1, w2, w3 such that

    r = w1 r1 + w2 r2 + w3 r3 + d,

where the weights sum to one, w1 + w2 + w3 = 1. When the clothing is loaded onto a different character, the new location becomes

    r' = w1 r'1 + w2 r'2 + w3 r'3 + d',

where r'1, r'2, r'3 are the locations of the three human vertices in the new character, and

    d' = S d = (sx dx, sy dy, sz dz)

is obtained from the original offset d by inhomogeneous global scaling with the diagonal scale matrix

    S = diag(sx, sy, sz).
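As a minimal numerical sketch of these formulas (hypothetical function and variable names, not the actual MakeClothes code):

    import numpy as np

    def fit_vertex(weights, corners, offset, scale):
        # weights: (w1, w2, w3), summing to one
        # corners: 3x3 array with rows r'1, r'2, r'3 on the new character
        # offset:  the original perpendicular offset d (a 3-vector)
        # scale:   (sx, sy, sz), the diagonal of the scale matrix S
        w = np.asarray(weights)
        r = np.asarray(corners)
        d = np.asarray(offset) * np.asarray(scale)  # d' = S d
        return w @ r + d                            # r' = w1 r'1 + w2 r'2 + w3 r'3 + d'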
 
The important observation is that the quality of the fitting depends crucially on the chosen triangle, i.e. the triple of human vertices that determines the clothes vertex location. The standard choice is the human triangle closest to the clothing vertex. In rigid fitting, the triangle is the same for all vertices, and is defined by the three vertices in the vertex group.
 
!IMAGE!Pictures/rf-120-vgroups.png!/IMAGE!
 
Here we see the vertex groups used to fit the left shoe. The standard vertex group Left consists of all vertices close to the left shoe, in the tights helper. On the other hand, the rigid vertex group *Left only contains three vertices that define the single triangle used for all clothes vertices in this group.
 
!IMAGE!Pictures/rf-130-triangles.png!/IMAGE!
 
The next illustration shows the corresponding triangles used for fitting. The Left group uses many triangles obtained by triangulating the corresponding faces. Different shoe vertices are associated with triangles that are transformed differently, and the rigid character of the shoe is lost. The *Left group only has one single triangle, with corners at the three vertices in the vertex group. The same triangle is used by all left shoe vertices.
 
!IMAGE!Pictures/rf-140-rename-vgroups.png!/IMAGE!
 
Once the human vertex groups have been defined, we proceed with clothes-making. Rename the shoe vertex groups to *Left and *Right, to make the shoes use rigid fitting. Recall that the human can have several overlapping vertex groups, but in a piece of clothing each vertex must belong to exactly one group.
 
!IMAGE!Pictures/rf-150-offset-scaling.png!/IMAGE!
 
Next we open the Offset Scaling section and select Foot as the body part. In flexible fitting, the offsets are usually very small, and selecting the correct body part is not so important. In contrast, this step is very important in rigid fitting, because many vertices are far away from their triangles and the offsets must be scaled correctly. Press Examine Boundary for a visual inspection of which vertices define the global scale matrix.
 
!IMAGE!Pictures/rf-160-baby-shoes.png!/IMAGE!
 
This picture shows the shoe loaded on a baby, using flexible fitting (blue) and rigid fitting (red). Rigid fitting clearly maintains the shape of the shoe much better.
 
However, one problem remains: bone weighting. Start up MakeHuman and load the shoes under the Geometries > Clothes tab. Export the character with a rig in one of the formats that supports rigs, e.g. mhx, and import the character into Blender. Now try to pose the feet. As we see in the picture below, the shoe does not follow the foot but is squashed and stretched in strange ways.
 
!IMAGE!Pictures/rf-170-bad-deformation.png!/IMAGE!
 
The reason is that rigid fitting affects the bone weights as well as the vertex locations. The bone weights are interpolated between the three corners of the triangle, rather than using the information from the entire foot. To correct this we need to use two different associations between clothes and human vertices: rigid fitting for vertex locations, but flexible fitting for bone weights.
 
Fortunately, MakeHuman can handle this situation, although there is currently no elegant interface for it in MakeClothes. In a text editor, open the mhclo file obtained by standard fitting at the beginning of this tutorial (this is why you needed to save it), and copy everything after the line
 
verts 0
 
to the end of the new mhclo file obtained by rigid fitting. Before the copied section, insert the line
 
weighting_verts
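Schematically, the edited file then has this shape (a hypothetical fragment; the entries themselves are whatever MakeClothes wrote in each file):

    verts 0
        ... rigid-fitting entries from the new mhclo file ...
    weighting_verts
        ... flexible-fitting entries copied from the saved mhclo file ...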
 
Export the shoes again from MakeHuman and import into Blender. The shoes should now both maintain their rigid shape and deform correctly.
 
!IMAGE!Pictures/rf-190-good-deformation.png!/IMAGE!
 
Due to an unfortunate behavior in MakeHuman, it is possible that the updated mhclo file is not loaded after it has replaced the old one. This is because MakeHuman automatically compiles mhclo and obj files when they are loaded, so they can be loaded faster next time. Although this behavior is convenient for ordinary users, it is very confusing and frustrating for clothes makers. If MakeHuman insists on loading old versions of your clothes, you may need to delete the compiled files (*.mhpxy and *.npz), restart MakeHuman, or both. After the compiled files are gone, the updated clothes should load without problems.
 
!IMAGE!Pictures/rf-180-compiled-files.png!/IMAGE!
 
 
 
 
 
 
 
 
 
== Developers' notes ==
 
 
Some notes about how to obtain the source code, modify it and share it.
 
 
=== MakeHuman Plugin System ===
 
 
==== Plugins ====
 
 
MakeHuman has a simple plugin framework which makes it easy to add and remove features. At startup, MakeHuman looks for .py files in the plugins folder that do not start with an underscore (which makes it easy to disable unwanted plugins). It loads them one by one and calls the load entry point, passing a reference to the application. The plugin can use this reference to add the necessary GUI widgets or code to the application.
 
 
 
The rules for plugins are very simple:
 
* A plugin is a .py file in the plugins folder with a load entry point.
 
* A plugin only imports core files.
 
The reason a plugin cannot import other plugins is that it would make it difficult to know which files belong to which plugin. We still need to define a convention for shared files beyond the core MakeHuman files. To get started, look at example.py or any of the other plugins to see how you can create your own feature in MakeHuman; a minimal skeleton is sketched below.
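A minimal plugin skeleton, modeled loosely on the bundled example plugin (the file name, class name and category are illustrative; check 7_example.py for the exact API of your MakeHuman version):

    # plugins/8_myplugin.py -- hypothetical file name
    import gui3d

    class MyTaskView(gui3d.TaskView):
        def __init__(self, category):
            gui3d.TaskView.__init__(self, category, 'My Task')
            # add widgets here

    def load(app):
        # the load entry point: called once at startup with the application
        category = app.getCategory('Utilities')
        category.addTask(MyTaskView(category))

    def unload(app):
        pass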
 
 
==== GUI ====
 
 
The main layout is a two-level tab control. The tabs at the top represent categories, like files, modelling, geometries, materials, etc. The ones at the bottom are the tasks in the current category, and refine the broader category into face, torso, gender, saving, loading, exporting, etc.
 
So when creating your plugin, the first thought should be “In which category does it belong?”. From experience we know that it can be a tough question to answer. Sometimes the only answer is adding a new category; this is what we initially did for measurement, for example.
 
Next, you probably want your own task to implement your feature. While it’s possible to attach functionality to an instance of gui3d.Task, it’s often easier to derive your own class.
 
 
 
When you create an instance of your class, you pass the parent of your task, which can either be an existing category or a new one which you added.
 
In your derived task you will then add the necessary controls to let the user interact. A good place to see how to use the different controls is the example plugin. You will see that even if you don’t add any controls, the model is already visible; this is because the model is attached to the root of the GUI tree. In the onShow event of your task you might want to reset the camera position, like we do in the save task, or hide the model, like we do in the load task. Just don’t forget to reset the state when your task gets hidden in onHide, as sketched below.
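Continuing the skeleton above, a sketch of such event handlers (the bodies are placeholders; the actual camera and model helpers vary between versions):

    class MyTaskView(gui3d.TaskView):
        def onShow(self, event):
            gui3d.TaskView.onShow(self, event)
            # e.g. move the camera to a convenient position here

        def onHide(self, event):
            # restore whatever onShow changed
            gui3d.TaskView.onHide(self, event)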
 
 
==== Undo-redo ====
 
 
It is important that every modification is undoable, since just one non-undoable modification would leave the user without the possibility to undo anything. So it’s crucial that if you write a plugin which modifies the model, you also make undo work.
 
 
 
The Application class has several methods to work with actions. An action is a class with two methods, do and undo. If the action itself does the modification, you can use app.do to add it to the undo stack. If you did the modification yourself already during user interaction, you can add the action using app.did; the application won’t call the do method of the action in that case.
 
 
 
If you want to make your own undo/redo buttons, you can use app.undo and app.redo. To illustrate, consider an action that changes the hair color.
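A minimal sketch of such an action class follows; the HairColorAction name and the hairColor attribute are illustrative assumptions, not the actual MakeHuman source:

    class HairColorAction(object):
        def __init__(self, human, color, postAction):
            self.human = human
            self.before = human.hairColor   # assumed attribute, for illustration
            self.after = color
            self.postAction = postAction    # called after do/undo to refresh the GUI

        def do(self):
            self.human.hairColor = self.after
            self.postAction()
            return True

        def undo(self):
            self.human.hairColor = self.before
            self.postAction()
            return True

Such an action would be pushed with app.do(HairColorAction(human, color, updateWidgets)), or registered with app.did if the change has already been applied.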
 
The postAction is a handy way to specify a method to keep your GUI in sync with the changes.
 
 
==== Asynchronous Calls and Animation ====
 
 
When doing lengthy operations it is important not to block the GUI from redrawing. Since everything runs in one thread, it is easy to block the event loop in your plugin. There are four ways to avoid this, depending on the need. If no user interaction is needed, a progress bar can be used. A progress bar uses the redrawNow() method of the application, which redraws the screen outside the event loop. Instead of creating your own progress bar, it is advised to use the progress method, which uses the global progress bar. Calling progress with a value greater than zero shows the progress bar; a value of zero hides it.
 
If user interaction is desired during the operation, either asynchronous calls, a timer or a thread can be used. Asynchronous calls are used when a lengthy operation can be split into several units. They are used, for example, in the startup procedure as well as in the plugin loading loop. mh.callAsync(method) queues the calling of method in the event loop, so it will be called when the event gets processed. In case different methods need to be called after each other, as in the startup procedure, callAsync is used to call the next method. In the case of the plugin loading loop, the same method re-queues itself until it is done, as sketched below.
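A hedged sketch of that pattern (the PluginLoader class and its plugin-loading body are illustrative):

    import mh

    class PluginLoader(object):
        def __init__(self, paths):
            self.paths = list(paths)

        def loadNextPlugin(self):
            if not self.paths:
                return                         # done; stop re-queueing
            path = self.paths.pop(0)
            # ... load the plugin at 'path' here ...
            mh.callAsync(self.loadNextPlugin)  # queue the next iteration in the event loop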
 
This is not to be used for animations, as very little time passes between calling callAsync and the event loop calling the method. Calling time.sleep(dt) to avoid this should not be done, as it blocks the main thread. For animations, use timers instead; an example can be found in the BvhPlayer plugin and in the sketch below. The method mh.addTimer(interval, method) adds a timer which calls the given method every interval milliseconds. It returns a value to be used by mh.removeTimer to stop the timer.
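A minimal timer sketch using the functions named above (the Blinker class and the 100 ms interval are illustrative):

    import mh

    class Blinker(object):
        def start(self):
            # call self.tick roughly every 100 milliseconds
            self.timerId = mh.addTimer(100, self.tick)

        def tick(self):
            pass  # advance the animation by one frame here

        def stop(self):
            mh.removeTimer(self.timerId)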
 
If a lengthy operation includes blocking on sockets or pipes, it is advised to use a Python thread. However, this has been shown to be problematic on Linux. To get around the problems on Linux, you should not access any MakeHuman structures from within your thread, but use mh.callAsync to call the methods from the main thread. See the clock plugin for example code on how to use threads correctly.
 
 
==== A template for plugins ====
 
 
Looking in the MakeHuman source folder, in the "plugins" directory, you will notice a file called "7_example.py". This is a "Hello world" plugin. It includes all the main controls and some nice examples of mesh manipulation. It's designed to be used as a template for new plugins.
 
To see how it works, you need to enable it in Preferences -> Settings. Then it will be located at Utilities -> Example.
 
!IMAGE!Pictures/UtilitiesExampleTab3.png!/IMAGE!
 
 
==== Example Controls ====
 
 
!IMAGE!Pictures/UtilitiesExampleControls.png!/IMAGE!
 
 
 
=== OpenGL Notes ===
 
Most of the 3D mesh handling functionality is delivered using OpenGL embedded within the C application code. OpenGL is a 3D graphics library that enables a 3D world to be defined, with a camera, objects, lights, textures etc. It then enables that 3D world to be visualised as a 2D representation that can be displayed on a computer screen and provides functions to enable an application (in response to user input) to navigate around the scene.
 
 
==== Basics ====
 
 
OpenGL takes a 3D scene and draws it into a 2D viewing area on your screen known as the viewport. OpenGL can project the scene onto the viewport in a variety of different ways, but the most common are:
 
* Perspective projection, as you would get if you could place a camera in the scene
 
* Orthographic projection, as a draftsman may construct technical drawings such as plans and elevations.
 
MakeHuman only uses perspective projection.
 
 
==== Projection Transformation ====
 
 
You tell OpenGL how to project 3D objects onto the 2D viewport by defining a projection transformation, which indicates whether you wish to use perspective or orthographic projection (or an alternative projection pattern) and specifies other projection settings. This can be imagined as being comparable to specifying the characteristics of a camera (field of view, aspect ratio etc.), where an orthographic projection is equivalent to a camera at an infinite distance. The main difference is that with OpenGL you can change the settings between drawing different objects, which is a bit like taking a photo, changing the lens and moving the camera, then taking another photo without winding the film on.
 
MakeHuman sets a perspective projection using the function gluPerspective(fovAngle, aspectRatio, near, far) where:

* fovAngle: is the vertical field of view angle (45 degrees)

* aspectRatio: is the viewport width divided by the viewport height which, by default, is 800/600 (4/3)

* near: is the distance to the near clipping plane (0.1)

* far: is the distance to the far clipping plane (100)
 
 
==== Modelview Transformation ====
 
 
For OpenGL to know what to display in the viewport, it also needs to know where the camera is relative to the 3D model. For this you define a modelview transformation, which specifies the position and orientation of the camera and of the objects that make up the model inside a virtual 3D world. You do this by moving through and around the 3D world and by creating objects relative to your current position. OpenGL keeps track of your current position and orientation in the 3D world by recording the modelview transformation.
 
 
 
Both the projection transformation and the modelview transformation are stored internally in 4x4 transformation matrices. You can modify these matrices by calling functions that apply direct transformations (translations and rotations) or by calling functions that calculate transformations (e.g. the gluPerspective function outlined above).
 
 
 
To apply transformations you need to first make one of these matrices the current matrix. Subsequent transformations are applied cumulatively to the current matrix. The glMatrixMode function is used to set the current matrix mode:
 
* glMatrixMode(GL_PROJECTION) makes the projection transformation matrix the current matrix so that projection settings can be applied, such as the field of view angle and aspect ratio.
 
* glMatrixMode(GL_MODELVIEW) makes the modelview transformation matrix the current matrix so that subsequent translations and rotations move the current location through the 3D world.
 
 
==== Resetting a transformation matrix ====
 
 
Matrix transformations that you apply are applied cumulatively. It is therefore often necessary to reset the matrix to its default ‘no transformation’ state so that you can apply a fresh set of transformations. OpenGL uses the transformation and modelview matrices to transform coordinates by performing a matrix multiplication. If you multiply by an identity matrix then the result is the same as the thing you started with, so to reset one of these matrices to an initial state where the transformation has no effect you can simply load the identity matrix using the function glLoadIdentity(). This loads the identity matrix into whichever transformation matrix is currently active, resetting it to the default ‘no transformation’ state.
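Putting the projection and reset functions together, a typical setup might look like this (a minimal PyOpenGL sketch, not code taken from the MakeHuman source):

    from OpenGL.GL import glMatrixMode, glLoadIdentity, GL_PROJECTION, GL_MODELVIEW
    from OpenGL.GLU import gluPerspective

    def setupProjection(width, height):
        glMatrixMode(GL_PROJECTION)    # make the projection matrix current
        glLoadIdentity()               # reset it to the 'no transformation' state
        gluPerspective(45.0, float(width) / height, 0.1, 100.0)
        glMatrixMode(GL_MODELVIEW)     # switch back for modelview transformations
        glLoadIdentity()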
 
 
==== Coordinates ====
 
 
OpenGL world space coordinates are unitless, so 2 units can represent millimetres, inches or light years. Interpretation of world space coordinates is up to the application. OpenGL uses a right-handed coordinate system, so, if you are looking at the origin with the +X axis stretching away to the right and the +Y axis pointing straight up, the +Z axis will be heading out of the screen straight towards you and the -Z axis will be disappearing away from you into the distance.
 
 
 
MakeHuman uses the default OpenGL modelview camera settings which are equivalent to invoking gluLookAt(0,0,0, 0,0,-1, 0,1,0). This places the camera at the origin, looking straight into the model along the -Z axis with the +Y axis pointing straight up. This means that a 2 unit wide and 2 unit high object centred at <0,0,-1> will roughly fill the viewport, as will a 4x4 object at twice the distance from the viewpoint.
 
 
==== Drawing objects ====
 
 
MakeHuman contains multiple objects. The Humanoid object is the main object, but it is surrounded by a set of control objects. Each object is constructed using triangular faces. The position and orientation of each face is defined relative to the object of which it is a part. The positions and orientations of the vertices and normals of the faces are defined relative to the face they’re on.

The sequence in which face vertices are defined is significant in OpenGL. If the face is viewed from the front, the sequence of points runs counter-clockwise; if the face is viewed from the back, the points run clockwise. Faces viewed from behind may be invisible or may have a different color/texture defined than the front face. This order is also called the "winding order" of vertices, and is used by MakeHuman to calculate the normal of each face. If the winding order is reversed, the normal will be flipped.

The glPushMatrix and glPopMatrix functions can be used to store away a copy of a matrix that can subsequently be restored. MakeHuman uses this before and after drawing each object to store a copy of the modelview matrix and restore it, as in the sketch below.
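A sketch of that push/draw/pop pattern (illustrative; obj.position is an assumed attribute, and the actual MakeHuman drawing code is more involved):

    from OpenGL.GL import glPushMatrix, glPopMatrix, glTranslatef

    def drawObject(obj):
        glPushMatrix()               # save the current modelview matrix
        glTranslatef(*obj.position)  # apply the object's own placement
        # ... draw the object's faces here ...
        glPopMatrix()                # restore the matrix for the next object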
 
 
 
 
=== Directory structure and core modules ===
 
==== Overview ====
 
The main MakeHuman folder in the source repository contains the following main source folders:
 
* apps
 
* core
 
* data
 
* docs
 
* icons
 
* lib
 
* licences
 
* plugins
 
* shared
 
Of these, two folders (licences and docs) are associated with the general housekeeping and installation aspects of MakeHuman. Details of the housekeeping folders will be summarized, where relevant, elsewhere in the documentation. The remaining folders are discussed below.
 
 
==== Directory Structure and the Build System ====
 
During the PyInstaller build process, only files in the makehuman/data/ directory are included as data files, which means that any other data files will be skipped, causing file-not-found errors. Thus, the developer should place all data files under the data/ folder. The idea is to be able to cleanly identify what is data to be loaded at run time and what is code that can be compiled at build time. One exception here is target (data) files, which can optionally be precompiled to binary files at build time even though their binaries will reside in the data/ directory. The rationale for this is that target files are static core data assets and should therefore naturally reside in makehuman/data/, but by compiling target files at build time, the size of the build distribution can be reduced and program startup times can be minimized.
 
The files in makehuman/plugins/ form a second exception to the rule that data is loaded at runtime and code is compiled at build time. Files in the makehuman/plugins/ folder are kept as a side effect of the fact that the makehuman/plugins/ path is included as data. It is included as data to make it possible to load plugins at runtime by inspecting the folder. Plugins are compiled as well, though. This does not mean that data may be stored in makehuman/plugins/, as this practice threatens the logical organization of code and data, and it is not guaranteed that it will keep working in the future.
 
Plugins should never be used as imported modules in other plugins or, even worse, in a core module. Currently, the core plugins of MakeHuman all start with a digit (e.g., 9_export_mhx). This is done intentionally to prevent their improper import into other modules in the program, because importing a module whose name starts with a digit produces a syntax error. Core plugins that require data, for example those related to .mhx export, should make use of a subfolder in the makehuman/data/ path (makehuman/data/mhx/).
 
Third-party plugins might have slightly greater latitude about including data than core assets, but the clean separation of code and data is strongly encouraged. Therefore, third-party plugin developers should place data in makehuman/plugins/pluginName/data to make it easily identifiable.
 
==== The makehuman/data/ folder ====
 
Most operating systems set aside separate storage space for code and data, and programs can generally not write data to their own directory structure. In addition, most operating systems support the notion of multi-user login, and provide separate writable data areas for each user to customize. MakeHuman includes some "fixed data assets" that are intrinsic to the program. These data assets are saved in the program's data folder within the program itself. However, MakeHuman also makes provision for third-party and user-developed assets like additional clothes, additional hairstyles, etc. These latter assets are not saved in the program area but rather in user-writable areas under the path userhome/makehuman/. For example, on Windows 7 systems this would be "Documents/makehuman". In populating asset lists, the library holdings are generated based on the fact that syspath/data/itemtype and userhome/data/itemtype can coexist.
 
The makehuman/data/ folder does not contain any code per se. As its name implies, this folder contains the basic data resources that ship with MakeHuman. Often, during the development and building of MakeHuman, a given resource may exist in multiple formats. Many if not most resources initially exist as “text-based” files, which are stored in the data directory with an appropriate extension (see the section on file extensions). When MakeHuman is built for distribution on supported platforms, these text-based data resources are compressed into .npy format to produce a smaller and more rapidly loading distribution package. Typically, the original text-based data files do not ship with standard, downloadable installation builds; instead, the corresponding compressed versions are distributed.
 
The data folder contains many subfolders which correspond to unique data assets.  The details of these assets will not be discussed here, but the reader can get a general idea of asset types just by reviewing the folder names.
 
3dobjs, animations, bvhs, expressions, eyebrows, eyes, genitals, hair, icons, languages, litspheres, materials, mhx, poses, povray, proxymeshes, rigs, scenes, shaders, skins, targets, textures, themes, uvs, vertesgroups, visemes.
 
Many of these data subfolders have subfolders themselves, but generally, this organization is easily discovered just by looking.
 
Several of the data subfolders (e.g., eyes, hair, and clothes) are directed at important geometry assets of the MakeHuman model. MakeHuman supplies a special tool that lets developers or even end-users design additional assets using the 3D modeling program Blender. The name of this tool is MakeClothes, even though it supports the development of the other geometries as well.
 
 
 
==== /makehuman/apps/ ====
 
 
* catmull_clark_subdivision.py - an implementation of the !LINK!http://en.wikipedia.org/wiki/Catmull%E2%80%93Clark_subdivision_surface -- Catmull-Clark subdivision algorithm!/LINK!.

* devtests.py - testing and development use only; can safely be ignored.

* human.py - contains the Human class, which is the core data structure for the MakeHuman model.

* humanmodifier.py - contains the functions and classes that link the GUI sliders to their respective effects on the MakeHuman mesh.

* metadataengine.py - implements algorithms to handle MakeHuman metadata tags within large files. It makes it possible to quickly and easily associate descriptive adjectives (tags) such as "eye" or "brown" with the complete file path to an object that has these characteristics.

* mh2proxy.py - (unknown)

* posemode.py - contains classes and functions relevant to entering and exiting pose mode, and altering a figure's pose. Changing the pose of a figure will deform the mesh and require the vertex warps to be reset.

* warpmodifier.py - contains classes and functions to handle the source and target of a warp.

* which.py - checks whether or not a program exists (needed for the GUI controls).
 
 
==== /makehuman/core/ ====
 
 
* algos3d.py - contains algorithms used to perform high-level 3D transformations on the 3D mesh that is used to represent the human figure in the MakeHuman application.

* aljabr.py - contains the most common 3D algebraic operations used in MakeHuman, including vector, matrix, quaternion, and various other operations.

* animation3d.py - contains functions and classes to animate a wide range of objects. Includes support for rotation, scaling, translation, camera movement, and a variety of interpolation methods.

* compat.py - (unknown)

* download.py - contains classes and functions to download media from the web and import it into MakeHuman.

* events3d.py - contains classes to allow an object to handle events resulting from keyboard or mouse input, window resizing, and changes to the human model within MakeHuman.

* export.py - (not fully implemented) contains classes to export the MakeHuman model.

* files3d.py - contains functions to convert other 3D file formats to and from the internal format used by MakeHuman.

* geometry3d.py - contains classes for commonly used 2D and 3D objects such as rectangles, cubes, and flat meshes.

* gui3d.py - contains classes defined to implement widgets that provide utility functions to the graphical user interface.

* mhmain.py - contains the operations and event handlers that provide the core functionality of MakeHuman.

* module3d.py - contains classes and functions to handle the appearance and behavior of 3D objects, such as shading, texturing, visibility, transparency, and creating groups of faces.

* selection3d.py - contains classes and functions that allow users to select elements within a 3D scene by clicking on them with the mouse, using a technique called "!LINK!http://www.opengl.org/archives/resources/faq/technical/selection.htm -- Selection Using Unique Color IDs!/LINK!".

* textures3d.py - contains functions to perform standard processes on bitmaps and translate UV coordinates to a pixel index in a bitmap.

* transformations.py - a library for calculating 4x4 matrices for translating, rotating, reflecting, scaling, shearing, projecting, orthogonalizing, and superimposing arrays of 3D homogeneous coordinates, as well as for converting between rotation matrices, Euler angles, and quaternions. Also includes an Arcball control object and functions to decompose transformation matrices.

* warp.py - contains classes and functions to warp vertex locations from a source character to a target character. This makes it possible to correctly combine several morphs that affect overlapping regions of the body.
 
 
==== /makehuman/lib/ ====
 
 
* camera.py - handles camera events such as changing focus, camera mode, and field of view.

* core.py - sets default global variables.

* debugdump.py - handles creating a debug text file in the user's home directory and writing relevant debug information to that file.

* filechooser.py - a Qt-based file chooser widget that allows the user to preview and select files, as well as sort them by name, creation date, modification date, and size.

* getpath.py - utility module for finding the user's home path.

* glmodule.py - contains classes and functions to render 3D objects with OpenGL in both draw mode and pick mode.

* gui.py - an alias for qtgui.py.

* image.py - handles flipping and resizing images, as well as converting between RGB, ARGB, and greyscale.

* imageqt.py - handles loading and saving RGB and ARGB images.

* inifile.py - contains functions for formatting and parsing .ini files.

* language.py - handles language file loading and translation.

* log.py - extends the functionality of Python's logging module. The logging module can be used to create log files of events for debugging, or to display information or warnings within MakeHuman. It is used instead of print statements.

* matrix.py - uses the NumPy package to define standard matrix operations.

* mh.py - Python compatibility layer that replaces the old C functions of MakeHuman.

* object3d.py - defines the Object3D class.

* profiler.py - defines functions to handle profiling (how long various parts of the program took to execute).
 
 
 
 
=== File formats and extensions ===
 
This section briefly reviews the files accessed or manipulated within MakeHuman. An attempt will be made to explain the function of each file format and where in MakeHuman’s modules it is used. For external formats, links will be provided to more detailed descriptions. Where it can help the developer better understand the overall organization of the MakeHuman source code, some information on the module group or modules interacting with the various formats will be provided. This section is intended to be used with the previous section on directories (folders) to develop an overview of MakeHuman code structure. This structural understanding is essential during the debugging process and to expanding the program's functionality in structurally sound ways.
 
 
==== .obj files ====
 
 
This extension designates Wavefront object files. These OBJ files store mesh information and metadata needed for rendering in a text-readable format. The format was originally developed for Wavefront's 3D graphics software, but is now broadly supported by many, if not most, 3D programs. Internally, MakeHuman uses OBJ files (or formats directly based on OBJ files) for the canonical storage of its assets. Many of MakeHuman’s development utilities store newly generated assets in formats related to the Wavefront OBJ specification. Examples of assets canonically stored in Wavefront OBJ format include the base human, and auxiliary geometry assets like eyes, hair, and clothes.
 
Each OBJ file has a record specifying the object’s name and records specifying one or more mesh objects. Mesh objects, in turn, are composed of record groups specifying points or vertices (v), vertex normals (vn), faces (f), and vertex textures (vt). In MakeHuman, and in most general-purpose 3D programs (e.g., Blender), OBJ files are exported with an accompanying Wavefront material file having the .mtl extension. These MTL files provide the detailed material attributes necessary to get photorealistic rendering of Wavefront objects. A more complete specification of the Wavefront object format can be found at: !LINK!http://www.martinreddy.net/gfx/3d/OBJ.spec -- http://www.martinreddy.net/gfx/3d/OBJ.spec!/LINK!
 
OBJ files and MTL files have a direct internal analog in MakeHuman in the form of the object class, Object3D. This class, along with its properties and methods, is defined in the object3d.py module (lib folder). Object3D objects are central to the internal workings of MakeHuman. The resemblance of Object3D objects to OBJ files is not coincidental: many of the core mesh assets in MakeHuman are developed in the open-source project Blender, and moved to MakeHuman internals using OBJ format. In fact, MakeHuman maintains a separate product [MHTools] designed to support development of new core internal assets.
 
 
==== .mhpxy files ====
 
 
This extension designates compressed binary proxies and clothes.
 
 
==== .mtl files ====
 
 
This extension designates a Wavefront material file. Wavefront MTL files are referenced from within Wavefront OBJ files to specify the materials used in rendering. More complete information on Wavefront material files is available here:!LINK!http://paulbourke.net/dataformats/mtl/ -- http://paulbourke.net/dataformats/mtl/!/LINK!
 
MakeHuman’s use of MTL files has been limited, but with the increased support for POV rendering, more modules may access and utilize MTL files. Currently, MTL-related data and files are developmental. The data folder contains a cube.obj (3dobjs folder) and a plane.obj (mitsuba folder) ostensibly associated with materials.
 
 
==== .npz files ====
 
 
This extension designates compressed binary asset files in MakeHuman (think npz = NumPy zip files). Among the most important assets stored with this extension is target vertex information. These target data determine the motion of each MakeHuman vertex (relative to landmark vertices?) as sliders are moved.
 
The files3d.py module (core folder) verifies the existence of OS-dependent locations of binary mesh files. The primary purpose of the files3d.py module is to transpose imported 3D data into a standard internal format for each of the 3D file formats supported by MakeHuman import. Each importer function is written to return the 3D data in a standard format as a Python list:

[verts, vertsSharedFaces, vertsUV, faceGroups, faceGroupsNames]

The targets.py module (lib folder) is where system assets stored with the NPZ extension are identified, decompressed, and turned into memory-resident structured objects for use by MakeHuman.
 
Code related to NPZ files can also be found in the algos3d.py module (core folder). This module has been part of MakeHuman for a long time, and is used for morphing-related operations. It works with loaded binary translation vectors and applies those translations to morphing targets. It may have some legacy capability to work directly with text-based .target files.
 
The compile_models.py module (lib folder) takes all object mesh files (.obj) and compiles them into binary compressed files (NPZ). Similarly, the compile_targets.py module (lib folder) compiles target files (.target) into binary compressed (NPZ) files. The latter module is referenced by Benjamin Lau in the MakeHuman.spec file, likely for build purposes.
 
Important concept: the root folder of MakeHuman contains a batch file called cleanpz.bat (for Windows) and a corresponding shell script called cleanpz.sh (for Linux) whose function is to delete NPZ files between builds. This “garbage collection” ensures that each build is not working with out-of-date assets. It also reminds the developer that most of the canonical assets are initially constructed in text-readable format. Developers write appropriate constructors to recompile NPZ files during the MakeHuman startup sequence when they are missing. For distribution versions of MakeHuman, pre-compiled assets in the form of NPZ files are distributed as stable assets. When startup code confirms their existence, no recompile is done, producing a better start-up experience for the user. This design greatly increases boot and runtime performance for stable distributions without hampering rapid testing of repository versions in interpreted mode.
 
 
==== .thumb files ====
 
 
This extension designates thumbnail (image preview) files in MakeHuman. Underneath, they are actually images constructed in the .png file format; thus, they can be edited by standard image editors like GIMP or displayed by most web browsers. The MakeHuman Save tab code automatically generates a companion .thumb file each time it saves a .mhm file. The .thumb generation is currently done by capturing a rectangular region in the center of the viewport window. The MakeHuman Load tab code uses .thumb images to provide previews of saved models.
 
The format for .thumb files is defined in the image_qt.py module (GUI folder). Their use at load time and generation at save time occur in the guiload.py and guisave.py modules, respectively (lib folder). Numerous plugin modules reference .thumb files (plugins folder), including: 2_posing_expression.py, 3_libraries_clothes_chooser.py, 3_libraries_eye_chooser.py, 3_libraries_material_chooser.py, 3_libraries_polygon_hair_chooser.py, 3_libraries_posing_chooser.py, 3_libraries_proxy_chooser.py, 4_rendering_scene.py, and 7_material_editor.py.
 
 
==== .lmk files ====
 
 
This MH internal extension designates landmark files. [It might be renamed .mhlmk for naming standard reasons.] There is one set of landmarks for the model body and a second set of landmarks for the model head (data | landmarks folder). These LMK files are written in text-readable format. Each line has a single integer which specifies a landmark vertex in the model geometry. Landmark vertices are used for relative (local) spatial reference during model morphing.
 
Landmark files are used by warpmodifier.py (apps folder) and warp.py (utils | tools | warptarget folder). The landmarks in these files are used as references during the complex process of reshaping the human.
 
 
==== .mhclo files ====
 
 
This MH internal extension designates MakeHuman clothes files. These files define clothing vertices and remove vertices from the model to allow fitting of the model without skin showing through. They are created using the MakeHuman Blender toolset, specifically the MakeClothes tool.
 
A special empty MakeHuman clothes meta-file named clear.mhclo is used to restore the default UV map that comes with the system. In the lib folder, the filechooser.py module has a MhcloFileLoader.Refresh() method to clear materials when the refresh button is clicked.
 
In the plugins folder, 3_libraries_clothes_chooser.py, 3_libraries_eye_chooser.py, and 3_libraries_polygon_hair_chooser.py all access MHCLO files, but as of this writing, there are some file handling issues related to proxy clothes which mean some of this code is commented out. In the shared/export_utils folder, config.py accesses a cage.mhclo file. In the tools folder, the ‘MakeClothes’ tool accesses MHCLO files in both the makeclothes.py and material.py modules. Similarly, the ‘MakeTarget‘ tool accesses MHCLO files in several of its modules: __init__.py, convert.py, import_obj.py, mt.py, and settings.py. Finally, two experimental tools in the utils folder, the makeface.py and helpers.py modules, access MHCLO files.
 
 
==== .mhmat files ====
 
 
This extension designates MakeHuman material files. Material files are text-readable files that share their syntax structure with Wavefront OBJ and MTL files.
 
In the apps module group, human.py applies a default material, and mh2proxy.py handles materials for proxies and may do some special material handling for MHX Blender export files. In the data folder, the MHCLO files for clothes, eyes, and hair each reference an MHMAT file. In the plugins folder, 7_material_editor.py loads, edits, and saves MHMAT files. In the shared folder, material.py parses MHMAT files and sets material properties. In the tools folder, there is another (independent) material.py module for the MakeClothes tool that writes materials to the output file while using that tool within Blender.
 
==== Description of the file format ====
 
The entries inside a .mhmat file consist of a property name followed by its value(s). Comments can be included in C-style line comment format (//) or in Python-style line comment format (#), as in the sketch below.
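A short illustrative sketch of the syntax (the property names follow the list below; the values shown are made up):

    // a C-style line comment
    # a Python-style line comment
    name MyMaterial
    diffuseColor 1.0 0.8 0.6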
 
 
The properties of a MakeHuman material are numerous, so that they can cover the most common needs of exporters and rendering engines. They include the following:

* name - The internal name of the material.

* Colors - Their value is in R G B format, like: colorProperty R G B, where R, G and B are floating point numbers between 0 and 1. They control the color of various material properties and effects. To adjust the intensity of an effect, you can simply edit its color value.

* ambientColor - The ambient component of the material, which is usually added to the final color regardless of the surrounding lights.

* diffuseColor - The diffuse component of the material, which reacts to each light's relative position.

* specularColor - The color of the specular shine that is caused by the reflection of a light source on the material.

* emissiveColor - The color of the light that the material seems to emit, as if it were a glowing light source.

* shininess - The sharpness of the shine spots created by the reflection of a light source on the material, i.e. how glossy its surface is.
 
 
 
 
 
==== .mhuv files ====
 
 
This MH internal extension designates UV map files in MakeHuman. These are text-readable files based on Wavefront OBJ syntax. They contain one or two lines of metadata and alternative UV data for the texture object to be applied to the model.
 
 
There does not currently appear to be a development tool for creating UV map files. They can be created by creating and saving an OBJ file and then renaming it. However, files created in this manner should not be considered fit for the repository or website, because they contain far too much redundant information: OBJ files created in this manner not only contain the alternate UV set but also all vertices of the original mesh.
 
 
==== .mhp files ====
 
 
This MH internal extension designates pose files. These files are used to store coordinates for the run and walk poses on the pose tab. The dance1 pose appears to be implemented as a pose file, but it also includes some additional .target files. The dance1 pose is also associated with a .bvh file, whereas run and walk are not. In A8 development there have been issues with the dance1 file behaving less well than run and walk; hopefully, these will be long gone in subsequent development.
 
 
==== .mht files ====
 
 
 
This MH internal extension designates a theme (GUI color scheme) for the GUI as rendered by the PyQt library.
 
 
 
==== .target files ====
 
 
This MH internal extension [which lacks the mh… prefix] specifies the final extreme positions that can be achieved when moving the interface sliders in one direction or the other. By definition, sliders have a value of zero on the left end and 1.0 on the right end, but there is a software flag to reverse this behavior as desired. With scripting it may be possible to exceed slider extremes, but this is currently undocumented and should not be built into production code. There is one TARGET file for each extreme of each movement. These files are hand-built by MakeHuman artist-developers, and they are the engine of the program's utility. Typically, rightward movement of a slider should increase size, and leftward movement should decrease size. When ‘left’ and ‘right’ are used as asset names, they should refer to the left and right body part on the model, regardless of the model’s orientation in the viewport. When slider movement involves asymmetrical change in the x, y, and z directions, it should be programmed so that the sliders operate naturally using the default viewport view for that slider. For example, left movement of the slider need not cause deformation toward the model’s left side; such behavior would only be sensible if the default viewport view had the model facing the rear of the viewport (model’s back toward the user).
 
Most MakeHuman modules use data from the TARGET files.

* In the apps group, these modules include: devtests.py, human.py, humanmodifier.py, posemode.py, and warpmodifier.py.

* In the core group, these modules include: algos3d.py, which is the main program that works with .target files after they are compiled into binary (NPZ) form.
 
=== .proxy files ===
 
This MH internal extension designates proxy meshes.  They are structured text files.  Format and attributes include ...  TODO
 
=== .qrc file ===

This is a resource file needed when compiling the MakeHuman GUI, which uses the PyQt library.
 
=== .qss file ===
 
This is the text-based file format used to store the style sheets applied to the GUI widgets as rendered by the PyQt library.
 
 
=== .mhm files ===
 
The basic work done within MakeHuman is saved as a file with an .mhm extension. The .mhm files can be loaded at the start of a new user session, returning MakeHuman to the state that it was in at the end of the previous session.
 
 
 
 
=== Libraries and build procedures ===
 
With each MakeHuman release, packages are created for MS Windows, Mac OS X, and Debian derivatives of Linux. At present there is not an operating-system-independent package (although the raw repository will work on all platforms). Each platform has special considerations that are addressed by the designated maintainer for that platform.
 
=== Dependencies ===
 
In order to run MakeHuman and/or build packages, the following need to be installed on the system:
 
* Python 2.7 (Python 3.x will not work; !LINK!http://www.python.org/download/ -- http://www.python.org/download/!/LINK!)
 
* Python-numpy (!LINK!http://sourceforge.net/projects/numpy/files/NumPy/ -- http://sourceforge.net/projects/numpy/files/NumPy/!/LINK!)
 
* Python-opengl (!LINK!http://pyopengl.sourceforge.net/ -- http://pyopengl.sourceforge.net/!/LINK!)
 
* Python-qt4 (!LINK!http://www.riverbankcomputing.com/software/pyqt -- http://www.riverbankcomputing.com/software/pyqt!/LINK!)
 
* Python-qt4-gl (package required by some Linux distributions only)
 
For Linux there are scripts (buildscripts/deb/install_deb_dependencies.bash and buildscripts/rpm/install_rpm_dependencies.bash) which will install all the necessary dependencies. Please note that these scripts may not agree with all Debian- and RPM-based distros, so it may be worth reading the bash script contents prior to running them.
 
=== Assets ===
 
For obvious reasons, the binary assets (characters, textures, clothes, etc.) are not stored in the Mercurial repository, but in two different places:

* Immutable single-archive files of the assets of finished versions of MH are stored (disregarding the patch version token) on Bitbucket (!LINK!https://bitbucket.org/MakeHuman/makehuman/downloads -- https://bitbucket.org/MakeHuman/makehuman/downloads!/LINK!)

* Assets still under development are stored on the Tuxfamily FTP: !LINK!http://download.tuxfamily.org/makehuman/assets/ -- http://download.tuxfamily.org/makehuman/assets/!/LINK!
 
When a new version is released, the current FTP asset tree gets zipped and uploaded to Bitbucket. In the Tuxfamily asset store, the contents of that version's folder are replaced with an archive_url.txt file that points the download script to the right URL.

A new folder gets created on the Tuxfamily host containing the assets to be included in the next release. For a released version, assets will usually not change, except to fix bugs (in which case the previous archive is best kept, a new one is created alongside it with a slightly different name, and the archive_url.txt file is updated).
 
To download the assets automatically and place them in the correct folder in a local repository, it's sufficient to run the Python script with the command:
 
=== Other scripts ===
 
In the makehuman root, there are a few scripts which need to be run:
 
* cleannpz.sh (cleannpz.bat)
 
* cleanpyc.sh (cleanpyc.bat)
 
* compile_targets.py
 
* compile_models.py
 
* compile_proxies.py
 
Note that all these are called automatically in the correct order by the "build_prepare.py" script in "buildscripts" when running one of the pre-existing package building routines below. However, when running MakeHuman from a source checkout, you will need to run these scripts yourself, in the above order.
 
=== General proceedings for making a package ===
 
On all platforms the following steps should be taken when making the package:
 
* Update from HG repository (!LINK!https://bitbucket.org/MakeHuman/makehuman -- https://bitbucket.org/MakeHuman/makehuman!/LINK!)
 
* Run the scripts in the makehuman root directory. These will, for example, compile target files to npz files.
 
* Copy everything that's relevant to a work directory, excluding at least *.target files, the utils directory and the tools directory (strictly speaking there is more that is not necessary, but it has only a minor influence)
 
* Do platform-dependent stuff to the work directory
 
* Zip the directory into a suitable installable format for the platform.
 
 
==== Linux ====
 
 
===== Debian =====
 
In order for MakeHuman to work on Debian, the following dependencies are needed.
 
* python
 
* python-qt4
 
* python-numpy
 
* python-opengl
 
* python-qt4-gl
 
These can be installed by running the command below:
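On Debian and derivatives this is typically done with apt-get (a hedged sketch; the package names are exactly those listed above):

    sudo apt-get install python python-qt4 python-numpy python-opengl python-qt4-gl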
 
On Debian and derivatives (Ubuntu, Kubuntu, etc.), the whole package building is automated through the buildDeb.py script found in the "deb" folder.
 
 
 
For the Deb script, see: !LINK!https://bitbucket.org/MakeHuman/makehuman/src/tip/buildscripts/deb/?at=default -- https://bitbucket.org/MakeHuman/makehuman/src/tip/buildscripts/deb/?at=d...!/LINK!
 
 
 
To build a deb file, create an empty directory (for example /tmp/deb) and run:
 
 
When the script has finished, the deb file will be available in /tmp/deb/output. There are some settings in the head of the buildDeb.py script for tweaking the output.
 
 
===== Fedora 19 64-bit RPM Packaging =====
 
These instructions have been written for and tested on Fedora 19 64-bit. You will never be able to run the MakeHuman HG version on distros such as RHEL/CentOS 6.4 or earlier, since they do not support Python 2.7, not even if you enable RPMForge. The instructions may or may not work on other RPM-based distros.
 
* Install a Mercurial client (Hg) and clone the !LINK!https://bitbucket.org/MakeHuman/makehuman -- MakeHuman BitBucket repository!/LINK!.
 
* Install required dependencies.
 
  As root, run the bash script for installing the required dependencies.
 
  buildscripts/rpm/install_rpm_dependencies.bash
 
  This script also installs optional but recommended dependencies. If you only want the strictly required dependencies, run
 
 
* Run MakeHuman. Change the working directory to the root of the makehuman tree:
 
 
 
Then run:
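Assuming the standard makehuman.py entry script in the root of the tree, the launch command is typically:

    python makehuman.py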
 
You will most likely want to do this as your normal desktop user, not as root.
 
 
==== Windows ====
 
 
* Install a Mercurial client (!LINK!http://tortoisehg.bitbucket.org/download/ -- the TortoiseHG client!/LINK! is a good choice on Windows) and clone the !LINK!https://bitbucket.org/MakeHuman/makehuman -- MakeHuman repository!/LINK! (see !LINK!http://www.makehuman.org/doc/node/development_infrastructure.html -- http://www.makehuman.org/doc/node/development_infrastructure.html!/LINK!)
 
* Install required dependencies
 
  If you are using a 64-bit Windows version (only applies to 64-bit computers), you can choose to use either 32-bit Python or 64-bit Python. However, it is important that your library dependencies (NumPy, PyOpenGL, and PyQt4) are 32-bit if you use 32-bit Python and 64-bit if you use 64-bit Python.
 
 
 
* Run MakeHuman.
 
  Start a command console (cmd.exe) and change directory to the makehuman folder. Then run:
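As on Linux, assuming the standard makehuman.py entry script, this is typically:

    python makehuman.py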
 
 
 
==== Mac OSX ====
 
 
To build MakeHuman for Mac OS X, you should:
 
* Download the MakeHuman OSX Builder from: !LINK!https://bitbucket.org/MakeHuman/makehuman-osx-builder -- https://bitbucket.org/MakeHuman/makehuman-osx-builder!/LINK!
 
* Follow the included instructions (Instructions.rtf)
 
 
 
 
=== Development infrastructure ===
 
The development of MakeHuman is based on two fundamental tools:
 
* Mercurial (HG), a stable and robust platform for distributed revision control, with the main repo hosted on!LINK!https://bitbucket.org/MakeHuman/makehuman -- Bitbucket!/LINK!
 
* A collaborative software!LINK!http://bugtracker.makehuman.org -- development platform!/LINK!, based on!LINK!http://www.redmine.org/ -- Redmine!/LINK!.
 
 
==== Get the code from BitBucket. ====
 
 
Obtaining the code from the BitBucket repository is very simple, but you need to have Mercurial installed on your system.

It's natively available on Linux systems, while for Microsoft Windows a good choice is TortoiseHG (!LINK!http://tortoisehg.bitbucket.org/ -- http://tortoisehg.bitbucket.org/!/LINK!).
 
To clone the repository to your PC, just use this command:
 
hg clone !LINK!https://bitbucket.org/MakeHuman/makehuman -- https://bitbucket.org/MakeHuman/makehuman!/LINK!
 
With TortoiseHG, you can do it visually, with:
 
right-click --> TortoiseHG --> clone
 
 
==== Using Redmine in MakeHuman development ====
 
 
Redmine is a very powerful tool.
 
[To expand with technical information about the roadmap organizations, the subprojects, and how to use it to accept experimental features]
 
 
 
 
=== Development organization ===
 
!IMAGE!Pictures/mh-hg-infographic01.png!/IMAGE!
 
The Unstable branch is where the development happens. This is our working branch, so we can also refer to it as the default branch.
 
The Stable branch is used for official release only.
 
 
 
It's just for maintenance of the current official version. It's rarely used, except in the case of very noticeable bugs that call for a service release. Any bugfix in Stable must be merged into Unstable.
 
When Unstable is ready for release, it's merged into Stable and a tag (a sort of bookmark) is created on this exact revision.
 
!IMAGE!Pictures/mh-hg-infographic02.png!/IMAGE!
 
Working on an experimental clone for new features in early development is a great comfort for the developer.

The developer doesn't have to keep the code only on the local drive, since it can be committed immediately.

It doesn't matter if the code does not work yet or breaks the app; it is the private playground of the developer.
 
!IMAGE!Pictures/mh-hg-infographic03.png!/IMAGE!
 
To start work on a new feature that is yet unproven, or whose exact implementation is unclear, or on a big refactor that might cause a lot of problems until solved, we create experimental repos by cloning the main one: one repo per (large) feature.
 
!IMAGE!Pictures/mh-hg-infographic04.png!/IMAGE!
 
Once the developer has something to show, he can communicate with other team members so  they can:
 
* clone the experimental repo to a new folder on their local disk
 
* test it
 
* give feedback
 
* perhaps even commit fixes
 
Once a feature is proven to work well enough and is accepted by the rest of the team, it can be merged into the official unstable branch, where it can be integrated into the nightly build version and further finalized (it can then be tested by a broader audience, and we will receive bug reports from them).
 
!IMAGE!Pictures/mh-hg-infographic05.png!/IMAGE!
 
It is also possible for an external programmer, without commit rights on the official repo, to make changes on his own personal clone (which he can make public on his own account if he desires).
 
 
 
This is a good way to vet new team members: we can accept changes after verifying them. When we trust a new developer, we can give him direct access to the official repo.
 
 
 
=== Getting started ===
 
The best way to start your experiments with MakeHuman code is to clone our official repository and start to modify it.
 
 
 
In order to proceed easily, it is recommended you use the tools available at !LINK!https://bitbucket.org -- https://bitbucket.org!/LINK!.
 
The general procedure is:
 
* Go to !LINK!https://bitbucket.org -- https://bitbucket.org!/LINK! and sign up (if you don't already have an account)
 
* Go to !LINK!https://bitbucket.org/MakeHuman/makehuman -- https://bitbucket.org/MakeHuman/makehuman!/LINK!
 
* Click "fork" (it's hidden under "..." in the button menu)
 
!IMAGE!Pictures/fork.jpg!/IMAGE!
 
* Enter a name and description of your choice (the fork will end up on your account, so it doesn't matter what you call it). Use "fork at tip" and don't check "this is a private repository"
 
* You should be redirected to the new repository on bitbucket
 
* Clone your new repository to your local hard drive (if you need a primer on how to clone or use bitbucket, !LINK!https://confluence.atlassian.com/display/BITBUCKET/Bitbucket+201+Bitbucket+with+Git+and+Mercurial -- see their tutorial!/LINK!)
 
* Make your code changes locally. Do not make a feature branch, work directly on the default branch.
 
* Commit and push the changes to your repository on bitbucket
 
* Go to your repository on bitbucket and click "pull request" (in the button menu at the top right of the page)
 
* Write a good description for what your changes concern and click "create pull request"
 
Once you have done this, the MakeHuman team will get a notification that there is an incoming code change. This will be reviewed and either merged or rejected.
 
If you want to continue working with updates in the future, make sure your fork is up to date with changes in the makehuman repository: go to your repository, click the "..." button in the button menu, choose "compare", and click the compare button on the page you get to. Merge any incoming changes.
 
If your tree is hopelessly out of sync with the makehuman tree, just delete it and make a new fork. Better that than accidentally getting junk in your pull requests.
 
Before starting with any of the above, you might want to read up on how to run makehuman from a source checkout rather than from a binary download. The short story is that you need to install a few dependencies. More information can be found here:!LINK!http://www.makehuman.org/doc/node/libraries_and_build_procedures.html -- http://www.makehuman.org/doc/node/libraries_and_build_procedures.html!/LINK!
 
  
  



== Application design and Code overview ==

=== Structural Organization of MakeHuman ===

MakeHuman is organized hierarchically. There is a root application (type gui3d.Application) that handles rendering of objects (guicommon.Object). These objects can be added to and removed from the root application. Objects added to the root application are always visible in the canvas. Every added object has an OpenGL counterpart. mhmain.MHApplication inherits from the root application to construct the main application.

A view is a visual context; a tab is a view, and the main application itself is a view. The root application contains Categories. A Category (gui3d.Category) is a specialised view object which contains multiple taskview objects (gui3d.TaskView). Taskview objects are specialised view objects with a pane and tab. In the context of the MakeHuman interface, Category objects constitute the upper row of tabs, while taskview objects constitute the lower row of tabs.

Objects (guicommon.Object) can be added to and removed from a taskview. Objects added to a taskview are only visible if the taskview is visible. For example, a skeleton added to the skeleton chooser is only visible when the skeleton chooser taskview is visible. Objects added to the root application are always visible; thus objects like the human and all proxies such as clothes are added to the application, because they should always be visible. Visibility can additionally be disabled by setting the visibility flag on objects, but an object that is not added to a currently visible taskview, or to the application, is not visible even if its visibility flag is set. An object can only be added to one context: it is either added to one taskview, or to the application, never to two different taskviews.

Apart from the root application, there is another application class (type qtui.Application) which implements the MH GUI structures using the Qt libraries. mhmain.MHApplication inherits from this application too, for handling of GUI content. The general structure in MH can be represented as: !IMAGE!Pictures/mh-inheritance.png!/IMAGE!

==== Basics of event handling in MakeHuman ====

MakeHuman is a GUI-based, interactive Qt application in which objects interact by sending messages to each other. The Qt class QEvent encapsulates the notion of low-level events like mouse events, key press events, action events, etc. A Qt application is event-loop based, which is basically a program structure that allows events to be prioritized, queued and dispatched to application objects. In a Qt-based application, there are different sources of events: some events, like key events and mouse events, come from the window system, while others originate from within the application. When an event occurs, Qt creates an event object to represent it by constructing an instance of the appropriate QEvent subclass, and delivers it to a particular instance of QObject (or one of its subclasses) by calling its event() function. This function does not handle the event itself; rather, it calls an event handler based on the type of event delivered, and returns an acknowledgement based on whether the event was accepted or ignored. The QCoreApplication::exec() method enters the main event loop and waits until exit() is called; it is necessary to call this function to start event handling. In MH, this is done in lib.qtui.Application, which inherits from QApplication, which in turn inherits from QCoreApplication:

    def start(self):
        self.OnInit()
        self.callAsync(self.started)
        self.messages.start()
        self.exec_()
 

In MH, event handling falls into two broad categories:

* Event handling for containers (application, categories, taskviews)
* Event handling for widgets (contained in containers)

Event handling for containers:

MH is organized hierarchically in the context of containers. At the top is the main application (core.mhmain.MHApplication). It contains categories (core.gui.Category), which in turn contain taskviews (core.gui.TaskView). In a Qt-based application, there are five different ways of processing events, as listed below:

* Reimplementing an event handler function like paintEvent(), mousePressEvent() and so on. This is the most common, easiest, but least powerful approach.
* Reimplementing QCoreApplication::notify(QObject *receiver, QEvent *event). This is very powerful, providing complete control, but only one subclass can be active at a time. Qt's event loop and sendEvent() calls use this approach to dispatch events.
* Installing an event filter on QCoreApplication::instance(). Such an event filter is able to process all events for all widgets. It's just as powerful as reimplementing notify(); furthermore, it's possible to have more than one application-global event filter. Global event filters even see mouse events for disabled widgets. Note that application event filters are only called for objects that live in the main thread.
* Reimplementing QObject::event() (as QWidget does). If you do this you get Tab key presses, and you get to see the events before any widget-specific event filters.
* Installing an event filter on the object itself. Such an event filter gets all the events, including Tab and Shift+Tab key press events, as long as they do not change the focus widget.

In MakeHuman, approaches 2 and 4 are used extensively for event handling. As part of strategy 2, in lib.qtui.Application (which extends QApplication), notify has been reimplemented:

    def notify(self, object, event):
        self.logger_event.debug('notify(%s, %s(%s))', object, event, event.type())
        return super(Application, self).notify(object, event)

Here, object is the receiver object. Since only one reimplementation of notify can be active at a time, the class implementing it is effectively a singleton. In MakeHuman, MHApplication subclasses lib.qtui.Application; the MHApplication object is the main application object. Its notify function receives a notification from the event loop in the underlying Qt layer about each event, which is then relayed to the receiver object's event method. The receiver object's event method is either inherited from QObject or reimplemented, and returns True or False to indicate whether the event was handled. lib.qtui.Application also implements an event function, which is called if the receiver object is the MHApplication object itself. lib.qtui.Application.event() then checks whether it is a user-defined event; if so, True is returned, otherwise the superclass's event() is called. If True is returned, Qt dispatches the event to the receiver object's callEvent function, which determines which function is to be called on the object to handle the event. The called function then handles the event or propagates it to its parent (or to the current task, if it is the application object), as may seem fit.

Event handling for widgets:

Most MH widgets are wrappers around Qt widgets. These widgets (defined in module lib.qtgui) inherit from the respective Qt widgets and from the Widget class (lib.qtgui.Widget). MH makes use of the signal and slot mechanism by connecting signals originating from the Qt layer to the corresponding handler functions. For example, the class TabsBase connects the signal 'currentChanged' to its method 'tabChanged' as follows:

    class TabsBase(Widget):
        def __init__(self):
            super(TabsBase, self).__init__()
            self.tabBar().setExpanding(False)
            self.connect(self, QtCore.SIGNAL('currentChanged(int)'), self.tabChanged)
            ...


Any class inheriting from TabsBase (e.g. lib.qtgui.Tabs or lib.qtgui.TabBar) also gets this facility. So whenever the signal 'currentChanged' is emitted from the Qt layer, the function 'tabChanged' is called; the result is that when a user clicks the mouse on a tab, the 'tabChanged' code runs. Similarly, the MH Slider widget connects the various slider operations to its event handling functions:

    ...
    self.connect(self.slider, QtCore.SIGNAL('sliderMoved(int)'), self._changing)
    self.connect(self.slider, QtCore.SIGNAL('valueChanged(int)'), self._changed)
    self.connect(self.slider, QtCore.SIGNAL('sliderReleased()'), self._released)
    self.connect(self.slider, QtCore.SIGNAL('sliderPressed()'), self._pressed)
    ...


Overriding vs event decorators:

The View, Category and Application classes inherit from events3d.EventHandler, hence they have the callEvent() function. The events that apply to this category are usually application-wide. If the user makes a change that impacts the whole application, and the whole application must know about it, the best way is to call callEvent() for all the taskviews of the application, as follows:

    for category in self.categories.itervalues():
        for task in category.tasks:
            task.callEvent('onMyEvent', params)

A good example is the event of MakeHuman's scene changing (core.mhmain.MHApplication._sceneChanged). When callEvent() is called on a taskview object that implements onMyEvent, that method is invoked. In core.gui3d.View one can also see that most of the events (onShow, onMouseDown, ...) on the taskview are propagated by default to their parents, the categories. The categories are also views, so the events are propagated again to their parent, the application. If, again, the application or any category has an onMyEvent method, it is executed. There are some events that affect the application but only a single task at a time, for example the onMouseDown event: it only happens to the active taskview (the others are hidden, so they receive no mouse events). Once the mouse is pressed, an Application.currentTask.callEvent("onMouseDown", event) is issued. This causes the onMouseDown event to be received by the active taskview, its parent category, and finally the application; the call executes any onMouseDown method in these objects.

Apart from the application-wide events described above, there are events used for local purposes. Such events are handled with event decorators. Suppose we have a new taskview, FooTaskView, which has a 'mybar' member variable of type Bar (an events3d.EventHandler). When self.mybar executes code, we may want to inform its parent about an event that just happened. So, inside the code for Bar, there is a self.callEvent('onBaz', 42) command: in the Bar class we emit the event, and we use the event decorator to let the parent taskview know about it. In the FooTaskView class, below the place where self.mybar is created, we add:

    @self.mybar.mhEvent
    def onBaz(event):
        # code handling the event;
        # here, event == 42.

When self.mybar's Bar reaches that callEvent, the above method, located in FooTaskView, will be executed. The human object of core.mhmain.MHApplication is a perfect example of this approach: Human has an onMouseDown event handler, and we need to relay it to the application.

In the loadMainGui() function of MHApplication, we attach the decorated handler to the human attribute as follows:

    def loadMainGui(self):
        ...
        @self.selectedHuman.mhEvent
        def onMouseDown(event):
            if self.tool:
                self.selectedGroup = self.getSelectedFaceGroup()
                self.tool.callEvent("onMouseDown", event)
            else:
                self.currentTask.callEvent("onMouseDown", event)
        ...

The Application.human.onMouseDown event is caught and starts its trip from the Human: it is captured by the decorated method located in Application.loadMainGui (this is where the method is bound to the event), sent to the currentTask, propagated through the parent category, and finally reaches its destination, the Application. A more intense process happens on Human.onChanged. This is emitted when a save/load happens, so the whole application has to know: it starts from the human, is captured by the decorated method in the app, sent to ALL the taskviews in MH, and finally, through the categories again, to the app.


The MakeHuman Graphical User Interface (GUI):

The MakeHuman GUI is based on the PyQt library which, in turn, is built on the Qt library. Qt is a development framework for the creation of desktop applications and user interfaces. Important GUI classes in MakeHuman are: lib.qtui.Canvas: This is the class in MakeHuman which takes care of rendering OpenGL graphics. It inherits from Qt's QGLWidget class, a widget for displaying OpenGL graphics integrated into a Qt application. QGLWidget is simple to use: you inherit from it and use the subclass like any other QWidget, except that you have the choice between using QPainter and standard OpenGL rendering commands. The Canvas class reimplements three functions from its parent class to perform OpenGL tasks, as sketched after the list:

  • paintGL() - Renders the OpenGL scene. It gets called whenever the widget needs to be updated.
  • resizeGL() - Sets up the OpenGL viewport, projection, etc. Gets called whenever the widget has been resized (and also when it is shown for the first time, because all newly created widgets get a resize event automatically).
  • initializeGL() - Sets up the OpenGL rendering context, defines display lists, etc. Gets called once before the first time resizeGL() or paintGL() is called.
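A minimal, self-contained sketch of such a QGLWidget subclass (not MakeHuman's actual Canvas class; it only shows where the three overrides fit):

    from PyQt4 import QtGui, QtOpenGL
    from OpenGL import GL

    class MinimalCanvas(QtOpenGL.QGLWidget):
        def initializeGL(self):
            # One-time setup of the GL context.
            GL.glClearColor(0.2, 0.2, 0.2, 1.0)

        def resizeGL(self, w, h):
            # Keep the viewport in sync with the widget size.
            GL.glViewport(0, 0, w, h)

        def paintGL(self):
            # Redraw the scene whenever Qt requests an update.
            GL.glClear(GL.GL_COLOR_BUFFER_BIT | GL.GL_DEPTH_BUFFER_BIT)

    app = QtGui.QApplication([])
    canvas = MinimalCanvas()
    canvas.show()
    app.exec_()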

lib.qtui.Application: This is the foundation class which manages the GUI's control flow and main settings. It inherits from QtGui.QApplication and events3d.EventHandler. QApplication contains the main event loop, where all events from the window system and other sources are processed and dispatched; it also handles the application's initialization and finalization. The Application class holds the GUI main window (an instance of lib.qtui.Frame). It receives event notifications from the underlying Qt user interface framework and dispatches them to the appropriate user interface elements in MakeHuman. The GUI architecture in MakeHuman can be depicted as follows:

!IMAGE!Pictures/mh-uiarchitecture.png!/IMAGE! 


Application Design Notes from IRC Chat with Thanassis

Thanasis comments on gaining the big picture of MH coding

  • [16:12:57] <Thanasis> I would avoid describing the different parts of code as different entities
  • [16:13:43] <Thanasis> ie. follow the object-oriented paradigm, and avoid thinking who is inherited by who etc.
  • [16:13:48] <Thanasis> more specifically
  • [16:15:03] <Thanasis> I mean, treat an object as a single object
  • [16:15:08] <Thanasis> I'll show an example
  • [16:15:45] <Thanasis> you have the application. Inheritance says it consists of three classes, QApplication, MHApplication, and mhmain.Application (names may differ a bit).
  • [16:15:58] <Thanasis> but it is only one object
  • [16:16:42] <Thanasis> We don't care if the QApplication activates a function in MH application, because it happens inside the same object
  • [16:16:50] <Thanasis> the application
  • [16:17:18] <Thanasis> we only pay attention to the interactions the application object has with other objects
  • [16:17:43] <Doctor_Hell_> but if you need to find a function, you need to follow the inheritance chain... no?
  • [16:18:24] <Thanasis> of course, in the code, yes. but in a diagram it will make it complex
  • [16:21:02] <Thanasis> well, let's go top-down
  • [16:21:29] <Thanasis> we have: QT, App, Tab, Task
  • [16:21:56] <Thanasis> The event starts from QT, and activates a handler in App
  • [16:22:22] <Thanasis> The App sends a new event to the current Task
  • [16:22:46] <Thanasis> The task handles it, and sends a copy to the parent Tab
  • [16:23:13] <Thanasis> And finally the Tab handles it and sends a copy to the App
  • [16:23:35] <Thanasis> which handles it, and stops
  • [Referring to how distracting it is to describe the big picture of MH, Thanasis comments ...]
  • "Distract? no, this does not consume any brain energy"
  • We love your help - the more the better -- Thanks [RWB]

Comments Regarding MacroTarget slider processing/handling based on naming conventions:

  • [16:36:24] <Doctor_Hell_> another complex part to know is the engine coded by Jonas to automatically handle the targets using their name
  • [16:37:58] <Thanasis> uh. yes, that's a tough one too. I think it does, but I don't remember why and how, because it was a long time ago and Jonas changed the human modifier class since then
  • [16:38:18] <Thanasis> he simplified it, actually, but I haven't seen the new version yet
  • [16:38:34] <Doctor_Hell_> where is the code?
  • [16:38:49] <Thanasis> apps/humanmodifier.py, I think
  • [16:39:01] <Doctor_Hell_> Jonas should be very busy recently, since I asked him but he didn't reply me yet.
  • [16:42:37] <Thanasis> um by the way, reading humanmodifier alone won't help a lot. I would suggest starting from the macro plugin, to trace what happens when a slider is created, and when the user moves a slider. It might be easier this way

Difficulties in understanding relationships between folder names / module names and the object structure

  • [16:39:36] <Doctor_Hell_> Another thing to understand is why we have some code in apps, other in core, other in libs.. the logic is not clear for me..
  • [16:39:52] <Thanasis> ah, ignore logic in that part. :)
  • [16:40:20] <Thanasis> as far as I know it is all because of mh history so far
  • [16:40:29] <Thanasis> the way it developed.
  • [16:40:56] <Doctor_Hell_> I know some of it
  • [16:41:24] <Doctor_Hell_> in the early times, it was planned to have a core folder, with all important main files
  • [16:42:03] <Doctor_Hell_> then an apps folder, to contain many applications based on the core files: makeHuman, makeANime, MakeToon, etc..
  • [16:42:29] <Doctor_Hell_> but shared and libs were added later... I don't know why
  • [16:43:10] <Thanasis> libs, they are classes imported from c++ directly
  • [16:43:48] <Doctor_Hell_> ah ..so only the "shared" folder is the mystery
  • [16:44:04] <Thanasis> shared, they are later classes used by many different parts of the code at the same time
  • [16:44:15] <Thanasis> classes created later*
  • [16:44:28] <Thanasis> ie. material. It's used literally everywhere
  • [16:44:32] <Thanasis> progress too
  • [16:44:58] <Doctor_Hell_> ok..at least now it make sense, thank you

The Doc_Hell Diagram (green and red arrows that even Glynn couldn't understand)

                   [Summary - concentrate on green arrows not red arrows to get the big picture]
  • <Doctor_Hell_> this is an example of a bad diagram: !LINK!https://www.dropbox.com/s/v3waacufp9pyx1u/mouse_event.png?dl=0 -- https://www.dropbox.com/s/v3waacufp9pyx1u/mouse_event.png?dl=0!/LINK!
  • [16:52:48] <Thanasis> well, in your shape, the actual thing is about the green arrows
  • [16:53:19] <Thanasis> the red arrows could have been implemented in hundreds of different ways
  • [16:53:44] <Thanasis> but the important is their result, the green arrows
  • [16:53:53] <Doctor_Hell_> talking about how to draw the objects?
  • [16:54:26] <Thanasis> well, my idea was, do two shapes
  • [16:54:36] <Thanasis> the first without code
  • [16:55:10] <Thanasis> only showing the 5 steps like I explained above, ie, the green arrows
  • [16:55:28] <Thanasis> the second can explain each step in detail using the red arrows
  • [16:55:50] <Thanasis> but in any case the definition is the following
  • [16:56:04] <Thanasis> an event is, you send the name of a method to someone
  • [16:56:14] <Doctor_Hell_> ok...but also we have the description written by Manish. I'll show you it tomorrow
  • [16:56:21] <Thanasis> and that someone executes that method of theirs
  • [16:56:47] <Thanasis> sure, there are many ways to describe it
  • [16:57:00] <Doctor_Hell_> I hope we can find the best one
  • [16:57:07] <Thanasis> I think that they all match though in some certain key points
  • [16:57:52] <Thanasis> if these are filtered out, the explanation may be more simple

And touching on API ideas ...

  • [16:59:09] <Doctor_Hell_> since the core is too complex for average python programmers
  • [16:59:36] <Thanasis> yes... perhaps the code could be organized in tiers.
  • [17:00:00] <Thanasis> core, classes, plugins, scripts
  • [17:00:27] <Doctor_Hell_> ah you are talking of the current code, not the simplified api
  • [17:00:54] <Doctor_Hell_> yes, it's a good idea..but what group under "core" ?
  • [17:01:25] <Thanasis> the application and its interfaces (ie. communication with the system and devices. QT, GL etc.)
  • [17:01:59] <Thanasis> classes are abstract, they use the application by calling its methods
  • [17:02:35] <Thanasis> the application uses the interfaces (QT, GL. these are actually harder than the application and could be a separate tier)
  • [17:02:54] <Thanasis> the plugins use the classes to create objects
  • [17:03:26] <Doctor_Hell_> I see some possible confusion because:
  • [17:04:24] <Doctor_Hell_> - not all core modules are into core folder (for example qtui and qtgui are in libs)
  • [17:04:49] <Doctor_Hell_> - Classes are everywhere
  • [17:05:07] <Thanasis> yes, libs folder consists of both core and classes... indeed
  • [17:05:23] <Thanasis> and interfaces too
  • [17:06:51] <Thanasis> but in general, this separation exists somehow. I'm not sure how, but the past programmers instinctively created it, perhaps for better manageability
  • [17:08:14] <Thanasis> ie. Human is a class, Material too, Armature (probably making name wrong again), and to function I observe that they use callbacks of Application
  • [17:08:38] <Thanasis> They never use interfaces directly
  • [17:09:46] <Thanasis> Even for drawing the human, it is the Application that will give the Human's object3d to OpenGL, not the Human directly
  • [17:10:31] <Thanasis> and plugins use classes ie. they use the callbacks that Human, Material etc. have
  • [17:11:08] <Doctor_Hell_> Do you mean abstraction classes?
  • [17:13:27] <Doctor_Hell_> Thanasis: about new API:!LINK!http://bugtracker.makehuman.org/issues/534 -- http://bugtracker.makehuman.org/issues/534!/LINK!
  • [17:13:42] <Doctor_Hell_> and!LINK!http://bugtracker.makehuman.org/projects/makehuman/wiki/Abstraction_API_for_plugins -- http://bugtracker.makehuman.org/projects/makehuman/wiki/Abstraction_API_...!/LINK!
  • [17:13:57] <Doctor_Hell_> Feel free to add wished api calls


Q & A with Glynn Clements

Question: For an event when the user clicks on a tab, the Tabs class method tabChanged is called. But there is no such method in the QTabWidget documentation.

 Response: 
 

The tabChanged method is defined for qtgui.TabsBase (and overridden for qtgui.Tabs).

Both qtgui.Tabs and qtgui.TabBar inherit from qtgui.TabsBase.

Tabs inherits from both TabsBase and QTabWidget while TabBar inherits from both TabsBase and QTabBar.

Both QTabWidget and QTabBar define the currentChanged signal, which is emitted whenever the current tab changes.

The TabsBase initialiser connects this signal to the tabChanged method (lib/qtgui.py:109):

    class TabsBase(Widget):
        def __init__(self):
            ...
            self.connect(self, QtCore.SIGNAL('currentChanged(int)'), self.tabChanged)

==============================================

Question: Is it true that the qtui module is a low-level module that handles signals directly from QtCore, QtGui and QtOpenGL?

 Response:
 

qtui defines the classes for the application (QApplication subclass), main window (QMainWindow subclass), drawing canvas (QGLWidget subclass), and some associated support classes. In other words, it implements the high-level UI, or at least the aspects which are tied to Qt.

Question: Is it true that the qtgui module is a low-level module that defines the basic widgets of the GUI such as buttons, radio buttons, etc.?

 Response:
 

Yes; qtgui is essentially a compatibility layer between the Qt widget classes and the legacy MH GUI toolkit. The classes provide methods which more closely mimic the behaviour of the original gui3d widgets. Qt events are translated to MH events3d events.

Question: Is it true that the gui3d module is a high-level module that defines the principal public classes of MakeHuman: Application, Category, TaskView and View?

 Response:
 

This is what's left of the legacy MH GUI toolkit. Before the move to Qt, everything was a "View". Now it's just the high-level containers (Application, Category, Task) which exist as somewhere to put event handlers.

Question: What is the purpose of the module guicommon.py? Looking at the module name, it appears to be a sort of library for the GUI functions, but it's a completely different thing. What is it really?

 Response:
 

Object used to be part of gui3d. Its primary function is to be the base class for Human. It's the other main subclass of events3d.EventHandler (the first being gui3d.View).

It's not really accurate to call it a "wrapper" around Object3D (although it can have instances of both module3d.Object3D and object3d.Object3D as members).

One of the main differences between guicommon.Object and module3d.Object3D is that the latter is a single mesh, while the former has up to four meshes (all of type module3d.Object3D): the base (or "seed") mesh, a proxy mesh, a subdivided mesh, and a subdivided proxy mesh.

It [guicommon.py or guicommon.Object? - RWB] also contains an object transformation (location, rotation, scale) [method? - RWB] and, optionally, a display mesh (object3d.Object3D). As a subclass of EventHandler, it [guicommon.Object? - RWB] provides mouse handlers which propagate the event to the object's view (these are overridden for the Human within MHApplication.loadMainGui).

Question: G.app is the singular mhmain.MHApplication instance; the callEvent method (inherited from events3d.EventHandler) will end up calling G.app.onMouseDownCallback. How does callEvent end up calling G.app.onMouseDownCallback?

 Response:
 

"direction" will be either onMouseDownCallback or onMouseUpCallback. Note the "Callback" suffix.

So the typical event flow for clicking on the human model is:

    G.app.mainwin.canvas.mouseUpDownEvent("onMouseDownCallback")  [Canvas.mouseUpDownEvent]
    G.app.callEvent("onMouseDownCallback")                        [EventHandler.callEvent]
    G.app.onMouseDownCallback()                                   [gui3d.Application.onMouseDownCallback]
    G.app.selectedHuman.callEvent('onMouseDown', event)           [EventHandler.callEvent]
    G.app.selectedHuman.onMouseDown()                             [local function in MHApplication.loadMainGui]
    G.app.currentTask.callEvent("onMouseDown", event)             [EventHandler.callEvent]
    G.app.currentTask.onMouseDown()                               [View.onMouseDown]
    G.app.currentCategory.callEvent("onMouseDown", event)         [EventHandler.callEvent]
    G.app.currentCategory.onMouseDown()                           [View.onMouseDown]
    G.app.callEvent("onMouseDown", event)                         [EventHandler.callEvent]
    G.app.onMouseDown()                                           [MHApplication.onMouseDown]

Notes:

    G.app.currentTask.parent == G.app.currentCategory
    G.app.currentCategory.parent == G.app

Question: Does the eventType string (also called 'direction') originate from Qt?

 Response:
 

No. "direction" is just so that code which is common to both mouse press and mouse release events can go into a single method (mouseUpDownEvent) rather than duplicating it in mousePressEvent() and mouseReleaseEvent(). Those functions are identical except for the name of the event passed to callEvent.

Bear in mind that self.callEvent("onMouseDown", event) is almost the same as self.onMouseDown(event), except that callEvent forces a redraw after the event has been dealt with, and has some support for logging and profiling. Once you strip that away, the guts of callEvent() is just:

   method = getattr(self, eventType)
 
   method(event)
 
==========================================================
 Question: Can the modularization of the code be improved? 
 Response:
 

Modularisation is already quite good. Too much of it can make the code harder to understand.

Currently, we try to avoid importing OpenGL or Qt modules (directly or indirectly) into modules which don't inherently depend upon them (e.g. guicommon.Object imports object3d locally in the attachMesh and detachMesh methods, so it only gets imported if you actually need to render the object).

TODO ACTION. One minor point which I noticed: lib/camera.py imports glmodule solely for the queryDepth() call in convertToWorld2D (the Z coordinate is obtained by retrieving a value from the depth buffer using the X,Y coordinates). This should probably be a local import in that method, so that camera.py could be used in e.g. import/export utilities.
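For illustration, the suggested change would look roughly like this (a sketch only; the method signature and the follow-up call are assumptions, not the actual lib/camera.py code):

    class Camera(object):
        def convertToWorld2D(self, x, y):
            # Local import, as suggested above: the GL module is only pulled
            # in when a depth query is actually needed, so camera.py stays
            # importable without an OpenGL context.
            import glmodule
            z = glmodule.queryDepth(x, y)
            return self.convertToWorld3D(x, y, z)  # assumed helper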

==============================================================
 Question: [Referring to DocHell's red and green diagram: !LINK!https://www.dropbox.com/s/v3waacufp9pyx1u/mouse_event.png?dl=0 -- https://www.dropbox.com/s/v3waacufp9pyx1u/mouse_event.png?dl=0!/LINK!]
 Response:
 

I have no idea; I can't follow that [diagram]. It would probably help if you simply ignore CallEvent(), and treat object.callEvent('method',event) as just object.method(event).

The two factors which make following the flow slightly complicated are:

  • Methods aren't always defined within the class used to create an object, but may be inherited from one of its base classes.
  • Event handlers may be dynamically added to an object with the mhEvent method (which is usually invoked using Python's decorator syntax). This is done for the Human object in MHApplication.loadMainGui():
 

    @self.selectedHuman.mhEvent
    def onMouseDown(event):
        ...

If you aren't familiar with the decorator syntax, the above is equivalent to:

    def onMouseDown(event):
        ...

    self.selectedHuman.mhEvent(onMouseDown)

which in turn boils down to:

    def onMouseDown(event):
        ...

    self.selectedHuman.onMouseDown = onMouseDown

=====================================================================
 Question: Could you provide an explanation of what a weak reference is?  How might this impact MakeHuman?
 Response:
 

[Jonas Hauquier] A weak reference is one that is not accounted for by the garbage collection. Garbage collection removes objects from memory that have no pointers to them anymore. Weakrefs are not counted among those, so if an object has only weakrefs pointing to it, and no real pointers, it will be removed anyway. This is to make sure that objects A and B that always point to each other are not kept in memory just because they keep referencing each other (I believe the garbage collector does not do expensive graph traversals to check what is still referenced by the active context).

[Glynn Clements] Sort of. Python uses both reference-counting (which can't handle circular references) and reachability-based garbage collection (which can). If an object's reference count reaches zero, it will be finalised immediately. But an object with a non-zero reference count can still be finalised if it can be determined that it isn't reachable during a garbage-collection sweep.

However: the garbage collector won't attempt to finalise reference cycles if any of the objects in the cycle have a __del__ method, as it can't determine a safe order in which to execute them. Instead, it adds them to a list, accessible as gc.garbage. Application code can inspect this list, manually finalise the objects (which should break the cycles), then remove them from the list (otherwise their presence in gc.garbage will itself constitute a reference).

It's possible to have the garbage collector report such cases using gc.set_debug(gc.DEBUG_UNCOLLECTABLE).
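A small Python 2 illustration of this behaviour (the class name is invented for the example):

    import gc

    class Leaky(object):
        def __del__(self):
            # The mere presence of __del__ stops the Python 2 collector
            # from finalising cycles containing this object.
            pass

    gc.set_debug(gc.DEBUG_UNCOLLECTABLE)
    a, b = Leaky(), Leaky()
    a.other, b.other = b, a   # create a reference cycle
    del a, b
    gc.collect()              # the cycle is reported and parked in gc.garbage
    print(gc.garbage)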

Only a few MH classes contain __del__ methods. These will only present a problem if the object itself is part of a cycle (i.e. contains references to "complex" objects which themselves contain references which could lead back to the original object).

"Leaf-node" classes such as Texture and Shader shouldn't present a problem unless their creators attach additonal references to them. This may be an issue for e.g. qtgui.Slider and qtgui.RadioButton, as these have __del__ methods and will naturally create circular references. Use of weakref may be advised here. Similarly for queue.Thread. SceneItem in plugins/7_scene_editor.py may also be a problem.

=================================================================
 Question: Could you clarify whether the following accurately describes the use of decorators?
 

A basic decorator is a function that:

1) Gets a function "A" as argument
2) Modifies the behavior of "A"
3) Returns a "decorated" version of "A"

Response: Correct.

Question: Topic: decorators and side-effects. When is a decorator not about generating a function? If the above accurately describes a decorator, consider the code:

    @self.selectedHuman.mhEvent
    def onMouseDown(event):
        ...

This should translate to:

    onMouseDown = self.selectedHuman.mhEvent(onMouseDown)

One would expect that, used as a decorator, mhEvent returns a function; but instead it returns None:

    def mhEvent(self, eventMethod):
        self.attachEvent(eventMethod.__name__, eventMethod)

How can it be used as a decorator?

 Response:
 

Because the returned "function" isn't actually used anywhere. The definition is local to loadMainGui(), which doesn't reference the functions, so the fact that e.g. onMouseDown is None within loadMainGui() doesn't matter. The mhEvent() method is used for its side effects, not its return value: it attaches an event handler to the selectedHuman. It is equivalent to:

    self.selectedHuman.attachEvent(onMouseDown.__name__, onMouseDown)

==================================================================

Packager's notes

Notes for packagers of MakeHuman stables and nightlies.

Packaging RPMs for Suse/Fedora using the Open Build Service (OBS)

The Open Build Service (OBS) is a service formerly known as the openSUSE Build Service. OBS allows packagers to build packages for several targets, where a target is a particular OS version, e.g. Suse 13.1, Suse 12.3, Fedora 20, Ubuntu 14.04 and so on. OBS, being a build system by itself, can be hosted anywhere, even on local infrastructure; for MakeHuman packages we use the OBS instance hosted by Novell at build.opensuse.org. This document will briefly outline the typical packaging workflow for Suse and Fedora. Both distributions use the RPM format and subsequently adhere to RPM's packaging rules.

  !IMAGE!Pictures/obs-flow-chart-scaled.png!/IMAGE! 

1) Prepare source tarball:

OBS does not allow pulling from the internet once the RPM is being processed into a package, so when creating the source tarball you should assemble it either on your own computer or on a server online (even a VPS will do). There are several important files and scripts which are of interest to us. One is build_prepare.py: this assembles the main folder and all the common data which is needed by all OSes, be it Mac, Linux distros or Windows; running this script is essential. Then there is build_rpm.py: we borrow some of the initial code in this and use it in our bash script, but we do not build the whole RPM, because OBS does that. A third important file is build.conf (this stores the configuration used by build_prepare.py to make the source tarball). Using the above scripts/files I prepared a bash script, which is shown below. The script assumes the following:

  • You have cloned MakeHuman's Mercurial (Hg) repo into your home directory and it is located at ~/makehuman (cloning into ~/ automatically creates the makehuman folder)
  • You have python, numpy and PyQt4 installed (package names may vary from distribution to distribution)
  • You are running on a Linux distribution.
  • You want to prepare a source tarball from Hg tags.
  • $VERS is the latest user-specified stable tagged version
  • makehuman-$VERS is your destination folder where all files are copied
  • The final tarball uses bzip2 compression. How the tarball is compressed is significant, as RPM accepts only certain compression algorithms. The resultant file is makehuman-$VERS.tar.bz2 (e.g. makehuman-1.0.2.tar.bz2)
  • You use an FTP server whose file path is located at /var/ftp; you can skip this step if the script is run locally by commenting the relevant lines out.

Explanation of the above script: In the first section we use Hg to extract the available tags and ask the user which stable tag he/she wants to build against. In the second section we automatically create the build.conf file, which is used to specify whether we want a release or development folder; here we set isRelease=True. In the third section we use build.conf along with build_prepare.py; this creates a basic non-OS-specific folder in home (~/). In the fourth section we copy extra files such as the .desktop file, the MakeHuman icon file and the shell wrapper for MakeHuman, and remove the .bat file which is not needed on Linux. Currently there is a bug where command line arguments are not passed by the shell wrapper to the python executable; we fix this by editing the file automatically in our bash script before copying it into the target folder to be tarballed. This is moved into the folder created earlier by build_prepare.py. With this we now have a folder with all the basic requirements needed for packaging. Finally the folder is tarballed and it is ready for upload to OBS. If you are not using an FTP server, comment out the "mv" command which moves the tarball to /var/ftp in the shell script above. In some distributions the FTP directory is /srv/ftp.
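A condensed, illustrative bash sketch of the four sections described above (all paths, file names and the build_prepare.py invocation are assumptions, not the project's actual script):

    #!/bin/bash
    # Illustrative sketch only -- adapt to the real repo layout.
    set -e
    cd ~/makehuman

    # Section 1: pick a stable Hg tag to build against.
    hg tags
    read -p "Stable tag to build (e.g. 1.0.2): " VERS
    hg update -r "$VERS"

    # Section 2: request a release build (key name assumed).
    echo "isRelease = True" >> build.conf

    # Section 3: assemble the non-OS-specific folder in ~/.
    python build_prepare.py

    # Section 4: add Linux-specific files, drop the Windows .bat,
    # then tarball with bzip2 (RPM accepts only certain compressions).
    cp makehuman.desktop makehuman.svg makehuman.sh ~/makehuman-$VERS/
    rm -f ~/makehuman-$VERS/makehuman.bat
    tar -cjf ~/makehuman-$VERS.tar.bz2 -C ~ makehuman-$VERS

    # Stage for FTP download; comment out if running locally.
    mv ~/makehuman-$VERS.tar.bz2 /var/ftp/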

2) Upload the tarball to OBS

The next step is to upload the tarball to OBS. Firstly you need to create a username and password; then log in to build.opensuse.org and create a project (on account creation you get your own home directory). Two interfaces to OBS exist. One is the command-line interface, available in most distributions by installing the package "osc". The other is the web user interface. Both are equally useful, though the command-line interface is better for certain tasks. OBS also has a feature called "services". Several services are available, such as services to pull from various sources (e.g. FTP servers online) and services to tarball version-controlled sources directly (tar_scm). You can make use of these services (they are defined using XML notation in "_service" files); read !LINK!http://en.opensuse.org/openSUSE:Build_Service_Concept_SourceService -- Source Services!/LINK!. Alternatively, you can manually upload your tarball through the web UI or use the "osc" command line utility; to learn how to use "osc", run "man osc" or view the manpage online. I would not recommend using _service files to pull the tarball, as they can stop working; it is better to manually push the tarball to OBS, either by pressing the upload button on the web UI or by placing the tarball in the project working copy created by the "osc checkout" command, then typing "osc ar" (add/remove) and finally "osc commit -m "put commit description here"" to push and commit the changes you just made. You can also automate the above process by adding lines to the shell script, but it is better to run osc separately and manually, to allow for more control.
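For illustration, the manual upload workflow with osc looks roughly like this (the project name and tarball version are placeholders):

    osc checkout home:yourname makehuman      # create a local working copy
    cp makehuman-1.0.2.tar.bz2 home:yourname/makehuman/
    cd home:yourname/makehuman
    osc ar                                    # record added/removed files
    osc commit -m "update source tarball to 1.0.2"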

3) Prepare the spec file:

A .spec file is a special file which is used to build RPMs. It "contains information required by RPM to build the package, as well as instructions telling RPM how to build it. The spec file also dictates exactly what files are a part of the package, and where they should be installed." (taken from "Maximum RPM"). The RPM specification is very well defined, and it is fundamental that any packager develops a strong base in the fundamentals. A very important book detailing the various capabilities of RPM is !LINK!http://www.rpm.org/max-rpm/ -- Maximum RPM!/LINK!. This is a must-read for any packager and will help in understanding the spec file shown below; if you have not read it, please bookmark the page and give the book a read. In OBS we have two main OSes which use RPMs as build targets: Fedora and Suse. RPM provides macros to make packaging more standardised and easy; most macros are common, but some are distribution-specific. Below is the spec file used by the MakeHuman project for version 1.0.2. It follows the general structure and format of spec files as per the RPM specification, and has been formatted for OBS by running "osc service localrun format_spec_file", then reading the contents to ensure that nothing went wrong and committing the changes back to OBS. In the spec file we have split MakeHuman into two subpackages and made them depend upon each other to ensure consistency. As you can see there are two sources: one is the tarball we created with our bash script earlier, and the other is a file called "makehuman.1", a manpage; in the spec file, instructions are issued to gzip it and put it into the manual directory on install. The manpage (Source101 in the .spec file) is shown below. How man pages are created is out of the scope of this document; man pages use groff, and there are ample resources online which can help with this. The .spec file has been commented to provide you with a better understanding of what is done at the various stages. Once the .spec file has been updated on OBS, it automatically triggers a rebuild, during which time you can monitor the status of the build process and correct errors in your spec file or sources as they arise.
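A heavily condensed, illustrative skeleton of such a spec file (field values are assumptions; the project's real file additionally defines the two subpackages and the comments described above):

    Name:           makehuman
    Version:        1.0.2
    Release:        0
    Summary:        Open source tool for making 3D characters
    License:        AGPL-3.0
    Url:            http://www.makehuman.org
    Source0:        makehuman-%{version}.tar.bz2
    Source101:      makehuman.1
    BuildArch:      noarch
    Requires:       python-qt4, python-numpy

    %description
    MakeHuman is an application for modelling 3D human characters.

    %prep
    %setup -q

    %install
    mkdir -p %{buildroot}%{_datadir}/makehuman
    cp -a * %{buildroot}%{_datadir}/makehuman/
    # install and gzip the manpage, as described above
    install -D -m 0644 %{SOURCE101} %{buildroot}%{_mandir}/man1/makehuman.1
    gzip %{buildroot}%{_mandir}/man1/makehuman.1

    %files
    %{_datadir}/makehuman
    %{_mandir}/man1/makehuman.1.gz

    %changelog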

4) Pushing changes to the official repositories

Once you are satisfied that your package is stable, you can make a submit request from the web user interface, or run the command below, which submits the "makehuman" package to the openSUSE "graphics" project. After this the package is reviewed by the maintainer of the graphics project and approved, or rejected with comments given. You will receive an email after the project has been reviewed.
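For illustration, the command-line form of the submit request (the source project name is a placeholder):

    osc submitrequest home:yourname makehuman graphics -m "makehuman 1.0.2"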

Notes:

  • The .spec file and the manual page will change over time. The above is there for illustration purposes only.
  • You can find copies of the most recent stable .spec file and makehuman.1 manpage !LINK!https://build.opensuse.org/package/show/graphics/makehuman -- here!/LINK!. This is the stable version after a "submit request" to the official Suse repositories.