Eye gaze directions


Re: Eye gaze directions

Postby Sord » Tue Jan 31, 2017 1:52 pm

joepal wrote:Not saying this is valid for your situation, but there are other places where bone rotations can be locked.
lockedbone2.png

Aranuvir wrote:Just a small addition to Joel's post: you could also try switching the transform method from quaternion to something like ZXY Euler.

Thank you very much, Joel and Aranuvir! I didn't know about this panel, and it's much more intuitive to turn the eyes there than in the Face Rig panel. With these two hints, I've now found a way to turn the eyes from the console. So far, my script looks like this:

Code:
import bpy
import math

# Gaze directions in degrees of aversion
gazeDir = [-9,-3,-1,0,1,2,9]

# Set eye-bone rotation mode to XYZ Euler
bpy.context.object.pose.bones["eye.L"].rotation_mode = 'XYZ'
bpy.context.object.pose.bones["eye.R"].rotation_mode = 'XYZ'

for deg in gazeDir:

    # Turn both eyes (converting degrees to radians)
    bpy.context.object.pose.bones["eye.L"].rotation_euler[2] = math.radians(deg)
    bpy.context.object.pose.bones["eye.R"].rotation_euler[2] = math.radians(deg)

    # Render image
    bpy.context.space_data.context = 'RENDER'

    # Save image
    bpy.ops.image.save_as(save_as_render=True, copy=True, filepath="//test_" + str(deg) + ".png", relative_path=True, show_multiview=False, use_multiview=False)


The only thing that doesn't seem to work is saving the image (not even as a one-liner in the console); it returns:
Code:
Traceback (most recent call last):
  File "<blender_console>", line 1, in <module>
  File "C:\Program Files\Blender Foundation\Blender\2.78\scripts\modules\bpy\ops.py", line 189, in __call__
    ret = op_call(self.idname_py(), None, kw)
RuntimeError: Operator bpy.ops.image.save_as.poll() failed, context is incorrect
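
Reading the poll() message, my guess is that the operator expects an active Image Editor. Maybe the context check could be avoided entirely by saving the render result through the image data-block instead of the operator? Just an untested idea (the file name is only the one from my loop above):

Code:
import bpy

# Untested idea: skip the operator (and its context/poll check) and write the
# current render result directly via the 'Render Result' image data-block
img = bpy.data.images['Render Result']
img.save_render(filepath=bpy.path.abspath("//test_0.png"))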
Sord
 
Posts: 7
Joined: Mon Jan 30, 2017 9:01 am

Re: Eye gaze directions

Postby Aranuvir » Tue Jan 31, 2017 2:22 pm

Perhaps try an absolute path? I just saved a render and saw that the save_as command ends up using an absolute path, even though relative_path=True is set. I'm not sure, though...
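
Just a thought: you could print what the blend-relative "//" path actually expands to, to see where Blender is trying to write, for example:

Code:
import bpy

# Expand the blend-file-relative "//" prefix into a full absolute path
print(bpy.path.abspath("//test_0.png"))

and then pass that expanded path to the save call.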

But nice scripting anyway. Maybe you could post the abstract when you're done with your experiment :).
Aranuvir
 
Posts: 1314
Joined: Sun Oct 12, 2014 2:12 pm

Re: Eye gaze directions

Postby Sord » Tue Jan 31, 2017 3:57 pm

Thanks so much again to everybody for the help. The loop for gaze directions finally works with the following script! :D

Code:
import bpy
import math

# Gaze directions in degrees
gazeDir = [-9,-3,-1,0,+1,+3,+9]

# Set eye-bone rotation mode to XYZ Euler
bpy.context.object.pose.bones["eye.L"].rotation_mode = 'XYZ'
bpy.context.object.pose.bones["eye.R"].rotation_mode = 'XYZ'

# Output folder for the rendered images
myFilePath = "C:\\Users\\...\\render\\"

for deg in gazeDir:

    # Turn both eyes (converting degrees to radians)
    bpy.context.object.pose.bones["eye.L"].rotation_euler[2] = math.radians(deg)
    bpy.context.object.pose.bones["eye.R"].rotation_euler[2] = math.radians(deg)

    # Render image
    bpy.ops.render.render(use_viewport=True)

    # Save image
    myFileName = "female_model_" + str(deg) + "deg.png"
    bpy.data.images['Render Result'].save_render(filepath=myFilePath + myFileName)


You'll only see the difference in the last two images due to the small size, but it's there. Now I'll try to include the different emotional expressions in the loop.
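
I'm not yet sure how the expressions are exposed after the MHX2 import; if they end up as shape keys on the body mesh, something along these lines might work (the object name and shape-key names below are only placeholders):

Code:
import bpy
import math

# Placeholder names -- the real shape-key and object names depend on the imported model
expressions = ["angry", "fear", "neutral"]
gazeDir = [-9, -3, -1, 0, +1, +3, +9]

arm = bpy.context.object                   # armature; eye-bone rotation mode already set to 'XYZ' as above
body = bpy.data.objects["female_model"]    # placeholder: the mesh object carrying the shape keys
myFilePath = "C:\\Users\\...\\render\\"

for expr in expressions:

    # Reset all expression keys, then enable the current one
    for kb in body.data.shape_keys.key_blocks:
        kb.value = 0.0
    body.data.shape_keys.key_blocks[expr].value = 1.0

    for deg in gazeDir:

        # Turn both eyes (converting degrees to radians)
        arm.pose.bones["eye.L"].rotation_euler[2] = math.radians(deg)
        arm.pose.bones["eye.R"].rotation_euler[2] = math.radians(deg)

        # Render and save
        bpy.ops.render.render(use_viewport=True)
        myFileName = "female_" + expr + "_" + str(deg) + "deg.png"
        bpy.data.images['Render Result'].save_render(filepath=myFilePath + myFileName)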
Attachments
female_angry_overview.png
Result
Sord
 
Posts: 7
Joined: Mon Jan 30, 2017 9:01 am

Re: Eye gaze directions

Postby joepal » Tue Jan 31, 2017 4:22 pm

Care to share what the experiment is about? It looks interesting.
Joel Palmius (LinkedIn)
MakeHuman Infrastructure Manager
http://www.palmius.com/joel
joepal
 
Posts: 4473
Joined: Wed Jun 04, 2008 11:20 am

Re: Eye gaze directions

Postby Sord » Thu Feb 02, 2017 1:00 pm

joepal wrote:Care to share what the experiment is about? It looks interesting.


Hi Joel, the study is going to look at the neural response in certain emotion-related brain areas (measured by functional magnetic resonance imaging) during the perception of different emotional expressions, depending on gaze direction. For example, one would expect that our emotional response (steering our fight-or-flight reactions) to a fearful expression might be stronger if the gaze is averted (signifying an indirect threat next to us), whereas the response to an angry expression might be stronger if the gaze is direct (indicating a direct threat). There is already a fair amount of literature on this, but we might later take a closer look at these phenomena in certain patient populations (e.g., autism or social phobia).
Sord
 
Posts: 7
Joined: Mon Jan 30, 2017 9:01 am

Re: Eye gaze directions

Postby joepal » Thu Feb 02, 2017 4:31 pm

Sord wrote:Hi Joel, the study is going to look at the neural response in certain emotion-related brain areas (measured by functional magnetic resonance imaging) during the perception of different emotional expressions, depending on gaze direction. [...]


Cute. Sounds a lot like what the guys at the psychology department here do.

The project I'm currently helping them with merges biophysiology measurements, VR (Oculus Rift) and MakeHuman to study responses to the emotional expression of an audience in a virtual public-speaking situation (i.e., what happens if the entire audience suddenly goes from a neutral facial expression to an angry one).
Joel Palmius (LinkedIn)
MakeHuman Infrastructure Manager
http://www.palmius.com/joel
joepal
 
Posts: 4473
Joined: Wed Jun 04, 2008 11:20 am
