Mixing mocap systems and IK
April 22, 2015, 12:08 pm
Member (Paris, France)

Hi,

I would like to animate avatars through mocap data and inverse kinematics. The mocap data would be live and come from something like a Razer Hydra, Kinect 2 or infrared tracking systems such as ART or Optitrack solutions.

Some scenarios:

1/ The real user is seated and holds Razer Hydra controllers, so both hands are tracked in position and orientation in 3D, and he wears an HMD, so we also know the position/orientation of his head.
In this case, the avatar would also be seated, with its hands animated by the Razer Hydra data and its head by the HMD data.

In order to avoid a stretched body, twisted arms, etc., I think IK could compute the rest of the body skeleton so as to get a plausible posture (it is not crucial that the avatar matches the real user's posture exactly; what matters is realism). In addition, perhaps SmartBody can constrain movements to avoid the weird poses I described?

2/ The real user is still seated, but his avatar will walk. I would make it walk using a gamepad or a keyboard.

I think I can achieve this by using locomotion with BML (I sketch what I have in mind right after the scenarios)? Of course, I would also have to provide positions for the hands and the head in the correct coordinate frame.

3/ The real user applies all his movements to his avatar: his body is fully tracked.
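
For scenario 2, something like the following is what I have in mind (just a sketch; I haven't verified the exact locomotion BML syntax, and the character name and target coordinates are only placeholders):

# make the seated user's avatar walk toward a point picked from gamepad/keyboard input
# ("mycharacter" and the coordinates are placeholders; I assume locomotion BML
# of the form <locomotion target="x z"/>)
bml.execBML('mycharacter', '<locomotion target="100 0"/>')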

So how can I realize these scenarios?

For the hands, I first tried to play with BML. I made the avatar try to "touch" a pawn located at the real user's hand. The result was not bad, but the elbow was often in weird positions and bent too much. I should mention that I did not use any animations.

Now I would like to go further and be able to set the positions/orientations of some body parts precisely. How can I overwrite the position/orientation of the hands and head?

I dug into the forum and the SmartBody source code related to Kinect, and found that a solution might be to use the "datareceiver_ct" field of SBCharacter and call "setLocalPosition", etc. on it.

But I guess that if I overwrite the position/orientation of the hand, the rest of the arm and even the chest will stay in the wrong positions/orientations. How can I enable IK to automatically produce correct positions/orientations? On top of that, I think I will also have to handle retargeting so that the avatar does not "fly" above the ground?
Then, how are locomotion, breathing, saccades, etc. computed (I see these controllers in sbm_character.hpp)? Are they computed from the data received through "datareceiver_ct", or before it is applied?

What will happen to gaze if I apply the same treatment to the head?

Will locomotion work if I directly overwrite the hands' or head's position/orientation?

April 22, 2015, 3:08 pm
Member (Paris, France)

I tried to call methods on "datareceiver_ct" directly but got a link error since its methods are not exported. Instead, I simply used the setOffset method of SBJoint and was able to move the joint. My problem now is that moving a single joint gives very strange results, because nothing in my code keeps the skeleton coherent (the arms get stretched). How could I use IK or another mechanism to improve the result?

And as far as I understand, I only modified an offset (and for orientations I could use pre- or post-rotations), which means I did not find a way to really modify the joint position; I only applied an additional modification... How do I modify the position and orientation directly?

April 22, 2015, 5:23 pm
Member (Paris, France)

I think that there's a bug in the following code:

void SbmPawn::setGlobalTransform( SBTransform& newGlobalTransform )
{
    SrMat gmat = globalTransform.gmat();
    setWorldOffset(gmat);
}

As you can see, the parameter newGlobalTransform isn't used!

Furthermore, I'm a little bit lost. Here the offset is also the position of the pawn. With joints, the offset seems to be a translation applied in addition to the joint position. Am I right?

April 24, 2015, 10:40 pm
Admin

Yes, the offset is a matrix that is applied in addition to the joint position.

 

The joint values can be controlled directly; you might want to create a controller that sets the quaternion (rotation) and translation values directly, like this:

class MyController(PythonController):

    def init(self, pawn):
        # set up rig nodes
        print "Dynamic controller..."

    def evaluate(self):
        # set a scalar channel (val here is just an example value)
        val = 0.5
        self.setChannelValue("eyebrowsup", val)

        # set a global rotation on a joint
        rot = SrQuat()
        self.setChannelQuatGlobal('skullbase', rot)

# instantiate this controller once for each character
myc = MyController()
# get the character
character = scene.getCharacter("char0")
# run this controller in position 20 (I can explain more about this)
character.addController(20, myc)

Or you can set values into the data receiver using the old-style commands; here's the documentation:

 

// Usage:
// receiver echo <content>
// receiver enable
// receiver skeleton <skeletonName> <emitterType> position <joint-index/joint-name> <x> <y> <z>
// receiver skeleton <skeletonName> <emitterType> positions <x1> <y1> <z1> <x2> <y2> <z2> ... 24 Joints in total if emitterType == "kinect"
// receiver skeleton <skeletonName> <emitterType> rotation <joint-index/joint-name> <q.w> <q.x> <q.y> <q.z>
// receiver skeleton <skeletonName> <emitterType> rotations <q1.w> <q1.x> <q1.y> <q1.z> <q2.w> <q2.x> <q2.y> <q2.z>... 20 Joints in total if emitterType == "kinect"
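
For example, from the Python layer these can be issued through the command interface (a rough sketch, assuming the old-style commands are available via scene.command(); the skeleton name, emitter name and joint names below are only placeholders):

# enable the data receiver
scene.command("receiver enable")

# set the global rotation of the skullbase joint of skeleton "mycharacter.sk"
# ("mytracker" is a placeholder emitter name; anything other than "kinect" skips the Kinect retargeting path)
scene.command("receiver skeleton mycharacter.sk mytracker rotation skullbase 1 0 0 0")

# set the local position of a joint
scene.command("receiver skeleton mycharacter.sk mytracker position JtWristLf 10 150 20")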

The IK that is available is either a full-body IK (based on reaching, which you have already experimented with), using a controller, or using the receiver command.

Ari

May 5, 2015, 5:26 pm
Member (Paris, France)

Hi Ari, thank you for your answer!

Since I'm programming in C++, I tried things a little bit differently and got stuck with addController().

I did not find any function named anything like "addController" in SBCharacter, only "add_controller" on the "ct_tree_p" field. My problem is that this function is not exported from the SmartBody DLL... Could you push a fix to SVN for that? I guess that adding an "AddController" function to SBPawn that wraps "ct_tree_p->add_controller()" would be the solution, but does that fit your software design goals?

 

The second solution works, but I am not very happy with its design: many strings to build, string parsing involved, etc.

May 5, 2015, 5:55 pm
Member (Paris, France)

Damn! Actually the problem is deeper. I just cannot inherit from SBController, because several methods of MeController are not exported, according to VC 2012:

- remove_all_children()

- init(SbmPawn*)

- start(double)

- stop(double)

- print_state(int)

- print_children(int)

- notify(SBSubject*)

 

Any idea? I really do not want to use Python...

May 11, 2015, 5:28 pm
Member (Paris, France)

I tried to use the "old way" since I am at a dead end with inheritance and C++, and failed...

 

First, I found the following code in "mcu_joint_datareceiver_func" of mcontrol_callbacks.cpp pretty odd:

            for (std::vector<SmartBody::SBCharacter*>::iterator iter = controlledCharacters.begin();
                 iter != controlledCharacters.end();
                 iter++)
            {
                SmartBody::SBCharacter* character = (*iter);
                SrVec vec(x, y, z);
                SrVec outVec;
                scene->getKinectProcessor()->processRetargetPosition(character->getSkeleton()->getName(),vec, outVec);
                
                if (emitterName == "kinect")
                    character->datareceiver_ct->setGlobalPosition(jName, outVec);
                else
                    character->datareceiver_ct->setLocalPosition(jName, outVec);
            }

You check whether Kinect is used only after calling "getKinectProcessor()->processRetargetPosition()". In addition, you pass "outVec" to setLocalPosition(), but in that branch "outVec" is just a null vector...

 

I modified the code this way:

            for (std::vector<SmartBody::SBCharacter*>::iterator iter = controlledCharacters.begin();
                 iter != controlledCharacters.end();
                 iter++)
            {
                SmartBody::SBCharacter* character = (*iter);
                SrVec vec(x, y, z);
                
                if (emitterName == "kinect")
                {
                    SrVec outVec;
                    scene->getKinectProcessor()->processRetargetPosition(character->getSkeleton()->getName(),vec, outVec);

                    character->datareceiver_ct->setGlobalPosition(jName, outVec);
                }
                else
                {
                    character->datareceiver_ct->setLocalPosition(jName, vec);
                }
            }

 

But since I was still unable to change the position of a joint (only its rotation), I continued digging into the code and found that "MeCtDataReceiver::controller_evaluate" checks whether the XPos channel is in use. How do I turn this channel on? I need an exported method because I work in C++.

May 12, 2015, 4:45 pm
Member (Paris, France)

I modified the SK file I was using by adding the following line to "JtWristLf" (the joint I am trying to move):

    channel XPos 0 free

 

I've been able to freely move the joint!

 

My problem now is that the result looks very strange, because the movement does not affect the parent joints. In my case, the left forearm and upper arm do not move when I move the left wrist forward.

I honestly don't know how to drive the avatar's two hands from the real user's gestures while having the rest of the body posture computed automatically. To me it sounds like computing IK along the two arms, but I'm not a specialist in avatar animation, and SmartBody provides so many ways to animate that I don't know where to start.

May 12, 2015, 11:01 pm
Admin

Right; the commands modify the global state of the joint; the assumption is that all the joints are going to be overridden, not just one.

 

If you want to use something like IK, you could try running a BML command that will place the right or left hand at a particular xyz position. Try something like:

 

# create a pawn
mypawn = scene.createPawn("mytarget")

# make the character reach for (grab) the pawn with the right hand
bml.execBML('mycharacter', '<sbm:reach target="mytarget" sbm:handle="right"/>')

...

# every frame, get the xyz position of the hand
pos = SrVec()
pos.setData(0, 2)    # the x value (e.g. from the Kinect)
pos.setData(1, 1.7)  # the y value (e.g. from the Kinect)
pos.setData(2, 2.8)  # the z value (e.g. from the Kinect)

# every frame, change the position of the target object
mypawn.setPosition(pos)

 

Ari

May 12, 2015, 11:07 pm
Admin

You have to make sure that the reaching data set is properly attached to the character.

# setup reach
scene.run('BehaviorSetReaching.py')
setupBehaviorSet()
retargetBehaviorSet('mycharacter')

 

If you don't use a reaching behavior set, the IK will use a simple scheme to position the arm chain, which won't look very good.

Ari

May 13, 2015, 3:07 pm
Member (Paris, France)

I already tried the BML solution with a pawn that the character's hand tries to touch. The result was pretty good, but I couldn't use both hands at the same time.

I continued to dig into the SmartBody code and found that the "touch" action of the "reach" BML command executes code from the MeCtExampleBodyReach class. It seems that this class is limited to manipulating only one pawn at a time (see the "init" method, which takes a single pawn). So is it possible to make the avatar move both hands at the same time?

An example of what I would like to achieve:

Both hands are manipulated and the avatar can walk. You can try a demo if you have an Oculus Rift and a Razer Hydra: http://serrarens.nl/passervr/f.....nced-demo/

Do you think it is feasible with SmartBody?

May 13, 2015, 5:26 pm
Admin

OK, I've checked in changes to the SVN trunk (revision 6075) that will allow you to subclass SBController.

It is true that the reach controller takes over the entire body, and is only set up to use a single arm at a time.

As another option, you could use the constraint system (the <sbm:constraint> BML command), which should allow you to use both hands independently.
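
For instance, to drive both wrists toward two tracked pawns at once, something along these lines should work (just a sketch; double-check the attribute names and the joint names against the BML documentation and your own skeleton):

# one pawn per hand; update their positions every frame from your tracking data
leftTarget = scene.createPawn("leftTarget")
rightTarget = scene.createPawn("rightTarget")

# constrain each wrist to its pawn (the effector, target and handle names here are only an example)
bml.execBML('mycharacter', '<sbm:constraint effector="l_wrist" target="leftTarget" sbm:handle="lh"/>')
bml.execBML('mycharacter', '<sbm:constraint effector="r_wrist" target="rightTarget" sbm:handle="rh"/>')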

But if you have the tracking data (from a Kinect, for example), then I would try creating a controller directly instead.
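
Building on the controller example from earlier in this thread, the evaluate step would simply copy the latest tracked rotation into the joint each frame (getTrackerRotation() below is a placeholder for however you read your device, and is assumed to return an SrQuat):

class TrackerHeadController(PythonController):

    def evaluate(self):
        # read the latest head orientation from the tracking device
        # (getTrackerRotation is a placeholder, assumed to return an SrQuat)
        rot = getTrackerRotation()
        # override the skullbase joint with the tracked rotation
        self.setChannelQuatGlobal('skullbase', rot)

# attach the controller to the character (position 20, as in the earlier example)
headCt = TrackerHeadController()
character = scene.getCharacter("mycharacter")
character.addController(20, headCt)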

In smartbody/lib/kinecttracker there is the original code we were using to track with the Kinect v1, which may be useful if it can be adapted to the Kinect v2.

Ari
