
Integration Issues | General SmartBody Discussion | Forum

Integration Issues
Integration Issues
October 4, 2014, 10:13 am
Member (since September 11, 2014)

Hello,

 

Considering the integration of SmartBody with a rendering engine, I have three general questions:

  1. When creating a character in SmartBody, is creating the skeleton also necessary?
  2. In the case of gestures/postures, which functions of the re-implemented interface (SBListener) should be used?
  3. When is the onEvent function called (under which actions/circumstances)?

Thanks,

Metalos

October 6, 2014, 3:16 pm
Member

If you have any ideas on the above, please let me know.

Thanks

October 6, 2014, 11:29 pm
Admin (since December 1, 2011)

1. You always need a skeleton for a character. If you don't specify one, you get a default skeleton with a single joint. If you don't have a skeleton, or if the one you do use doesn't contain certain key joints (head and neck joints for nodding, spine joints for gazing, foot joints for IK) then that functionality will not work.

 

2. Can you elaborate? The <gesture> BML command maps to a set of animations, and includes some coarticulation and pausing capability.

 

3. The OnEvent() function is called whenever some part of the system triggers an event. There are certain built-in events from collision, locomotion, reaching and others. If you (for example) mark events on animations or blends (like a 'footstep' event), then those will be triggered through that function.

 

Ari

October 10, 2014, 3:28 pm
Member

For the second bullet:
If we use a BML instruction for a gesture, then some functions of the overridden interface are going to be called. The set of animations, the coarticulation, and the pausing capability are triggered through that interface and should be implemented by the rendering engine, right?
In the end, what I want to know is: if we run a BML instruction, where does SmartBody's involvement stop and where should the rendering engine take over, so that the rendered character "listens" to what the BML instruction said and takes the appropriate posture/gesture?
Is there any example of such a situation in the SmartBody distribution that could help me?

Thanks,
metalos

October 14, 2014, 1:16 am
Admin

The animations and coarticulation are handled by SmartBody, which then alters the state of the character, which can then be displayed by the renderer.

Your renderer should constantly monitor the state of the character (every frame) from SmartBody.

Are you looking for specific events in the behaviors ("end of gesturing", "start of speaking")?

 

Ari

October 15, 2014, 8:04 pm
Member

Yes, for the gesturing procedure, which actions should be taken? For example, for the "start of gesturing" and the "end of gesturing". Can you please elaborate, Ari?

Any proposal on how to make a rendering engine support moving the joints and weights of characters (with the support of SmartBody, as said)?

 

Metalos

October 22, 2014, 6:39 am
Member

One more thing I would like to report: in this post http://smartbody.ict.usc.edu/f.....ame-engine you talk about integration, but the SBListener interface is never mentioned. How is this accomplished?

October 22, 2014, 9:26 pm
Admin

Are you referring to the SBListener interface? It's in core/ogre-viewer/src/SBListener.h

 

But the main interface that handles the callbacks is in smartbody/core/SmartBody/sb/SBSceneListener.h

 

    SBAPI SBSceneListener() {}
    virtual SBAPI ~SBSceneListener() {}
    virtual SBAPI void OnCharacterCreate( const std::string & name, const std::string & objectClass ) {}
    virtual SBAPI void OnCharacterDelete( const std::string & name ) {}
    virtual SBAPI void OnCharacterUpdate( const std::string & name ) {}

    virtual SBAPI void OnPawnCreate( const std::string & name ) {}
    virtual SBAPI void OnPawnDelete( const std::string & name ) {}
    
    virtual SBAPI void OnViseme( const std::string & name, const std::string & visemeName, const float weight, const float blendTime ) {}
    virtual SBAPI void OnChannel( const std::string & name, const std::string & channelName, const float value) {}
    virtual SBAPI void OnLogMessage( const std::string & message) {}

    virtual SBAPI void OnEvent( const std::string & eventName, const std::string & eventParameters ) {}

    virtual SBAPI void OnObjectCreate(SmartBody::SBObject* object) {}
    virtual SBAPI void OnObjectDelete(SmartBody::SBObject* object) {}
    
    virtual SBAPI void OnSimulationStart() {}
    virtual SBAPI void OnSimulationEnd() {}
    virtual SBAPI void OnSimulationUpdate() {}

 

Ari

October 27, 2014, 7:27 am
New Member (since October 26, 2014)

Ari Shapiro said (quoting the full answer above)

this is getting interesting

October 30, 2014, 10:18 am
Member

OK, so more precisely now:
For the actions of the gesture, which actions should I take on the SmartBody side (SBListener)?
Can you please enumerate them? It would be very helpful.

Metalos

November 11, 2014, 12:47 am
Admin

The SBListener doesn't respond to the sync points of the gesture.

The sync points are: start, ready, prestroke_hold, stroke_start, stroke, stroke_end, poststroke_hold, relax, stop.

Ari
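For reference, a hypothetical sketch of how such sync points can be attached to a gesture behavior in BML (the lexeme value and timings are illustrative only, not taken from this thread):

```xml
<bml>
  <!-- sync-point attributes name time offsets within the gesture -->
  <gesture id="g1" lexeme="DEICTIC"
           start="0.0" ready="0.3"
           stroke_start="0.5" stroke="0.7" stroke_end="0.9"
           relax="1.2"/>
</bml>
```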