New Feature Requests
October 5, 2012, 4:51 pm
Admin
Forum Posts: 983
Member Since: December 1, 2011

Hi,

I've received a number of emails and posts with various feature requests. Here are a few of them:

- being able to gesture while walking
- multiple SmartBody scenes running in the same process
- reaching and grabbing with two hands
- ... ?

Please feel free to post additional feature requests on this thread.

Ari

October 5, 2012, 6:17 pm
Member
Forum Posts: 80
Member Since: June 13, 2012

You mentioned before the possibility of a DLL version of smartbody-lib, so that the interface could be used in commercial software while still complying with the LGPL. Though I'm not sure that's what you'd call a feature.

October 18, 2012, 3:28 pm
al2950
Guest

Hi

I saw your post on the Ogre forums and thought I might add to this list:
- walking whilst holding something, e.g. a gun (could this be done with gestures, or in some other way I have not realised?)
- physics-based locomotion. There are a few papers on this topic, and I think it would really add to your project. For examples of what I mean, see:
https://sites.google.com/site/equalstwelve/ & http://www.dgp.toronto.edu/~md.....lasa/slip/

October 18, 2012, 5:49 pm
Admin
Forum Posts: 983
Member Since: December 1, 2011

The general capability of walking while, say, holding a gun exists in SmartBody. SmartBody uses a hierarchy of controllers that goes from the general (idle posture) to the specific (eye movement), with many layers in between. One of those layers is the gesture layer, which allows you to play an arbitrary animation on top of the existing pose, and it could be expanded to include upper-body control during locomotion. In general, I'd like to make that aspect more automatic: if you are walking or running, your other behaviors will still work, but they will be run after the locomotion is run (so you could, for example, run and point at the same time).
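To make that concrete, here is a minimal sketch of layering a behavior on top of locomotion through the Python/BML scripting interface. The character name, the animation name, and the BML attribute values are placeholders, and the exact attributes may differ between SmartBody builds, so treat this as an illustration of the idea rather than exact usage:

# Sketch only: 'ChrBrad' and the animation name are placeholders,
# and the BML attribute values are illustrative rather than exact.
# 'bml' is the BML-processor object exposed to SmartBody Python scripts.
bml.execBML('ChrBrad', '<locomotion target="100 100"/>')
# A behavior requested during locomotion is scheduled on a higher layer,
# so the upper body plays the animation while the legs keep walking.
bml.execBML('ChrBrad', '<animation name="ChrBrad@PointRt01"/>')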

Regarding physics, I spent a lot of time during my PhD working on physics-based movement and control. One of the projects that came out of that was called DANCE, which directly addresses the issues of physics:
http://www.arishapiro.com/dance/

While working on SmartBody, I decided to focus mainly on kinematic motion (motion blending, procedural techniques) rather than dynamic motion (physics). Having said that, we've included a physics controller that can: 1) create ragdolls, 2) track motion, and 3) animate some joints while simulating others. Within this framework, it is possible to add physics-based locomotion (say, a SIMBICON-like capability) if someone were to take the time to do so.
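As a rough illustration of how that controller is driven from a script, something like the following is the expected shape. The attribute names here are assumptions rather than confirmed SmartBody API, so this is purely a sketch of the kinematic-tracking versus ragdoll switch described above:

# Hypothetical sketch: the attribute names below are assumptions,
# not confirmed SmartBody API. 'scene' is the SmartBody scene object
# available to Python scripts.
character = scene.getCharacter('ChrBrad')
# Create the physics representation for the character (assumed attribute name).
character.setBoolAttribute('createPhysics', True)
# Switch from kinematic motion tracking to full ragdoll simulation
# (assumed attribute name).
character.setBoolAttribute('ragdoll', True)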

Regards,

Ari Shapiro