For questions about SmartBody and usage, please contact:
Ari Shapiro, Ph.D.
SmartBody is a character animation platform originally developed at the USC Institute for Creative Technologies.
SmartBody provides the following capabilities in real time:
- Locomotion (walk, jog, run, turn, strafe, jump, etc.)
- Steering – avoiding obstacles and moving objects
- Object manipulation – reaching for, grasping, touching, and picking up objects
- Lip Syncing – characters can speak with simultaneous lip-sync using text-to-speech or prerecorded audio
- Gazing – robust gazing behavior that incorporates various parts of the body
- Nonverbal behavior – gesturing, head nodding and shaking, eye saccades
- Character physics – ragdolls, pose-based tracking, motion perturbations
- Online retargeting – motion is transferred in real time among characters
- Autorigging and autoskinning – 3D humanoid models can be rigged and skinned automatically
- Behavior transfer – complex behaviors can be automatically transferred to new characters at runtime. As the database of behaviors grows, the capabilities of the characters grow with it.
SmartBody is written in C++ and can be incorporated into most game and simulation engines; interfaces are currently available for a number of popular engines.
SmartBody is a Behavioral Markup Language (BML) realization engine that transforms BML behavior descriptions into realtime animations.
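As a BML realization engine, SmartBody accepts XML behavior descriptions and schedules them as synchronized animation. A minimal sketch of such a request is shown below; the core elements (`speech`, `gaze`, `head`) and the `start="s1:end"` synchronization-point syntax follow the BML standard, while the target name `Camera` and the `sbm:` namespace attribute are illustrative assumptions rather than required values:

```xml
<!-- Hypothetical BML request: speak a line while gazing at a target,
     then nod once the speech behavior (id "s1") has finished. -->
<bml>
  <speech id="s1" type="text/plain">Hello, and welcome to the demo.</speech>
  <gaze target="Camera" sbm:joint-range="EYES NECK"/>
  <head type="NOD" start="s1:end"/>
</bml>
```

In BML, each behavior exposes synchronization points (such as `start` and `end`), so the `head` nod above is constrained to begin only after the speech completes, rather than being hand-timed.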
SmartBody runs on Windows, Linux, and OS X, as well as on iPhone and Android devices. A version of SmartBody for Flash is also available. All source code is available for download and is licensed under the LGPL v3 license.
| Role | Members |
|---|---|
| SmartBody Team Lead | Ari Shapiro, Ph.D. |
| SmartBody Team | Andrew W. Feng, Ph.D. |
| Inspiration | Stacy Marsella, Ph.D. |
| Past SmartBody Team Members | Dan Casas, Ph.D.; Marcelo Kallmann, Ph.D. |
| Major SmartBody Contributors | Ed Fast; Chin Chye (Wayne) Koo; Muhammad Hadziq Bin Kuzairi |