Frequently Asked Questions
- What is SmartBody?
- What is the goal of the SmartBody project?
- Is SmartBody a game engine?
- Couldn’t I just use a game engine to animate characters without using SmartBody?
- On what platforms does SmartBody run?
- How do I download and install SmartBody?
- What are the licensing terms?
- Where’s the documentation?
- Will SmartBody run on mobile platforms?
- Can I use my own characters?
- How do I interface with the character?
What is SmartBody?

SmartBody is an animation system for real-time characters. It allows characters to play back motion from key-framed or motion-captured sources, as well as perform a number of important tasks such as gazing, talking, facial expression, locomotion, steering, reaching/grabbing/touching, head movements (such as nodding), gesturing, eye saccades, blinking and breathing. In addition, SmartBody characters can be run under physical simulation to simulate ragdoll behavior, as well as to track motion, allowing perturbations in their movements.
What is the goal of the SmartBody project?

To provide an extensive set of highly realistic behaviors and capabilities for a real-time character.
Is SmartBody a game engine?

No, SmartBody is not a game engine in itself, but it is designed to be incorporated into one. There are existing interfaces in SmartBody for Unity, Gamebryo, Ogre, Panda3D and Unreal. In addition, SmartBody exposes a C++ interface, so it can be incorporated into many other game engines as well. SmartBody can also be run without a game engine.
Couldn't I just use a game engine to animate characters without using SmartBody?

Game engines usually offer some character animation capabilities, such as blend trees and transitions, some locomotion, and perhaps ragdolls. However, SmartBody offers a tremendous number of capabilities that would ordinarily need to be built by hand within the game engine. For example, SmartBody has a very sophisticated gazing controller that allows the character to gaze at various objects with any combination of eyes, neck, chest and back. SmartBody also has built-in models for rapid eye movements (saccades) and head movements such as nodding or shaking, and it can automatically synthesize facial animation to accompany speech, including synthesized speech. Also, SmartBody includes a built-in, example-based reaching/grabbing controller that allows a character to pick up, touch or place objects in the environment. While these capabilities and others could be programmed within the framework of a game engine, doing so would require a tremendous amount of work. By contrast, a SmartBody character has all of these capabilities built in and can perform many such tasks automatically.
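To give a flavor of how such a behavior is requested, the sketch below composes a minimal BML gaze fragment as a Python string and checks that it is well-formed XML. The target name "box1" is an illustrative assumption, and delivering the fragment to a running SmartBody character is not shown here.

```python
# Illustrative sketch only: composing a BML <gaze> request as a string.
# "box1" is a hypothetical pawn/object name; how the fragment reaches a
# running SmartBody character is outside the scope of this example.
import xml.etree.ElementTree as ET

def make_gaze_bml(target):
    """Return a minimal BML gaze fragment aimed at the named object."""
    return '<gaze target="%s"/>' % target

fragment = make_gaze_bml("box1")
elem = ET.fromstring(fragment)   # raises ParseError if malformed
print(elem.tag, elem.get("target"))
```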
On what platforms does SmartBody run?

SmartBody currently runs on Windows, Linux, OS X, Android and iOS (iPhone) platforms. The code is mostly C++, and most libraries that it uses are cross-platform compatible.
How do I download and install SmartBody?

SmartBody is currently hosted on SourceForge in source code form here: http://sourceforge.net/projects/smartbody/develop/. You only need to download trunk/, not the other branches. Currently, no binaries are available for download, so the code will have to be compiled. Windows users can build the application directly from the source without any additional downloads. Linux and OS X users will need to consult the INSTALL.txt file, which contains installation instructions, including additional downloads and builds.
Where's the documentation?

The documentation is currently being composed by the SmartBody team. You can download the latest version from trunk/ in the file SmartBodyManual.pdf. The documentation for the Python API is also available online.
Will SmartBody run on mobile platforms?

Yes, we have built versions of SmartBody for both the iPhone/iOS and Android platforms. Instructions for those builds are located in the smartbody/android and smartbody/ios directories. We have included three renderers with each platform: a simple OpenGL-style renderer, an Ogre renderer and a Unity renderer. To use the Unity application, you need a Unity Pro license and a Unity mobile license for Android or iOS.
Can I use my own characters?

Yes, you can use any characters that you choose in SmartBody. To enable certain functionality (such as gazing, head movements, nodding and so forth), you need to tell SmartBody the names of the relevant joints. For example, the gazing controller uses the spine and neck joints, so SmartBody needs to know where those joints are on your character in order to gaze.
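As a sketch of the idea only (this is not SmartBody's actual joint-mapping API), the snippet below pairs a hypothetical Biped-style skeleton's joint names with the roles a gazing or nodding controller would need; both sides of the mapping are illustrative assumptions.

```python
# Hypothetical sketch: declaring which of your skeleton's joints play the
# roles that SmartBody's controllers need. The names below are
# illustrative; SmartBody's real joint-mapping API is not shown here.
CUSTOM_TO_SMARTBODY = {
    "Bip01_Spine":  "spine1",        # lower spine, used by gazing
    "Bip01_Spine1": "spine2",
    "Bip01_Neck":   "spine4",        # neck, used by gazing and nodding
    "Bip01_Head":   "skullbase",     # head joint
    "Bip01_L_Eye":  "eyeball_left",  # eyes, used by saccades and gazing
    "Bip01_R_Eye":  "eyeball_right",
}

def map_joint(custom_name):
    """Return the mapped role for one of your joints, or None if unmapped."""
    return CUSTOM_TO_SMARTBODY.get(custom_name)
```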
How do I interface with the character?

You can use a number of different interfaces: C++, Python, or BML (Behavior Markup Language). The C++ and Python interfaces allow you to create characters, modify their attributes, and set up a scene. The BML interface allows you to instruct the characters to perform different tasks and behaviors. For example, to make a character nod his head, you would send the following BML command:
<head type="NOD"/>
or to make the character move to location (23,40), you would send:
<locomotion target="23 40"/>
or to make the character speak, you would send:
<speech type="text/plain">Hello, my name is Brad. How are you?</speech>
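These fragments can also be assembled programmatically before being handed to SmartBody. The sketch below shows only the string construction and a well-formedness check in Python; the exact SmartBody call that delivers the BML to a character is not shown here.

```python
# Sketch: building the BML fragments shown above in Python and checking
# that each one is well-formed XML. Delivering them to a character (via
# SmartBody's Python or C++ interface) is not shown.
import xml.etree.ElementTree as ET

def locomotion_bml(x, z):
    """BML telling the character to walk to location (x, z)."""
    return '<locomotion target="%d %d"/>' % (x, z)

def speech_bml(text):
    """BML telling the character to speak the given plain text."""
    return '<speech type="text/plain">%s</speech>' % text

for fragment in (locomotion_bml(23, 40), speech_bml("Hello, my name is Brad.")):
    ET.fromstring(fragment)  # raises ParseError if the fragment is malformed
```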