sbdesktop - trying to create a live lip-syncing avatar on a Windows desktop as a teacher's assistant!
July 25, 2014
2:05 pm
New Member

Hi guys - though I ironically teach computing, I am an amateur when it comes to proper programming, as I've never had the time to fully explore a wide range of languages.

Anyhow! I am currently creating a teacher's assistant program in Java that uses CereProc to ask the students questions and interact with them during the lesson. To take my project to the next level, it would be amazing to have a desktop avatar that lip syncs live to the audio as it is spoken. I have already got the TTS part working using CereProc Cloud. Pre-rendering all the commands is not possible because of the very high number of combinations of student names, questions, objectives, etc. that are chosen on the fly based on the lesson, behaviour, progress and so on, which is why I need a live solution. Near the end of the SmartBody manual it shows sbdesktop with a transparent-backed window avatar - just what I'm looking for! The problem is that if I even try to run sbdesktop.exe, I just get a big red triangle.

Ideally I would just want the avatar to lip sync to the audio coming through the speakers. Unfortunately I don't have a whole lot of cash, but I can pay for a couple of hours' work if someone can come up with a solution!

Many thanks for your help!

Darren

July 27, 2014
10:37 pm
Admin

The sbdesktop application is essentially a version of SmartBody running in a transparent window that lets you see the desktop underneath. It's mostly working, and probably only requires some tweaks to the startup script to serve as a base for what you are looking for. The character will be able to lip sync to speech, but the entire setup would likely take more than a few hours to get working to the point you describe. I'm sure that someone who understands scripting for games could do this without too much trouble. Alternatively, I'd recommend the Virtual Human Toolkit (vhtoolkit.ict.usc.edu), which consists of SmartBody and a number of other technologies (dialogue management, sensing, etc.) running on the Unity game engine, and is free for noncommercial use. It might be more end-user ready than SmartBody alone, although if you find someone to help you, sbdesktop would be a much more lightweight application.
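To give a rough idea of what the tweak involves: the SmartBody startup scripts are Python. Below is a minimal sketch, assuming the stock sbdesktop scene has already created a character (called 'ChrBrad' here purely for illustration) and that a TTS relay is running so that speech requests get synthesized and lip-synced; the character and voice names are placeholders, not part of the shipped script.

# Sketch only: assumes the stock sbdesktop startup script has already
# created and posed a character, named "ChrBrad" here for illustration.
brad = scene.getCharacter('ChrBrad')

# Route the character's speech through the external TTS relay so that
# audio and viseme (lip sync) timing come back from the synthesizer.
brad.setVoice('remote')
brad.setVoiceCode('Heather')   # placeholder voice name

# Send a BML speech request; SmartBody schedules the lip-sync visemes
# from the timing information the TTS relay returns.
bml.execBML('ChrBrad', '<speech type="text/plain">'
                       'Hello class, please open your workbooks.'
                       '</speech>')

The point is that at run time your program only needs to send new speech text; SmartBody takes care of scheduling the visemes for lip sync.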

Ari

August 3, 2014
8:29 pm
New Member

Thanks Ari - I have installed the Virtual Human Toolkit as well - unfortunately I think both are just beyond my understanding! For now I am avoiding lip syncing until someone is able to help or a company produces a product that can do live lip syncing!

August 4, 2014
8:06 pm
Admin

There are some options for 'puppeteering', or mimicking your face on a virtual character: have a look at faceshift.com.

The problem of 'live' lip syncing (in other words, speaking and having a character mouth the words that you say) in real time is still an open research problem. No one, to my knowledge, has solved it yet.

 

Ari