Performs is designed to visualize and synthesize humanoid animations for customized avatars. You can play back animation files or synthesize animations in real time from instructions. Select whether to use Keyframe mode or Script mode.
Performs uses retargeting-threejs. Just drag and drop your own .GLB, .GLTF or .BVH files to add your animations, and adjust the retargeting configuration in Settings. You can use Animics to generate the animations.
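If you want to inspect or preprocess assets before dropping them into Performs, the sketch below shows how the same file types can be loaded with plain three.js loaders. This is only an illustration of the asset formats involved, not the Performs or retargeting-threejs internals, and the file paths are placeholders for your own assets.

```ts
import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';
import { BVHLoader } from 'three/examples/jsm/loaders/BVHLoader.js';

// Minimal sketch: load a rigged character (.glb/.gltf) and a motion clip (.bvh)
// with standard three.js loaders. The paths are placeholders.
async function loadAssets() {
  // Rigged character: the GLTF scene contains the skinned mesh and its skeleton.
  const gltf = await new GLTFLoader().loadAsync('./avatar.glb');
  const avatar: THREE.Group = gltf.scene;

  // BVH motion capture: yields a source skeleton plus an AnimationClip.
  const bvh = await new BVHLoader().loadAsync('./clip.bvh');
  const sourceSkeleton: THREE.Skeleton = bvh.skeleton;
  const clip: THREE.AnimationClip = bvh.clip;

  // A retargeting step (what retargeting-threejs handles inside Performs)
  // then maps the clip from the source skeleton onto the avatar's skeleton.
  return { avatar, sourceSkeleton, clip };
}
```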
Performs works with BML (Behavior Markup Language) instructions, extended to create cohesive animations in real time; these instructions are explained in detail here. It also supports SiGML (Signing Gesture Markup Language) and NGT glosses.
This mode allows you to synthesize MF (Manual) and NMF (Non-Manual) features, which is essential for enhancing the realism of sign language animations. Just go to Settings to see the options.
To apply these actions, your character needs a configuration file, which can be created in Atelier.
Performs works with any rigged 3D character. Just drag and drop your .GLB or .GLTF file, or go to Avatars, select Upload yours and paste the link to your file.
Use the Atelier tool to configure all the parameters needed for Script mode. This generates a configuration .JSON file containing all the required information.
Choose the avatar, change the colors, set up the background, adjust the lighting... and export the configuration to use it in your app! You can include Performs using an iframe or by downloading the build version from GitHub.
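As a rough illustration, embedding Performs with an iframe can be as simple as the snippet below. The URL is a placeholder for the hosted Performs app or your own deployment of the GitHub build, and the sizing values are just examples.

```ts
// Minimal embedding sketch: the src is a placeholder; point it at the hosted
// Performs app or at your own deployment of the build downloaded from GitHub.
const iframe = document.createElement('iframe');
iframe.src = 'https://your-performs-deployment.example/';
iframe.width = '100%';
iframe.height = '600';
iframe.style.border = 'none';
iframe.allow = 'fullscreen';
document.body.appendChild(iframe);
```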