How to Create an AI Agent or Avatar of Yourself with Avaturn and SightLab VR Pro
May 8, 2024
Avaturn is a platform for creating realistic and customizable 3D avatars that can be integrated into VR research environments. It leverages generative AI to transform 2D photos into recognizable and realistic 3D avatars.
These avatars are exportable as 3D models, making them compatible with SightLab and Vizard so you can bring them into your VR experiments. The ability to generate lifelike avatars from simple photographs offers a straightforward and efficient way to populate VR simulations with realistic human figures. This can be particularly useful in behavioral research, training simulations, and educational applications where user engagement and realistic interactions are crucial. This tutorial shows how you can create an Avaturn avatar and then use it either as an embodied, fully tracked avatar or as an intelligent AI agent.
Select either “Use Camera” or “Upload from Device”
If you select “Use Camera” you will be prompted to scan a QR code and take three pictures (one front view and two side views)
If uploading from your device, it is advised to use three photos: one facing front, one facing left, and one facing right
Once uploaded, you can edit and customize your avatar to your liking
Click through the menu to see all the options
You can choose from the included animations, attach animations from Mixamo, or add more through various workflows. See this tutorial on how to add Mixamo animations: https://docs.avaturn.me/docs/importing/mixamo/
Export your avatar
Click the download button and choose one of the download options. This will download as a .glb file.
Open the .glb file of your avatar in Inspector. You can either adjust the lighting on the avatar model itself or use the lighting of the scene it is brought into. Save this avatar somewhere in your SightLab project folder, or in “utils/resources/avatar/full_body” (the default location for full-body avatars as of SightLab 1.10.0).
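If you want to quickly check the exported model outside of the SightLab demos, a minimal Vizard sketch like the one below can load and place it. The file path is a hypothetical placeholder; substitute the location where you saved your avatar.

    import viz

    viz.go()

    # Load the exported Avaturn avatar (hypothetical path - use your own file)
    avatar = viz.addChild('utils/resources/avatar/full_body/my_avaturn_avatar.glb')

    # Place the avatar a couple of meters in front of the viewpoint, facing the user
    avatar.setPosition([0, 0, 2])
    avatar.setEuler([180, 0, 0])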
You can now either load the avatar in SightLab’s “Mirror” demo to see a fully embodied avatar, or add it to the AI Agent application that comes with SightLab to turn it into a fully interactive AI agent powered by either ChatGPT or Anthropic. See this page on how to make that happen; some quick steps are below.
For the AI agent, navigate to Chat_GPT_Agent/configs, right-click AI_Agent_Config_Avaturn.py, and select “Edit” to open the configuration in Vizard.
Set “AVATAR_MODEL” to the name and location of your new avatar
You may also need to adjust some of the other avatar configuration options (most of these should already be set for the Avaturn configuration). For instance, if the neck is angled too far down or up, you can adjust the “pitch” (the second value) next to “NECK_TWIST_VALUES”. You can also change the environment next to “ENVIRONMENT”.
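The exact contents of AI_Agent_Config_Avaturn.py can vary between SightLab versions, but the options mentioned above look roughly like the sketch below; the values shown are illustrative placeholders, not recommended defaults.

    # Illustrative excerpt of AI_Agent_Config_Avaturn.py - values are placeholders

    # Path to your exported Avaturn avatar
    AVATAR_MODEL = 'utils/resources/avatar/full_body/my_avaturn_avatar.glb'

    # Neck rotation values; the second entry is the pitch - lower or raise it
    # if the avatar's head is angled too far down or up
    NECK_TWIST_VALUES = [0, -10, 0]

    # Environment model the agent is placed in
    ENVIRONMENT = 'dojo.osgb'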
Open a text editor such as Notepad and create a new prompt describing the background information and personality you want for your AI avatar, then adjust the configuration to use this new prompt.
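For example, the prompt file could be a short plain-text description of the agent, with the config then pointing to it. The setting name below is an assumption, so check your config file for the actual option it uses.

    # my_agent_prompt.txt (written in Notepad; hypothetical example content):
    #   You are a friendly virtual lab assistant named Alex. Keep answers short
    #   and conversational, and stay in character during the session.

    # In the config file, point the agent at the new prompt
    # (setting name is an assumption - check your config for the actual one)
    AGENT_PROMPT_FILE = 'my_agent_prompt.txt'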
If you want the avatar to use your voice, you can use something like Eleven Labs’ voice cloning feature and then select “elevenlabs” in the avatar config file from the previous steps.
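In the same config file, selecting the cloned voice might look something like the lines below; apart from the “elevenlabs” value itself, the setting names are assumptions, so match them to what your config actually uses.

    # Illustrative voice settings - option names may differ in your config file
    VOICE_PROVIDER = 'elevenlabs'                  # use Eleven Labs text-to-speech
    ELEVENLABS_VOICE_ID = 'your_cloned_voice_id'   # ID of the voice you cloned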
For more information on how to integrate this into a VR lab, how to build experiments with SightLab VR Pro and Vizard, or about any of WorldViz’s other products, send us a message.