SightLab VR Pro for Vizard streamlines the creation of academic VR research experiments, offering both an intuitive GUI and Python coding. Drawing on its library of over 100 examples and templates, this article highlights standout templates to enhance your VR research. Watch this VR experiment sample video first!
SightLab VR Pro is a versatile experiment generator for Vizard, designed to streamline the creation of virtual reality (VR) research experiments. It lets you set up experiments through an intuitive graphical user interface (GUI) or with Python code. With over 100 examples and templates now available, SightLab VR Pro provides a comprehensive library that you can leverage and build upon. This article showcases some of the standout examples and templates and illustrates how you can use them to expand the functionality and capabilities of your VR research.
All of these academic VR research experiment templates let you leverage the built-in functionality of SightLab VR Pro.
See here for a full list of experiment templates and examples.
Visual Search Templates
This suite includes five distinct Visual Search tasks. Each task can be run as-is or customized to suit your needs.
Task 1: Customizable Visual Search
In this task, participants are instructed to locate a specified target object amidst a series of objects. Upon successfully finding the target, an auditory cue is triggered and the object is highlighted (a minimal sketch of this logic appears below). After the task, you can access raw data files and interactive session replays, and, if using BIOPAC AcqKnowledge, analyze physiological data. You can analyze time to first fixation, saccades, time to find the target, and more.
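As a rough illustration of the "target found" logic, the sketch below uses standard Vizard calls with a simple head-gaze ray. It is not SightLab's eye-tracking pipeline, and the model and audio file names are placeholder assumptions.

```python
# Minimal head-gaze approximation of the "target found" logic using standard
# Vizard calls. File names and the 10 m ray length are placeholder assumptions.
import viz
import vizact

viz.go()

environment = viz.addChild('gallery.osgb')   # hypothetical environment model
target = viz.addChild('vase.osgb')           # hypothetical target object
target.setPosition([0, 1.5, 3])

found_sound = viz.addAudio('found.wav')      # hypothetical auditory cue
target_found = False

def check_gaze():
    """Cast a ray along the viewpoint's forward direction and test the target."""
    global target_found
    if target_found:
        return
    mat = viz.MainView.getMatrix()
    start = mat.getPosition()
    forward = mat.getForward()
    end = [start[i] + forward[i] * 10 for i in range(3)]
    info = viz.intersect(start, end)
    if info.valid and info.object is target:
        target_found = True
        found_sound.play()                    # auditory cue
        target.emissive(viz.YELLOW)           # simple highlight

vizact.ontimer(0, check_gaze)                 # check every frame
```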
Another task offers a focused approach to demonstrating experimental control in a visual search setting.
In the 'Visual Search Multiple Objects' task, participants are placed in an environment where they must discern a target object based on size differences. This task is designed to study participant choices and confidence levels under varying conditions, and it provides rich data for analysis.
The 'Visual Search Randomize' task, initially set in an art gallery environment (modifiable as per your needs), offers a dynamic visual search experience. In each trial, participants are immersed in a virtual environment scattered with various objects. Their primary goal is to locate a specific target object. The task stands out for its randomization of object locations in each trial, introducing both consistency and variability in the search challenge.
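A minimal sketch of the per-trial randomization idea follows, using standard Vizard calls and Python's random module; the environment, model files, and positions are placeholder assumptions rather than the template's actual assets.

```python
# Minimal sketch of per-trial randomization of object locations.
import random
import viz
import viztask

viz.go()
gallery = viz.addChild('art_gallery.osgb')   # hypothetical environment

model_files = ['vase.osgb', 'clock.osgb', 'statue.osgb', 'lamp.osgb']
objects = [viz.addChild(m) for m in model_files]
positions = [[-2, 1.5, 3], [0, 1.5, 4], [2, 1.5, 3], [1, 1.5, 5]]
target = objects[0]                           # the object participants must find

def run_trials(num_trials=5):
    for trial in range(num_trials):
        random.shuffle(positions)             # new layout each trial
        for obj, pos in zip(objects, positions):
            obj.setPosition(pos)
        yield viztask.waitKeyDown(' ')        # placeholder for "target found" logic

viztask.schedule(run_trials())
```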
Follow the sound of hooting pigeons and catch them before they fly away. This demo shows how to keep score of targets found in a game-like setting.
Stimulus Presentation / Response
Driving Simulators / Examples
See how to integrate vehicles into your VR experiment or VR eye tracking experiment.
AI Intelligent Agents and Avatars
Template for an interactive, intelligent AI agent that can be connected to various large language models like GPT-4 and Claude Opus. You can customize the agent's personality, use speech recognition, and leverage high-quality text-to-speech models.
Additionally, you can set up a point-and-click interactive application that lets you get information and ask questions about any object in your scene. This automatically uses the scene's objects of interest, which are enabled with a simple checkbox.
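Below is a rough, generic sketch of how a question about a gazed-at object might be sent to a large language model. SightLab's built-in agent framework handles this connection for you; the OpenAI client usage, model name, persona text, and ask_about_object helper here are assumptions made for illustration, not SightLab's actual implementation.

```python
# Generic illustration of sending a question about a scene object to an LLM.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

PERSONA = "You are a friendly museum guide inside a VR art gallery."

def ask_about_object(object_name, question):
    """Send the participant's question about a scene object to the model."""
    response = client.chat.completions.create(
        model="gpt-4o",                      # placeholder model name
        messages=[
            {"role": "system", "content": PERSONA},
            {"role": "user",
             "content": f"The participant is looking at '{object_name}'. {question}"},
        ],
    )
    return response.choices[0].message.content

# Example: answer a spoken or typed question about an object of interest
print(ask_about_object("bronze statue", "Who might have sculpted this?"))
```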
Eye Tracker Tests
A collection of tests designed to assess the accuracy of your eye tracker.
Virtual Screen
External Application Data Recorder
The External Application Data Recorder allows you to record, save, and synchronize sensor data while running external VR applications from any source (SteamVR game, Oculus, web application, etc.). It can also use the NVIDIA ShadowPlay recording feature to record and save a video that can then be played back and matched up with the data recording.
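As a rough illustration, the sketch below writes timestamped sensor samples to a CSV file that could later be lined up with a separately recorded video. It is not the actual recorder; read_sensor(), the sample rate, and the file name are hypothetical stand-ins.

```python
# Minimal sketch of a timestamped sensor log that can be aligned with a video.
import csv
import time

SAMPLE_RATE_HZ = 60

def read_sensor():
    """Hypothetical placeholder for whatever sensor value you are streaming."""
    return 0.0

with open('session_data.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['unix_time', 'elapsed_s', 'value'])
    start = time.time()                      # note this wall-clock start time
    while time.time() - start < 10:          # record for 10 seconds as a demo
        now = time.time()
        writer.writerow([now, now - start, read_sensor()])
        time.sleep(1.0 / SAMPLE_RATE_HZ)
# Matching the CSV's start timestamp to the video's start time lets you align
# the two recordings during playback.
```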
Gaze Based Interactions
Elements in the scene react to your gaze.
Mixed/Augmented Reality
Virtual Mirror
This demo allows you to view your reflection in a virtual mirror in VR, with features such as avatar swapping, facial expression tracking, body tracking, and customizable environments. Facial expression data can also be saved to a file and analyzed afterwards.
Hand Grabbing / Grab Events / Physics Based Interactions
Grab and interact with objects using physics. Connect to hand tracking from Meta headsets, Manus VR haptic data gloves, and more. Save finger flexion data and measure the distance between fingers (a minimal sketch appears after the links below). There are also examples for collecting grab and release events, as well as for controlling elements in the scene using hand gestures.
More information here for hand tracking with physics example.
More information here for grab events example.
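Below is a minimal sketch of logging the distance between two fingertip nodes each frame, using standard Vizard calls. It assumes your hand-tracking setup already exposes the fingertips as nodes; the placeholder spheres, node names, and output file are assumptions for illustration.

```python
# Minimal sketch of logging fingertip distance each frame.
import viz
import vizact
import vizmat
import vizshape

viz.go()

# Placeholder spheres standing in for tracked fingertip nodes from your
# hand-tracking setup (Meta, Manus, etc.)
thumb_tip = vizshape.addSphere(radius=0.01)
index_tip = vizshape.addSphere(radius=0.01)

log_file = open('finger_distance.csv', 'w')
log_file.write('time,distance_m\n')

def log_distance():
    p1 = thumb_tip.getPosition(viz.ABS_GLOBAL)
    p2 = index_tip.getPosition(viz.ABS_GLOBAL)
    distance = vizmat.Distance(p1, p2)       # Euclidean distance in meters
    log_file.write('{:.3f},{:.4f}\n'.format(viz.tick(), distance))

vizact.ontimer(0, log_distance)              # log every frame
```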
Phobia Exposure / Walk the Plank
Challenge a fear of heights in the Walk the Plank demo. Capture fixations and heatmaps and send signals to BIOPAC AcqKnowledge. The demo can also be connected with foot tracking and run in multi-user mode.
Charts / Graphs / Condition Comparisons
BioFeedBack / Physiological Measurement Connections
See how physiological data streaming in from BIOPAC AcqKnowledge can control an object in the scene. In this case, a ball changes in size and color based on the physiological signal.
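The sketch below shows one way an object's size and color could be driven by a streaming signal, using standard Vizard calls. The get_signal() function is a hypothetical stand-in for the value streamed from AcqKnowledge, assumed to be normalized to the 0-1 range.

```python
# Minimal sketch of driving an object's size and color from a streaming signal.
import math
import viz
import vizact
import vizshape

viz.go()
ball = vizshape.addSphere(radius=0.2)
ball.setPosition([0, 1.5, 3])

def get_signal():
    """Hypothetical normalized signal; replace with your AcqKnowledge stream."""
    return 0.5 + 0.5 * math.sin(viz.tick())

def update_ball():
    value = get_signal()
    scale = 0.5 + value                       # grow between 0.5x and 1.5x
    ball.setScale([scale, scale, scale])
    ball.color([value, 0.2, 1.0 - value])     # shift from blue toward red

vizact.ontimer(0, update_ball)                # update every frame
```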
Additional Experiment Paradigms
Distance Perception, Memory Tasks, Choice Comparison Templates, Optic Flow, Smooth Pursuit, and more. Modify them using no code, or expand upon them with built-in modules.
There are many more examples included, or you can easily create your own custom VR experiment from scratch with little or even no code using SightLab VR Pro. For more information, or to inquire about a demo, click here or contact sales@worldviz.com.