Client: University of Oklahoma
Use Case: Immersive Smart Learning
Using Vizard API functions, Professor Ziho Kang at the University of Oklahoma is currently developing a smart learning application that leverages neuroimaging, eye tracking, and haptic interactions.
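To give a sense of the kind of API calls involved, here is a minimal Vizard sketch that loads a scene and samples the viewpoint at a fixed rate. The model name, sampling rate, and structure are illustrative only; the actual eye-tracking and haptic integrations (for example, through SightLab VR or vendor plug-ins) are assumed and not shown.

```python
# Minimal Vizard sketch (illustrative): load an environment and log the
# viewpoint at 10 Hz. Eye tracking and haptics would be layered on top of this.
import viz
import vizact

viz.setMultiSample(4)
viz.go()

# Placeholder environment model that ships with Vizard.
environment = viz.addChild('piazza.osgb')

def sample_view():
    # Position and orientation of the main viewpoint (the HMD when connected).
    position = viz.MainView.getPosition()
    orientation = viz.MainView.getEuler()
    print('viewpoint:', position, orientation)

# Sample the viewpoint every 0.1 seconds.
vizact.ontimer(0.1, sample_view)
```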
The initiative aims to revolutionize educational environments by integrating advanced biometric measures to enhance learning experiences in a fully immersive virtual reality setting. This research explores methodologies for non-text-based smart learning, tailored to individual needs and enabled through MVR technology.
This emerging approach focuses on the real-time analysis of physiological measures such as eye movement characteristics, brain activity, and haptic feedback to understand and improve learning outcomes. The MVR environment creates an interactive space where users can form and manipulate semantic networks, promoting active and collaborative learning.
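As an illustration of what real-time analysis of eye movement characteristics can look like, the sketch below implements a generic dispersion-threshold (I-DT) fixation detector on a stream of gaze samples. This is a standard textbook technique, not the lab's actual analysis pipeline, and the thresholds are placeholder values.

```python
# Generic dispersion-threshold (I-DT) fixation detection on gaze samples.
import numpy as np

def detect_fixations(gaze_xy, timestamps, max_dispersion=0.02, min_duration=0.10):
    """Return (start_time, end_time, centroid) for each detected fixation.

    gaze_xy    : (N, 2) array of gaze points (e.g., normalized coordinates)
    timestamps : (N,) array of sample times in seconds
    """
    gaze_xy = np.asarray(gaze_xy, dtype=float)
    timestamps = np.asarray(timestamps, dtype=float)
    fixations = []
    start = 0
    n = len(gaze_xy)
    while start < n:
        # Grow the window until it spans at least the minimum duration.
        end = start + 1
        while end < n and timestamps[end] - timestamps[start] < min_duration:
            end += 1
        if end >= n:
            break
        window = gaze_xy[start:end + 1]
        dispersion = (window.max(axis=0) - window.min(axis=0)).sum()
        if dispersion <= max_dispersion:
            # Extend the window while dispersion stays under the threshold.
            while end + 1 < n:
                window = gaze_xy[start:end + 2]
                if (window.max(axis=0) - window.min(axis=0)).sum() > max_dispersion:
                    break
                end += 1
            centroid = gaze_xy[start:end + 1].mean(axis=0)
            fixations.append((timestamps[start], timestamps[end], centroid))
            start = end + 1
        else:
            start += 1
    return fixations
```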
The MVR semantic network app developed by Professor Ziho Kang enables users to create connections among various objects, such as body parts and food items, demonstrating practical applications of the technology. The system uses HTC Vive Pro Eye headsets and BIOPAC fNIRS devices to measure and analyze physiological responses, providing insights into cognitive and physical engagement.
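A hypothetical sketch of the core idea behind such a semantic network app is shown below: scene objects act as concept nodes, and linking two of them records an edge and draws a line between them in Vizard. The object names, positions, and interaction model are placeholders, not details of Professor Kang's application.

```python
# Hypothetical semantic-network sketch in Vizard: objects are nodes, and
# linking two objects records an edge and draws a connecting line.
import viz
import vizshape

viz.go()

# Placeholder "concept" objects; the real app uses body parts, food items, etc.
apple = vizshape.addSphere(radius=0.1)
apple.setPosition([-0.5, 1.5, 2])
mouth = vizshape.addSphere(radius=0.1)
mouth.setPosition([0.5, 1.5, 2])

edges = []  # (node_a, node_b, line) tuples recording the semantic network

def link(node_a, node_b):
    """Record a semantic connection and draw a line between the two objects."""
    viz.startLayer(viz.LINES)
    viz.vertexColor(viz.YELLOW)
    viz.vertex(*node_a.getPosition())
    viz.vertex(*node_b.getPosition())
    line = viz.endLayer()
    edges.append((node_a, node_b, line))

link(apple, mouth)  # e.g., the user connects "apple" to "mouth"
```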
Future Research:
Ongoing research at the Human Factors and Simulation Laboratory at the University of Oklahoma (Director: Dr. Ziho Kang) includes improving real-time data visualization, exploring correlations among biometric measures, and addressing diverse learning needs based on Universal Design for Learning principles. The lab aims to collaborate with local communities and educational organizations to further develop and distribute the MVR technology for broader educational impacts.
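One way to explore correlations among biometric measures is sketched below: a Pearson correlation between two synchronized streams, such as pupil diameter and one fNIRS channel resampled onto a common time base. The signal names, rates, and synthetic data are placeholders and do not represent the lab's actual measures or methods.

```python
# Generic sketch: correlating two synchronized biometric streams.
import numpy as np

def pearson_correlation(signal_a, signal_b):
    """Pearson correlation between two equal-length 1-D signals."""
    a = np.asarray(signal_a, dtype=float)
    b = np.asarray(signal_b, dtype=float)
    return np.corrcoef(a, b)[0, 1]

# Synthetic data standing in for recorded measures on a shared 10 Hz time base.
t = np.linspace(0, 60, 600)                  # 60 s of samples
pupil = 3.0 + 0.2 * np.sin(0.1 * t)          # placeholder pupil diameter (mm)
fnirs = 0.5 + 0.1 * np.sin(0.1 * t + 0.3)    # placeholder fNIRS channel value
print('r =', pearson_correlation(pupil, fnirs))
```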
Learn more about creating multi-user eye tracking experiments in this SightLab VR blog post.
Link to Study: https://journals.sagepub.com/doi/full/10.1177/21695067231196245#con1