By Peter Schlueer, President and Founder of WorldViz.
Believe it or not, our VR technology has fueled scientific research for over two decades, and about every five years a leap in technology redefines what’s possible.
The VIVE Pro Eye is one of these leaps, and our goal at WorldViz is to make eye-tracking data as easy as possible to access and use while retaining the low-level control and data quality that sound science demands. In keeping with our original mission that VR creation should be accessible to anyone without technical skills, we’re excited to announce rich support for the HTC VIVE Pro Eye within the WorldViz VR software toolkit Vizard.
Being able to explore visual attention in virtual environments is transforming how scientific studies can be conducted. It opens up exciting new research possibilities in fields such as psychology, neuroscience, training, performance assessment, and consumer behavior.
Andy Beall, founder and Chief Scientist at WorldViz, says: “It’s amazing how simple it is now to set up immersive eye-tracking experiments with the VIVE Pro Eye. With Vizard, you can then quickly add eye tracking to your virtual scene and perform analysis, including recording and exporting your data.”
Vizard’s R&D-focused analysis capabilities have empowered thousands of academic and commercial research labs for over 20 years. Vizard is a general-purpose development environment for scientific VR, allowing researchers and innovators to build precise and complex simulations that connect to the VIVE Pro Eye, CAVEs & Powerwalls, head/hand/eye trackers, and motion capture systems. We also support specialty devices, including biophysiological sensors like EEG, EKG, GSR, and more. With an embedded Python interface, development is straightforward and open, which means you don’t have to be a computer science expert to build applications. You can also tap into a huge Python user community for numerous libraries and utilities.
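Because Vizard scripts are ordinary Python, recorded tracker samples can be handled with standard-library tools such as the `csv` module. A minimal sketch of exporting gaze data for offline analysis (the sample layout, field names, and file handling here are illustrative assumptions, not Vizard’s actual output format):

```python
import csv
import io

# Hypothetical gaze samples: (time in seconds, normalized gaze x/y, pupil diameter in mm)
samples = [
    (0.011, 0.48, 0.52, 3.1),
    (0.022, 0.49, 0.51, 3.0),
    (0.033, 0.51, 0.50, 3.2),
]

def export_gaze_csv(samples, fileobj):
    """Write gaze samples to CSV with a header row for later analysis."""
    writer = csv.writer(fileobj)
    writer.writerow(["time_s", "gaze_x", "gaze_y", "pupil_mm"])
    writer.writerows(samples)

# Writing to an in-memory buffer here; in practice this would be a file on disk.
buf = io.StringIO()
export_gaze_csv(samples, buf)
```

A file written this way can be read back into any of the Python analysis libraries Vizard users already have access to.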
VR Eye Tracking Analytics Lab
To get your VIVE Pro Eye–based research started quickly, we’ve designed the VR Eye Tracking Analytics Lab, a simple yet powerful tool for setting up eye-tracking experiments in VR that includes examples for common eye-tracking tasks. The VR Eye Tracking Analytics Lab runs on Vizard, which allows native integration with thousands of Python libraries. Templates include:
- Recording and playback of eye-tracking behavior for “after-action review,” including 3D path review
- Extensive data analytics
- User performance-triggered feedback loops with eye or physiological sensor data
- Precise experimental timing control and device synchronization
- 360 videos and 3D files from a wide array of sources as customizable stimuli
- Recording of gaze direction data, pupil size, fixation timings, and other low-level parameters
- Multi-user environments
- Heat maps
- Access to a support ticketing system for professional users
- …and more
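Fixation timings like those recorded by the templates are commonly derived from raw gaze samples with a dispersion-threshold (I-DT) algorithm: a window of samples counts as a fixation when it lasts long enough and the gaze points stay within a small spatial spread. A minimal plain-Python sketch of this idea, independent of how Vizard actually implements it (the thresholds and sample format are illustrative):

```python
def detect_fixations(samples, max_dispersion=0.05, min_duration=0.1):
    """Dispersion-threshold (I-DT) fixation detection.

    samples: list of (time_s, gaze_x, gaze_y) tuples in temporal order.
    Returns a list of (start_t, end_t, centroid_x, centroid_y) fixations.
    """
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        j = i
        # Grow the window while all points stay within the dispersion threshold
        while j + 1 < n:
            xs = [s[1] for s in samples[i:j + 2]]
            ys = [s[2] for s in samples[i:j + 2]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        start_t, end_t = samples[i][0], samples[j][0]
        if j > i and end_t - start_t >= min_duration:
            # Long and stable enough: record the fixation's centroid
            pts = samples[i:j + 1]
            cx = sum(p[1] for p in pts) / len(pts)
            cy = sum(p[2] for p in pts) / len(pts)
            fixations.append((start_t, end_t, cx, cy))
            i = j + 1
        else:
            # Too short to count: slide the window forward by one sample
            i += 1
    return fixations
```

Running this on a recording of samples around two screen locations, separated by a saccade, yields one fixation per location, each with its start/end time and centroid.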
Is now the time to buy a VR headset with built-in eye tracking? If you’re still on the fence about the VIVE Pro Eye, read an in-depth review of VR eye-tracking benefits by WorldViz Chief Scientist Andrew Beall, including a simple Vizard example script that demonstrates how to run an eye-tracking experiment.