hp-omnicept

HP’s Reverb G2 Omnicept Edition is a special version of their VR headset targeted at enterprises. It includes Tobii eye tracking that can capture real-time pupil dilation, saccades, and visual attention via eye-gaze vectors; a camera for facial tracking; and a photoplethysmogram (PPG) sensor that can measure pulse rate variability and heart rate. Combining these physiological measurements enables certain human psychological inferences, including things like cognitive load, which can be correlated to the situational and contextual dimensions of a training scenario or used to measure real-time responses to cognitive effort.

HP lent me a Reverb G2 Omnicept Edition to try out, along with access to OvationVR’s public speaking application and a demo of Mimbus’ Virtual Indus electrical training application, both of which had integrated the Omnicept’s cognitive load features. In the absence of any exposed calibration or real-time graphing features, I found it hard to correlate the calculated cognitive load numbers from my VR sessions with my personal experience. Virtual Indus only gave a minimum and maximum range of cognitive load as well as an average, and I was able to get my average cognitive load down on the second try of the demo experience. I wasn’t able to figure out how to get a more granular graph of cognitive load over the course of the exercise within the VR app (although it looked theoretically possible to do within their Vulcan website). I was able to look at a graph of my cognitive load while giving an impromptu speech in OvationVR, but it showed only slight fluctuations over the course of the talk, with a peak value coming at what seemed like a fairly arbitrary moment.

The challenge with capturing and using this type of physiological data is that it is often hard for users to see deeper patterns or draw immediate conclusions from these new streams of data, especially in the absence of any real-time biofeedback to help calibrate and orient to changes in physiology that may or may not have corresponding changes in direct experience. I have found this to be a recurring challenge whenever I test out VR experiences that have biofeedback integrated into them. Verifying that the system is accurately calibrated and can provide data with utility relative to a specific task is the biggest challenge and open question.

It would be nice if HP developed some apps to help users do their own QA testing on each of the sensors, with real-time graphs to help with this calibration and orientation. Having some canonical reference implementations could also help more developers adopt these technologies, since the success of enterprise platforms like this has a lot to do with how many Independent Software Vendors (ISVs) implement these sensors in their applications.
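As a rough sketch of what such a QA or graphing view could compute, here is a minimal Python example. It uses a simulated cognitive-load stream standing in for whatever the headset’s SDK would actually deliver (no Omnicept API calls are assumed), and derives the rolling minimum, maximum, and mean — the same summary statistics the apps above reported:

```python
from collections import deque
import random

def rolling_stats(samples, window=30):
    """Return (min, max, mean) over the trailing `window` samples."""
    buf = deque(samples, maxlen=window)  # keeps only the newest `window` values
    return min(buf), max(buf), sum(buf) / len(buf)

# Hypothetical cognitive-load signal in [0.0, 1.0], simulated here
# in place of a real sensor stream from the headset's SDK.
random.seed(7)
stream = [min(1.0, max(0.0, 0.5 + random.gauss(0, 0.1))) for _ in range(120)]

lo, hi, avg = rolling_stats(stream, window=30)
print(f"min={lo:.2f} max={hi:.2f} mean={avg:.2f}")
```

Feeding a chart widget from the same sliding window would give exactly the kind of granular, real-time graph that was missing from the demos.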

I also had a chance to talk with Scott Rawlings, Manager of the HP Reverb G2 Omnicept Platform; Henry Wang, Product Manager for Omnicept; and Erika Siegel, Experimental Psychologist, Research Scientist, and Subject Matter Expert on Human Inferences. We talk about the current physiological sensors, the types of human inferences they enable, and how these could be used in different industry verticals including training, education, simulation, wellness, and architecture, engineering, and construction.

Overall, the Omnicept is still in an early and nascent phase of development, where ISV developers are still building up the larger XR design and business infrastructure around training use cases within specific industry verticals. In addition to OvationVR and Mimbus’ Virtual Indus, Claria Product Design was mentioned as another company shipping support for the Omnicept.

The Reverb G2 is a Windows Mixed Reality headset that still has some quirky workflows. The inside-out tracking means a simpler hardware setup, but there’s still added complexity from its reliance on the Windows Mixed Reality Portal and how that integrates with Steam. I personally found it easier to get the Omnicept working when launched from Steam first rather than from the Mixed Reality Portal, but this is more a reflection of Windows Mixed Reality devices having technical complexity and quirks that may depend on your computer. There were times when the G2’s room tracking wasn’t as solid as the external lighthouses with my Index, but it was definitely sufficient overall for the enterprise use cases I was testing.

Overall, the HP Reverb G2 Omnicept provides access to a lot of physiological data that will eventually also be coming to the consumer market. There are still many design challenges in translating the potential of these biometric sensors into pragmatic value within the context of an enterprise VR application, but with these challenges come new market opportunities for developers and companies to tap into the ultimate potentials of the medium of VR. The Omnicept Edition starts at $1,249 and has been available since May 2021. You can hear more context about the development and potential applications of the Omnicept in my conversation with Rawlings, Wang, & Siegel.

LISTEN TO THIS EPISODE OF THE VOICES OF VR PODCAST

This is a listener-supported podcast through the Voices of VR Patreon.

Music: Fatality


Voices of VR Podcast © 2021