At the Top of VR
One notable AR/VR product comes from VRgineers (pvt), a Prague-based company that showed its XTAL 3 Mixed Reality and Virtual Reality headsets at CES. Both are still in pre-order mode but have a release date of April 2022, only a few weeks away. These headsets are the next generation following the company’s XTAL 8K headset, released in 2020, with much of the design oriented toward the use of VR headsets in pilot training. While gaming VR headsets do have to meet demanding specifications, headsets used in pilot training are narrower in purpose but more demanding in their ability to interface with physical cockpit training systems. The XTAL 8K and XTAL 3 headsets were designed in cooperation with the USAF and the Royal Air Force and interface with a wide variety of cockpit simulation hardware and software, something not part of most VR headset specifications.
These are not your typical $500 headsets: the older XTAL 8K model (currently on sale) sells for $4,800, while the two newer versions sell for $8,900 and $11,500, putting them at the top end of the AR/VR universe, though not out of the range of some AR headsets developed for industrial use. That said, they have some interesting features that make them a bit different from other VR headsets and justify the price in the right environment, particularly eye and hand movement tracking and 4K resolution.
A number of VR headsets employ eye tracking, a technique that uses internally mounted cameras to measure the position of a reflection on the cornea of the eye (Fig. 1, red arrow) against the center of the pupil (blue arrow) and calculate where the user is looking, regardless of head movement. In most VR systems the data is used to move the user’s field of view in game software, so when the user looks to the side, the game view shifts the same way. In the VR headsets mentioned above, the eye tracking data not only relocates the FOV; it is also recorded and used to measure how long it takes a pilot to notice something appearing in the periphery, or how often they look at controls or other external objects. Similar data collected from the controllers evaluates hand motion and can give insight into how quickly a pilot reacts physically to visual stimuli.
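The pupil-versus-corneal-reflection idea described above can be sketched in a few lines of code. This is a minimal illustration of the technique, not any vendor's implementation: the coordinate system, the affine calibration, and all function names are assumptions, and real headsets calibrate per user with 3D eye models.

```python
import numpy as np

def gaze_vector(pupil_center, corneal_reflection):
    """Difference vector between the pupil center and the corneal glint.
    Because the glint stays nearly fixed as the eye rotates, this vector
    changes with gaze direction but is largely insensitive to small
    headset slips -- the key idea behind pupil/corneal-reflection tracking."""
    return np.asarray(pupil_center, float) - np.asarray(corneal_reflection, float)

def calibrate(vectors, screen_points):
    """Fit an affine map from gaze vectors to on-screen coordinates by
    least squares, using targets the user fixated during calibration."""
    v = np.asarray(vectors, float)
    A = np.hstack([v, np.ones((len(v), 1))])   # rows of [vx, vy, 1]
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points, float), rcond=None)
    return coeffs                              # 3x2 affine matrix

def gaze_point(vector, coeffs):
    """Map a new gaze vector to an estimated point of regard on screen."""
    return np.append(vector, 1.0) @ coeffs
```

In use, the headset would fit `calibrate` once against a handful of known fixation targets, then call `gaze_point` every frame; the stream of estimated points is exactly the data that can be replayed to measure how long a pilot took to fixate a peripheral event.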
Taking eye tracking out of the aerospace environment and into the Metaverse, the same information can give game developers ways to help you improve your gaming ability. By tracking where you are looking during a game, the software can adjust where you are throwing or shooting to align the shot more accurately. But eye tracking also gives clues to emotion and reaction in various situations, the kind of data that can help data collectors build a more accurate model of you in the Metaverse. By model we don’t mean your avatar, but things like your level of excitement when viewing a new smartphone or piece of clothing, data that helps them ‘improve the user experience’ or, in real terms, produce a better selling environment.
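One way a game engine might use gaze data to align a shot is a simple linear blend between where the player is aiming and where they are looking. This is an illustrative sketch only; the function name and the `strength` tuning parameter are assumptions, not any engine's actual aim-assist algorithm.

```python
def assisted_aim(raw_aim, gaze_point, strength=0.3):
    """Nudge the player's raw aim toward the tracked gaze point.

    raw_aim, gaze_point: (x, y) positions in screen coordinates.
    strength: hypothetical blend factor in [0, 1]; 0 leaves the aim
    untouched, 1 snaps it fully to the gaze point.
    """
    ax, ay = raw_aim
    gx, gy = gaze_point
    return (ax + strength * (gx - ax), ay + strength * (gy - ay))
```

A shooter would apply this per frame just before resolving the shot; keeping `strength` small preserves the feel of manual aiming while quietly correcting toward the gaze target.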
This is just one small aspect of why online data collectors like Facebook (FB) and Google (GOOG) are excited about and promoting the Metaverse. Increasing the amount of information a user generates also increases the value of the data, and while much will be said about selling virtual real estate and other virtual items that don’t exist in the real world, the game remains the same as on the two-dimensional internet: collect more data and sell it to folks so they can sell more stuff, whether it’s virtual or physical.