Pixel Perception Problems
- The 10x rule says to multiply the pixel pitch (in millimeters) by 10 to get the ‘approximate viewing distance’ in feet.
- The Visual Acuity rule says pixel pitch times 3,438 gives the viewing distance in millimeters, but it assumes 20/20 vision (both rules are worked through in the sketch after this list).
- The Average Comfortable Viewing Distance is a subjective measure that takes into account a number of variables, such as eyesight, content type, and content resolution.
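To make the first two rules concrete, here is a minimal Python sketch; the 2.5 mm pitch is a hypothetical LED-wall value chosen purely for illustration, not a figure from the text above.

```python
def ten_x_rule_ft(pixel_pitch_mm: float) -> float:
    """10x rule: approximate viewing distance in feet is pixel pitch (mm) x 10."""
    return pixel_pitch_mm * 10


def visual_acuity_distance_mm(pixel_pitch_mm: float) -> float:
    """Visual Acuity rule: distance (mm) at which a 20/20 eye can no longer
    resolve individual pixels; 3438 ~= 1 / tan(1 arcminute)."""
    return pixel_pitch_mm * 3438


# Example with a hypothetical 2.5 mm pitch LED wall (illustrative value only)
pitch_mm = 2.5
print(f"10x rule:           ~{ten_x_rule_ft(pitch_mm):.0f} ft")
print(f"Visual Acuity rule: ~{visual_acuity_distance_mm(pitch_mm):.0f} mm "
      f"(~{visual_acuity_distance_mm(pitch_mm) / 304.8:.0f} ft)")
```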
But what about VR/AR displays? Apply any of those rules to a display that sits essentially 0” from your eye and the pixel pitch would have to be microscopic. Running the calculation for a VR headset like the HTC Vive Flow gives a pixel pitch of roughly 0.01mm (~10µm), or about 2,262 pixels per inch, which works out to roughly 5.12 million pixels for a 1” display. That’s a lot of pixels in a very small space, which makes fabricating such displays far more complex than producing the displays used in most CE devices. Yet pixel pitch and pixel density are only two of the many factors that determine whether a VR headset can accurately portray a realistic image, and a deficiency in almost any of them can cause the motion sickness and fatigue problems that plague many potential VR users.
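As a rough sanity check on those numbers, the arithmetic can be reproduced in a few lines of Python; the 1-inch-square active area is an assumption made purely for illustration, and the rounded results will differ slightly from the figures quoted above.

```python
# Back-of-the-envelope check of the headset numbers above.
# Assumes a square, 1-inch-per-side active area purely for illustration.

def pitch_mm_from_ppi(ppi: float) -> float:
    """Pixel pitch in millimetres for a given pixel density (pixels per inch)."""
    return 25.4 / ppi


def pixels_per_square_inch(ppi: float) -> float:
    """Total pixel count in one square inch at the given density."""
    return ppi ** 2


ppi = 2262
print(f"pixel pitch: {pitch_mm_from_ppi(ppi):.4f} mm (~{pitch_mm_from_ppi(ppi) * 1000:.0f} um)")
print(f"pixel count: {pixels_per_square_inch(ppi) / 1e6:.2f} million per square inch")
```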
Much of the research on the root causes of VR-induced motion sickness comes from US military studies that looked for the reasons behind the motion sickness affecting many who used flight simulators, particularly those for helicopters. Given that it cost taxpayers $1.1m in 2004 ($1.62m in today’s dollars) to train a pilot, and that simulator training costs roughly 1/40th as much as time in a live aircraft, there was considerable incentive to determine the cause of the problems many trainees faced during simulator training. Medications were studied with some minor success, but all had side effects that would prevent their use in such situations.
In order for one to perceive motion, two visual systems must work together. The first, the ambient system, detects large objects and visual flow in the user’s periphery. The flow of information from this system increases with velocity, detail, and proximity to the ground, so flying above the clouds in VR provides less information to the ambient visual system than running across a VR field would. This behooves the Metaverse designer to maximize detail in order to give the ambient visual system enough data to understand the environment.
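To make that relationship concrete, here is a toy optic-flow sketch (not how any particular engine computes it): for a level observer translating forward, the angular flow rate of a ground point scales with speed divided by eye height, so the high-altitude case feeds far less information to the ambient system. The speeds, heights, and the ground_flow_rate helper itself are illustrative assumptions.

```python
from math import radians, sin

def ground_flow_rate(speed_mps: float, eye_height_m: float, depression_deg: float) -> float:
    """Angular optic-flow rate (rad/s) of a ground point seen by a level observer
    translating forward: flow = (v / h) * sin(theta)^2, where theta is the
    depression angle of the point below the horizon (toy pinhole model)."""
    theta = radians(depression_deg)
    return (speed_mps / eye_height_m) * sin(theta) ** 2


# Running across a field vs. cruising above the clouds (illustrative numbers)
print(ground_flow_rate(speed_mps=4.0, eye_height_m=1.7, depression_deg=45))      # ~1.18 rad/s
print(ground_flow_rate(speed_mps=60.0, eye_height_m=3000.0, depression_deg=45))  # ~0.01 rad/s
```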
The second is the focal system, which handles fine detail, size, shape consistency, and perspective. Both systems must provide information that is consistent with what the user’s brain considers ‘normal’; when the two disagree with each other or with those norms, problems occur, and they are not limited to people who have experienced motion sickness on conventional transportation. Typical motion sickness, with ~30% of study participants experiencing seasickness in moderate seas and as much as 90% in rough seas, shows little correlation with VR issues. For physically caused motion sickness, at least, continued exposure to the stimuli that caused the distress lessens the symptoms and their severity over time, although when Navy crewmen are transferred from one type of ship to another, many show motion sickness problems until they adapt to the new environment.
One salient point that also appears in the data is that motion sickness of any kind decreases with age, which is counterintuitive for the Gen Z set, who like to say, “I grew up with video games, VR won’t make me sick.” Data collected during WWII showed that soldiers aged 17 to 19 reported seasickness at a 31% rate, while for those between 30 and 40 the rate dropped to 13%. That same study also claimed that motion sickness of any kind is ‘very rare’ beyond age 50, which, as anyone who has gone deep-sea fishing on a choppy, cloudy day knows, is not correct.
While most VR brands focus on optics, another part of perception also factors into motion sickness and fatigue in the VR space: the vestibular system, a set of canals and organs in the inner ear. The canals give the brain information about angular velocity, particularly the rate of change, acting like the bubble on a level, while other inner-ear organs measure the force of gravity and linear acceleration. Together they form a system for maintaining balance and equilibrium, and while those born without these organs can compensate with other senses, they do not experience motion sickness at all. Because these organs depend on gravity for orientation, they work well in a normal environment but are easily confused by rapid acceleration and zero-gravity conditions, giving meaning to the name ‘Vomit Comet’ applied to aircraft that create short zero-gravity training windows during rapid directional changes. VR environments that have no gravity reference present conflicting information to a vestibular system that is perceiving normal gravity and little or no acceleration.
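A minimal sketch of that conflict, assuming a seated user who is physically stationary: the rendered camera motion is compared against what the canals and inner-ear organs would actually report (no rotation, gravity only). The vection_conflict helper and its thresholds are hypothetical illustrations, not published comfort limits.

```python
from math import hypot

def vection_conflict(render_ang_vel, render_lin_acc,
                     ang_thresh_rad_s=0.35, acc_thresh_ms2=0.6):
    """Toy sensory-conflict check for a seated, physically stationary VR user.

    A stationary user's canals report ~zero angular velocity and the inner-ear
    organs report gravity only, so any rotation or linear acceleration that
    exists only in the rendered world is 'unexplained' motion.  Thresholds are
    illustrative guesses, not published limits."""
    ang_conflict = hypot(*render_ang_vel)   # rad/s of rotation the canals never felt
    acc_conflict = hypot(*render_lin_acc)   # m/s^2 of acceleration the inner ear never felt
    return ang_conflict > ang_thresh_rad_s or acc_conflict > acc_thresh_ms2


# A constant-velocity glide generates no conflict for a stationary user ...
print(vection_conflict((0.0, 0.0, 0.0), (0.0, 0.0, 0.0)))   # False
# ... while a scripted barrel roll the body never felt does.
print(vection_conflict((1.5, 0.0, 0.0), (0.0, 0.0, 2.0)))   # True
```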
All in all, there are many factors that need to be addressed before we become a society that can live in virtual worlds for extended periods. We have barely touched on other issues facing micro-displays, including Field of View, Motion Tracking, and Resolution, all of which must improve to make the virtual world a ‘reality’. Of course, the companies involved will spin a story that the Metaverse is already here, but we still have a lot of ground to cover before you can slip on a VR headset and spend hours floating through the rings of Saturn or playing golf with Phil Mickelson on a course you designed last weekend on an island off the coast of Australia.