User comfort

Unlike typical games and apps, VR apps need to treat user comfort as a metric to optimize for. Dizziness, motion sickness, eye strain, headaches, and even physical injuries from loss of balance have unfortunately been all too common for early VR adopters, and the onus is on us to limit these negative effects for users. In essence, content is just as important to user comfort as the hardware is, and we need to take the matter seriously if we are building for the medium.

Not everyone experiences these issues, and a lucky few have experienced none of them; however, the overwhelming majority of users have reported these problems at one point or another. Also, just because our game doesn't trigger these problems in us when we're testing it doesn't mean it won't trigger them in someone else. In fact, we will be the most biased test subject for our own game due to familiarity. Without realizing it, we might start to predict our way around the most nauseating behavior our app generates, making it an unfair test compared to a new user experiencing the same situation. This, unfortunately, raises the cost of VR app development further: a lot of testing with different unbiased individuals is required to figure out whether our experience will cause discomfort, and it may be needed each time we make significant changes that affect motion and frame rate.

There are several things that users can do to improve their VR comfort, such as starting with short sessions and working their way up, training their balance and their brain to expect the mismatched motion. A more drastic option is to take motion sickness medication or drink a little ginger tea beforehand to settle the stomach. However, we will hardly convince users to try our app if we promise it will only take a few sessions of motion sickness before it starts to become enjoyable.

There are three main kinds of discomfort that users can experience in VR:

  • Motion sickness: Nausea caused by motion sickness typically happens when there is a sensory disconnect between where the user's eyes think the horizon is and what their other senses, such as the inner ear's sense of balance, are telling their brain.
  • Eye strain: Eye strain comes from staring at a screen mere inches from the eyes, which tends to lead to fatigue and, ultimately, headaches after prolonged use.
  • Disorientation: A user in VR is often standing within a confined space, so if a game features any kind of acceleration-based motion, the user will instinctively try to offset that acceleration by adjusting their balance. Unless we are careful to ensure that the user experiences smooth and predictable motion, this can lead to disorientation, falling over, and injury.
Note that the term acceleration is used intentionally: acceleration is a vector, meaning it has both magnitude and direction. Any kind of acceleration can cause disorientation, including not only accelerating forward, backward, and sideways, but also rotational acceleration (turning around), falling, jumping, and so on.

Another potential problem for VR apps is the possibility of triggering seizures. VR is in the unique position of being able to blast images into the user's eyes at close range, which creates a risk of unintentionally triggering seizures in vulnerable users if rendering breaks down and starts flickering. This is something we need to keep in mind throughout development, and it must be tested for and fixed sooner rather than later.

Perhaps the most important performance metric for a VR app is a high frame rate, measured in frames per second (FPS), preferably 90 FPS or more. This generates a smooth viewing experience, since there will be only a tiny disconnect between the user's head motion and the motion of the world. Any period of extended frame drops, or an FPS value consistently below this target, is likely to cause a lot of problems for our users, making it critical that our application performs well at all times. We should also be very careful about how we control the user's viewpoint. We should avoid changing an HMD's field of view ourselves (let the user dictate the direction they are facing), generating acceleration over long periods, or causing uncontrolled world rotation and horizon motion, since these are extremely likely to trigger motion sickness and balance problems for the user.
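The 90 FPS target translates directly into a per-frame time budget. As a minimal sketch (not any engine's actual API, and with invented sample data), we can compare measured frame times against that budget to count dropped frames:

```python
# Hypothetical sketch: comparing frame times against a 90 FPS budget.
# The frame_times sample values are illustrative assumptions.

TARGET_FPS = 90
FRAME_BUDGET_MS = 1000.0 / TARGET_FPS  # ~11.1 ms available per frame

def dropped_frames(frame_times_ms):
    """Count frames whose render time exceeded the per-frame budget."""
    return sum(1 for t in frame_times_ms if t > FRAME_BUDGET_MS)

frame_times = [10.9, 11.0, 16.7, 11.1, 22.2]  # milliseconds per frame
drops = dropped_frames(frame_times)  # 2 frames exceeded the ~11.1 ms budget
```

Any sustained run of such over-budget frames is exactly the situation that triggers discomfort, which is why this metric deserves continuous monitoring rather than a one-off check.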

A strict rule that is not up for debate is that we should never apply any kind of gain, multiplier effect, or acceleration effect to the positional tracking of an HMD in the final build of our product. Doing so for the sake of testing is fine, but if a real user moves their head two inches to the side, then it should feel like it moved the same relative distance inside the application and should stop the moment their head stops. Doing otherwise is not only going to cause a disconnect between where the player's head feels like it should be and where it is, but may also cause some serious discomfort if the camera becomes offset with respect to the player's orientation and the angle of their neck.
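To make the 1:1 rule concrete, here is an illustrative sketch; the function and its `gain` parameter are invented names, with `gain` included only to mark the anti-pattern described above:

```python
# Hypothetical sketch of the 1:1 positional-tracking rule: the virtual
# camera offset must equal the tracked head offset exactly. The gain
# parameter exists here only to illustrate the anti-pattern; it must
# remain 1.0 in a shipping build.

def camera_position(rig_origin, head_offset, gain=1.0):
    """Virtual camera = rig origin + tracked head offset, with no gain."""
    return tuple(o + gain * h for o, h in zip(rig_origin, head_offset))

# The head moves 5 cm to the right; the camera must move exactly 5 cm.
cam = camera_position((0.0, 1.6, 0.0), (0.05, 0.0, 0.0))
# cam == (0.05, 1.6, 0.0)
```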

It is possible to use acceleration for the motion of the player character, but it should be incredibly brief, ending before the user begins adjusting their balance to compensate. It would be wisest to stick to motion that relies on constant velocities and/or teleportation.

Placing banked turns in racing games seems to improve user comfort a great deal since the user naturally tilts their head and adjusts their balance to match the turn.

All of the previous rules apply just as well to 360 video content as they do to VR games. Frankly, an embarrassing number of 360 videos have been released to the market that do not take the aforementioned points into account; they feature too many jerky movements, a lack of camera stabilization, manual viewport rotation, and so on. These hacks are often used to ensure the user is facing in the direction we intend, but we must spend more effort on achieving this without such hacks to avoid nausea-inducing behavior. Humans are naturally very curious about things that move: if they notice something moving in the corner of their eye, they will most likely turn to face it. This can be used to great effect to keep the user facing in the intended direction as they watch the video.

Laziness is not the way to go when generating VR content. Don't just slap a 360 camera on top of a dirt rally car and hack an unexpected camera rotation into the video to keep the action in the center. The motion needs to be smooth and predictable. During production, we need to constantly keep in mind where we expect the user to be looking so that we capture action shots correctly.

Fortunately, for the 360 video format, it seems as though industry-standard frame rates, such as 24 FPS or 29.97 FPS, do not have a disastrous effect on user comfort; note, however, that this frame rate applies to video playback only. Our rendering FPS is a separate value that dictates how smooth positional head tracking will be, and it must always remain very high (ideally, 90 FPS) to avoid discomfort.
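The decoupling of the two frame rates can be sketched as follows; the function name and rates are illustrative assumptions, not a real playback API:

```python
# Hypothetical sketch: the render loop runs at 90 FPS while the 360 video
# advances at its own 24 FPS cadence. The two rates are independent.

RENDER_FPS = 90
VIDEO_FPS = 24

def video_frame_index(elapsed_s):
    """Which source video frame to sample at a given playback time."""
    return int(elapsed_s * VIDEO_FPS)

# Over one second we render 90 frames but sample only 24 distinct
# video frames; head tracking stays smooth regardless of video cadence.
indices = [video_frame_index(i / RENDER_FPS) for i in range(RENDER_FPS)]
distinct = len(set(indices))  # 24
```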

Other problems arise when building VR apps: different HMDs and controllers support different inputs and behaviors, making feature parity across VR platforms difficult. A problem called stereo fighting can occur if we try to merge 2D and 3D content together, where 2D objects appear to be rendering deep inside 3D objects because the eyes can't resolve the distance correctly. This is typically a big problem for the user interface of VR applications and 360 video playback, which tends to be a series of flat panels superimposed over a 3D background. Stereo fighting does not usually lead to nausea, but it can cause additional eye strain.

Although the effects of discomfort are not quite as pronounced on AR platforms, they are still important not to ignore. Since AR apps tend to consume a lot of resources, low frame rates can cause some discomfort. This is especially true if an AR app superimposes objects onto a camera image (as the majority of them do), where there will probably be a disconnect between the frame rate of the background camera image and that of the objects we're superimposing over it. We should try to synchronize these frame rates to limit that disconnect.
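One simple way to sketch that synchronization is to re-pose the overlay only when a new camera frame arrives, so the two layers step forward together; the class and its names here are invented for illustration:

```python
# Hypothetical sketch of syncing the superimposed layer to the camera feed:
# the overlay updates only when a new camera frame (new timestamp) arrives.

class OverlaySync:
    def __init__(self):
        self.last_ts = -1.0
        self.updates = 0

    def on_render(self, camera_frame_ts):
        # Skip redundant overlay updates between camera frames.
        if camera_frame_ts > self.last_ts:
            self.last_ts = camera_frame_ts
            self.updates += 1

sync = OverlaySync()
# Six render frames, but the ~30 FPS camera only delivered three frames.
for ts in [0.0, 0.0, 0.033, 0.033, 0.033, 0.066]:
    sync.on_render(ts)
# sync.updates == 3
```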
