Playing VR is fun, but many people come away from it nauseous and dizzy, sometimes to the point of vomiting. Today VR Jun teaches you how to play VR safely and keep motion sickness at bay!

As the VR industry gradually comes into public view, the VR boom is still in its early stages. Looking at some of the leading companies in China's VR field, domestic VR hardware is by no means inferior to international standards; compared with the "international benchmark" Oculus, it is basically on par. That is undoubtedly good news for ordinary domestic users: domestic VR teams offer a high performance-to-price ratio while keeping their hardware technically up to standard.

However, even the most cutting-edge VR products still share one fatal flaw: they induce a strong sense of dizziness. Many VR users report that after using a VR product for a while they experience discomfort, nausea, and even vomiting. This has become the biggest stumbling block in the development of VR, and solving the dizziness problem has become a pressing need.

First, why does VR cause dizziness?
The principle is actually very simple: the (virtual) picture seen by the eyes does not match the (real position) information reported by the inner ear, which increases the burden on the brain and produces dizziness.
There are two main aspects to this problem:
On the one hand, it is somewhat similar to the cause of 3D movie dizziness. The reason 3D effects make some viewers dizzy is that, in addition to a large number of in-focus images, 3D movies contain many out-of-focus elements.
The picture is full of such elements, like the blurred leaves in the depth of field. When watching a 3D movie, viewers must ignore these out-of-focus elements, but the eyes keep trying, and repeatedly failing, to bring them into focus. This repeated focusing failure easily produces 3D dizziness, much as snow blindness arises when people stare at snow-capped mountains for a long time with nothing for the eyes to focus on.

In simple terms, the picture is too realistic and gives you a feeling of immersion. The body believes you are doing intense exercise or are truly inside the scene, yet you are actually sitting still in a seat; the mismatch triggers an instinct for self-protection. Most people could reduce the dizziness by actually moving, but in practice it is impossible to move the corresponding distance.
On the other hand, the latency of the VR hardware means the picture is not synchronized with the motion: when a person rotates their head or moves, the rendering of the screen cannot keep up. On a full-field-of-view display like a VR headset, this latency is the biggest cause of dizziness, so reducing latency is the main means of weakening VR vertigo at the moment.

Second, latency: VR's battleground for reducing dizziness
An important indicator when choosing a virtual reality device is the delay from rotating your head to the picture rotating accordingly.
The picture delay depends to a large extent on the refresh rate of the display. The world's most advanced virtual reality devices have a refresh rate of 75Hz, including the famous Oculus DK2 and Shanghai Lexiang Technology's DeePoon headset.
At 75Hz, from the moment rendering is completed to the moment the frame appears on screen takes at least 1 second divided by 75, i.e. about 13.3 milliseconds per frame. Adding the time reserved as a safety margin, the total is generally a 19.3ms delay. Therefore any marketing that claims a delay below 19.3ms is false advertising.
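The per-frame figure follows directly from the refresh period; a two-line sketch makes the arithmetic for 75Hz (and, for comparison, the 90Hz of later headsets) explicit:

```python
# Minimum time one frame occupies on screen, given the display refresh rate.
def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

print(round(frame_time_ms(75), 1))   # 13.3 ms per frame at 75 Hz
print(round(frame_time_ms(90), 1))   # 11.1 ms per frame at 90 Hz
```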
So what makes up this 19.3ms delay?

(How the 19.3ms delay is built up)
1. First, reading data from the head-tracking sensor takes about 1ms. A world-class sensor has a sampling rate of 1KHz, meaning it can produce a thousand readings per second, so each reading takes 1ms; that contributes 1ms of delay.
2. Next, the data must be transmitted to the computer via a microcontroller. Because the interfaces differ, much as an air conditioner's power plug cannot fit into a desk lamp's socket, some conversion work is required, and the microcontroller handles it. Getting the data from the sensor to the microcontroller takes about 1ms. Since generating each reading also takes 1ms, any reading not transferred to the microcontroller within 1ms is discarded in favor of newer data.

3. The next step is to transfer the data to the PC over a USB cable. USB has a very high transfer rate, but it is entirely controlled by the Host (that is, the PC); if the Host does not fetch the data from the MCU, the data is discarded. In HID mode, the Host continually polls for pending data and stores it in memory, so this step also takes under 1ms. At this point the data has reached the PC's memory and the hardware portion of the journey is complete. Due to data bandwidth, protocol overhead, and other limitations, the hardware path takes 3ms to 4ms in total, and this is difficult to reduce.

4. After the hardware transfer comes the software's algorithmic processing.
Because the underlying analog signal contains noise and drift, the digitized data does too, so sophisticated digital signal processing is needed to filter it out. In this way, the sensor's 9-axis data becomes the quaternion rotation data needed to render the head's orientation in the game. Processing this data generally takes under 1ms. At render time, the rotation quaternion is simply applied to the camera's coordinates to obtain the viewing direction, which is then used to render the scene. With a special algorithm (such as time warp, currently the fastest available), the actually displayed image is corrected using the most recently processed orientation data. Thanks to time warp, we can essentially ignore the delay of rendering the scene itself.
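As a sketch of the "apply the rotation quaternion to the camera" step, here is a minimal pure-Python version. The function names and the 90-degree example are illustrative, not the actual SDK code, and the -z "forward" axis is just a common graphics convention:

```python
import math

def quat_mul(a, b):
    # Hamilton product of two quaternions stored as (w, x, y, z).
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def rotate(q, v):
    # Rotate vector v by unit quaternion q: v' = q * v * q^-1.
    qv = (0.0, v[0], v[1], v[2])
    qc = (q[0], -q[1], -q[2], -q[3])   # conjugate = inverse for a unit quaternion
    _, x, y, z = quat_mul(quat_mul(q, qv), qc)
    return (x, y, z)

# Head turned 90 degrees about the vertical (y) axis.
half = math.radians(90) / 2
head_q = (math.cos(half), 0.0, math.sin(half), 0.0)
forward = (0.0, 0.0, -1.0)            # camera looks down -z before rotation
view_dir = rotate(head_q, forward)    # approximately (-1.0, 0.0, 0.0)
```

The renderer then builds its view matrix from `view_dir`; time warp repeats this rotation with a fresher quaternion just before scan-out.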
5. Once the scene is rendered, it goes through anti-distortion, anti-chromatic-aberration, and similar post-processing. These steps generally cost the GPU about 0.5ms, but for safety the budget is set at 3ms, to guarantee that the GPU has finished the anti-distortion and anti-dispersion work before the next frame must be transmitted to the display, that is, before the next vertical sync signal.
6. Then it is time to transfer the image to the display. As mentioned earlier, at 75Hz this takes 13.3ms. Is that the end? No: the display itself also needs time to show the image. Because an LCD works by using an electric field to physically rotate liquid crystals, a conventional LCD takes 15 to 28ms to respond. The latest OLED technology reduces this to microseconds.
Now let's add up these times: 3ms + 3ms + 13.3ms = 19.3ms. Of course, this is the ideal case; issues such as limited CPU performance or USB packet loss may prevent the latency from getting this low.
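Summing the stages in the breakdown above reproduces the best-case figure (labels are paraphrases of steps 1-6, not official terminology):

```python
# Best-case motion-to-photon budget from the breakdown above (milliseconds).
budget_ms = {
    "sensor + MCU + USB transfer": 3.0,        # steps 1-3, the hardware path
    "GPU post-processing safety margin": 3.0,  # step 5, anti-distortion/dispersion
    "scan-out at 75 Hz": 1000.0 / 75,          # step 6, one refresh interval
}
total = sum(budget_ms.values())
print(round(total, 1))   # 19.3 ms
```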
Both the domestic DeePoon headset and the "international benchmark" Oculus products use this algorithm, and their minimum delays are identical. Worldwide, 19.3ms is the most scientific and reliable figure for the minimum achievable delay.
Of course, Oculus hopes to push the delay even lower. From the breakdown above, the main bottleneck is the 13.3ms scan-out, which certain special techniques can cut by half or even more. But that requires the joint effort of hardware manufacturers, operating systems, and game developers.
For now, reducing latency is VR hardware's best route to reducing dizziness: every step whose time can be further compressed shortens the display delay and attacks the cause of dizziness at the hardware level. Beyond that, just as with 3D vertigo, users can acclimatize over time; only after such a period of adaptation can VR become a virtual reality product in the true sense.
Source: Lei Feng Network
Virtual reality headsets let you experience an immersive and wonderful world, but after a few minutes of use you may feel nauseous and dizzy, and the charm of the whole experience evaporates. The explanation given by computational imaging experts is that current virtual reality headsets do not simulate the natural 3D imagery of a real scene, which produces the discomfort described above.

However, computational imaging experts at Stanford University have recently developed a technology that can address this problem in head-mounted devices: The Light Field Stereoscope. By stacking two transparent LCD panels with a gap between them, it shows the user a scene that carries depth information and is closer to the real one. Because the display carries depth information, the user sees different imagery at different depths and the eyes can focus naturally, producing a hologram-like effect.
For hardware and technology providers, the key fact is that the connection between the visual system and the vestibular system is "synchronous", with essentially 0ms of delay. As you tilt or rotate your head, you see the picture tilt and rotate; when you turn around, you see the scene turn. The vestibular system senses your body's movement, and the visual system sees the corresponding picture within milliseconds.
The picture below shows the real world: from your body's movement to your eyes seeing the picture, the delay is 0ms.

In VR, the chain from your movement to what you see looks different. When you turn your head, the IMU and other sensors take about 20ms to sense the movement; the sensors take 5ms to send the data to the CPU; CPU calculation and simulation plus GPU rendering take about 30ms; and the display showing the frame until your eye sees it takes another 30ms. Altogether, 20 + 5 + 30 + 30 = 85ms, so there is a delay.
So, the time difference between "the vestibular system sensing the motion" and "the eyes seeing the corresponding image" is the delay. The accepted requirement is a delay of less than 20ms.
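Tallying the stage-by-stage figures above the same way shows how far this pipeline sits from the comfort target (stage labels are paraphrases, not official terms):

```python
# Motion-to-photon stages from the example above (milliseconds).
stages_ms = [
    ("IMU senses the head movement", 20),
    ("sensor data reaches the CPU", 5),
    ("CPU simulation + GPU rendering", 30),
    ("display shows the frame to the eye", 30),
]
total_ms = sum(t for _, t in stages_ms)
print(total_ms)              # 85 ms
print(total_ms < 20)         # False: far above the 20 ms comfort target
```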

Source: Zhihu, author Hu Xiaoer 2.0