Positional tracking
Positional tracking detects the precise position of head-mounted displays, controllers, other objects, or body parts within Euclidean space. It registers the exact pose by recognizing rotation (pitch, yaw, and roll) and recording translational movement. Since virtual reality is about emulating and altering reality, it is important to track accurately how objects such as the head or the hands move in real life in order to represent them inside VR. The position and orientation of a real object in space are determined with the help of special sensors or markers. The sensors record the signal from the real object when it moves or is moved and transmit the received information to the computer.[citation needed]
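The output of positional tracking is usually a six-degrees-of-freedom (6DoF) pose: three translational coordinates plus the three rotation angles named above. A minimal sketch of such a pose record (the field names and axis conventions are illustrative, not taken from any particular API):

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """A 6DoF pose: translation in metres plus rotation in radians."""
    x: float = 0.0      # position along the lateral axis
    y: float = 0.0      # position along the vertical axis
    z: float = 0.0      # position along the depth axis
    pitch: float = 0.0  # rotation about the lateral axis
    yaw: float = 0.0    # rotation about the vertical axis
    roll: float = 0.0   # rotation about the depth axis
```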
Wireless tracking
Wireless tracking uses a set of anchors that are placed around the perimeter of the tracking space and one or more tags that are tracked. The system is similar in concept to GPS but works both indoors and outdoors, and is sometimes referred to as indoor GPS. The tags trilaterate their 3D position from range measurements to the anchors placed around the perimeter. A wireless technology called ultra-wideband (UWB) has enabled position tracking to reach a precision of under 100 mm. By using sensor fusion and high-speed algorithms, the tracking precision can reach the 5 mm level with update rates of 200 Hz, or 5 ms latency.[1][2][3]
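As a minimal sketch of the underlying geometry, a tag's position can be recovered by least squares from its measured distances to anchors at known locations (the function and anchor layout below are illustrative and assume idealized, noise-free ranges):

```python
import numpy as np

def trilaterate(anchors: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """Estimate a tag's 3D position from distances to known anchors.

    anchors: (n, 3) array of anchor coordinates, n >= 4, non-coplanar
    ranges:  (n,) array of measured tag-to-anchor distances
    Subtracting the first sphere equation |x - p_i|^2 = r_i^2 from the
    others linearizes the problem, which least squares then solves.
    """
    p0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - p0)
    b = (r0 ** 2 - ranges[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(p0 ** 2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Four anchors at mixed heights (non-coplanar, so all axes are observable):
anchors = np.array([[0, 0, 0], [5, 0, 3], [5, 5, 0], [0, 5, 3]], dtype=float)
tag = np.array([2.5, 2.5, 1.0])
ranges = np.linalg.norm(anchors - tag, axis=1)
print(trilaterate(anchors, ranges))  # ≈ [2.5, 2.5, 1.0]
```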
Optical tracking
Optical methods use computer vision algorithms together with tracking devices such as visible-light or infrared cameras, stereo cameras, and depth cameras. Optical tracking is based on the same principle as stereoscopic human vision. When a person looks at an object with binocular vision, they can estimate approximately how far away it is. However, it is not enough simply to install a pair of cameras to simulate human stereoscopic vision: the cameras have to determine the distance to the object and its position in space, so they must be calibrated (a depth-from-disparity sketch follows the list below). Infants learn to calibrate their vision when they reach for objects, correlating the location of the object with the position of the outstretched hand. Optical systems are reliable and relatively inexpensive, but they can be difficult to calibrate. Furthermore, the system requires a direct line of sight without occlusions; otherwise it receives incorrect data. There are two approaches:
- Inside-out tracking: the camera is mounted on the tracked device and the infrared markers are placed in stationary locations. This technology is used in Project Tango (SLAM).[4]
- Outside-in tracking: the camera (or several cameras) is placed in a stationary location and the infrared markers are placed on the tracked device. The outside-in approach relies on an external observer (the camera) that determines the position of a moving object from its characteristic points. This technology is usually used in high-end VR systems.
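Returning to the stereo principle described above: for a calibrated, rectified camera pair, calibration boils down to knowing the focal length and the baseline between the cameras, and depth then follows from the disparity of a feature between the two views. A minimal sketch (all numbers illustrative):

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a point seen by a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("the point must be visible in both cameras")
    return focal_px * baseline_m / disparity_px

# A feature shifted 100 px between the views of a rig with a 1000 px
# focal length and a 6 cm baseline lies 1000 * 0.06 / 100 = 0.6 m away.
print(stereo_depth(1000.0, 0.06, 100.0))  # 0.6
```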
Tracking with markers
In this method, a target is fitted with markers which form a known pattern. Infrared light sources (active or passive) and visible markers such as QR codes or circular patterns typically serve as markers for optical tracking. A camera, or multiple cameras, constantly searches for the markers and then uses various algorithms (for example, the POSIT algorithm) to extract the position of the object from them. Such algorithms also have to contend with missing data when one or more of the markers is outside the camera's view or temporarily obstructed. Markers can be active or passive. Active markers are typically infrared lights that flash periodically or glow continuously; by synchronizing the time they are on with the camera, it is easier to block out other IR light in the tracking area. Passive markers are retroreflectors, which reflect the IR light back towards the source almost without scattering.[5]
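As a hedged illustration of this step, OpenCV's solvePnP, a solver closely related to POSIT, recovers a target's pose from a known marker pattern and the pixel positions where the markers were detected (the marker layout, pixel coordinates, and camera intrinsics below are all illustrative):

```python
import numpy as np
import cv2  # OpenCV; solvePnP is a standard pose-from-points solver

# Known marker pattern in the target's own frame (metres); the square
# layout is illustrative.
object_points = np.array([[-0.05, -0.05, 0.0],
                          [ 0.05, -0.05, 0.0],
                          [ 0.05,  0.05, 0.0],
                          [-0.05,  0.05, 0.0]])

# Pixel positions where the camera detected those markers; in a real
# system these come from blob or QR detection, not hand-typed values.
image_points = np.array([[310.0, 250.0],
                         [330.0, 250.0],
                         [330.0, 270.0],
                         [310.0, 270.0]])

# Intrinsics from camera calibration (focal length and principal point
# in pixels; the numbers are illustrative).
camera_matrix = np.array([[800.0,   0.0, 320.0],
                          [  0.0, 800.0, 240.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.zeros(5)  # assume negligible lens distortion

ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
if ok:
    print("rotation (Rodrigues vector):", rvec.ravel())
    print("translation (metres):", tvec.ravel())
```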
Markerless tracking
If the geometric characteristics of the target are known (for instance, from a CAD model), markerless tracking can be performed by continuously searching the image and comparing it with the known 3D model.[citation needed]
Inertial tracking
Inertial tracking uses data from accelerometers and gyroscopes. Accelerometers measure linear acceleration. Since the derivative of position with respect to time is velocity and the derivative of velocity is acceleration, the output of the accelerometer can be integrated to find velocity and then integrated again to find position relative to some initial point. Gyroscopes measure angular velocity, which can likewise be integrated to determine angular orientation relative to the initial point. Modern inertial measurement units (IMUs) are based on MEMS technology, which makes it possible to track orientation (roll, pitch, yaw) with high update rates and minimal latency. However, it is hard to rely on inertial tracking alone to determine precise position, because dead reckoning leads to drift (as the sketch below illustrates), so inertial tracking is not used in isolation in virtual reality.
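A minimal sketch of the double integration described above, which also shows why drift is unavoidable: any constant sensor bias grows quadratically in the position estimate (all values illustrative):

```python
import numpy as np

def dead_reckon(accel: np.ndarray, dt: float) -> np.ndarray:
    """Doubly integrate world-frame, gravity-compensated accelerations.

    Returns the position at each time step relative to the start.
    """
    velocity = np.cumsum(accel, axis=0) * dt    # acceleration -> velocity
    return np.cumsum(velocity, axis=0) * dt     # velocity -> position

# A stationary sensor with a tiny 0.01 m/s^2 bias on one axis "moves"
# about 0.5 * b * t^2 = 0.5 m after only 10 s: that is drift.
dt = 0.01
biased = np.full((1000, 3), [0.01, 0.0, 0.0])
print(dead_reckon(biased, dt)[-1])  # ≈ [0.5, 0.0, 0.0]
```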
Sensor fusion
Sensor fusion combines data from several tracking algorithms and can yield better output than any single technology alone. One variant of sensor fusion is to merge inertial and optical tracking: optical tracking serves as the main method, but when an occlusion occurs, inertial tracking estimates the position until the object is visible to the optical camera again. Inertial tracking can also generate position data between optical tracking samples, because it has a higher update rate. Optical tracking, in turn, helps to correct the drift of inertial tracking.
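Production systems typically use a Kalman filter for this kind of fusion (see, e.g., Hol et al. in the bibliography); the sketch below shows the idea with the simplest possible complementary filter, run once per IMU sample (the blending factor and calling convention are illustrative):

```python
def fuse_step(inertial_pos, optical_pos, optical_valid, alpha=0.98):
    """One complementary-filter step, run at the (faster) IMU rate.

    While markers are visible, the absolute optical measurement slowly
    pulls the estimate back, cancelling inertial drift; during an
    occlusion, the inertial estimate carries the tracking alone.
    """
    if not optical_valid:  # markers occluded or out of view
        return inertial_pos
    return [alpha * i + (1.0 - alpha) * o
            for i, o in zip(inertial_pos, optical_pos)]
```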
Acoustic tracking
Acoustic tracking mimics the positioning and orientation systems found in nature, such as the ultrasound echolocation that bats and dolphins use to navigate. An acoustic tracking system measures the time it takes a particular acoustic signal to reach the receiver. There are two ways to determine the position of the object: by measuring the time of flight of the sound wave from the transmitter to the receivers, or by measuring the phase coherence of the sinusoidal sound wave between transmission and reception. The main problem with the phase-coherence approach is that it does not give the absolute position of the object, only periodic changes in its location over separate periods of time.
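A minimal sketch of the time-of-flight calculation (the 343 m/s figure assumes dry air at about 20 °C, which is one reason accuracy varies with the conditions listed below):

```python
SPEED_OF_SOUND = 343.0  # m/s in dry air at ~20 degrees Celsius

def tof_range(time_of_flight: float) -> float:
    """Emitter-to-receiver distance from one time-of-flight measurement."""
    return SPEED_OF_SOUND * time_of_flight

# Ranges to three or more receivers at known positions then give the
# emitter's 3D position by trilateration, as in the wireless sketch above.
print(tof_range(0.0029))  # ≈ 0.99 m
```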
Pros:
- Accurate measurement of coordinates and angles
- Working areas of almost any size and shape can be built
Cons:
- Requires a direct line of sight between emitters and receivers
- Low speed of ultrasound, which can add latency if the emitters are moving
- Low update rates, unless sensor fusion is used to augment the ultrasound measurements
- Decreased accuracy due to changes in temperature, atmospheric pressure, and humidity[6]
Magnetic tracking
Magnetic tracking (or electromagnetic tracking) is based on the same principle as a theremin. It relies on measuring the intensity of an inhomogeneous magnetic field with electromagnetic sensors. A base station, often referred to as the system's transmitter or field generator, generates an alternating or static electromagnetic field, depending on the system's architecture.
To cover all directions in three-dimensional space, three magnetic fields are generated sequentially by three electromagnetic coils which are perpendicular to each other. These coils are put in a small housing mounted on the moving target whose position must be tracked. Current sequentially passing through the coils turns them into electromagnets, which makes it possible to determine their position and orientation in space. Magnetic tracking has been implemented by Polhemus and in the Razer Hydra by Sixense.[7] The system works poorly near any electrically conductive material, such as metal objects and devices, that can affect the electromagnetic field. The scalable area is limited and cannot be bigger than 5 meters.
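As a rough, heavily simplified sketch of why field intensity encodes distance: a dipole field falls off approximately with the cube of the distance, so a calibrated constant relates the measured field strength to range along one axis (real systems measure all three sequentially generated fields and solve for full position and orientation; the constant k and the readings below are illustrative):

```python
def dipole_range(field_tesla: float, k: float) -> float:
    """Rough one-axis range estimate from measured field magnitude.

    Along a dipole's axis |B| ~ k / r^3, so r = (k / |B|)^(1/3).
    k bundles the transmitter coil's strength and comes from
    calibration; this ignores orientation, which real systems recover
    by energizing the three orthogonal coils in sequence.
    """
    return (k / field_tesla) ** (1.0 / 3.0)

# With a calibrated k of 8e-6 T*m^3, a reading of 1e-6 T places the
# sensor at (8e-6 / 1e-6) ** (1/3) = 2 m from the transmitter.
print(dipole_range(1e-6, 8e-6))  # 2.0
```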
See also
- 3D pose estimation
- Finger tracking
- Augmented reality
- Head-mounted display
- Indoor positioning system
- Tracking system
- Simultaneous localization and mapping
References
- ↑ "IndoTraq". https://indotraq.com/?page_id=122.
- ↑ "Hands-On With Indotraq". VR Focus. https://www.vrfocus.com/2016/01/hands-on-with-indotraq/.
- ↑ "INDOTRAQ INDOOR TRACKING FOR VIRTUAL REALITY". ABT. https://blog.abt.com/2016/01/ces-2016-indotraq-indoor-tracking-for-virtual-reality/.
- ↑ Dieter Bohn (May 29, 2015). "Slamdance: inside the weird virtual reality of Google's Project Tango". The Verge. https://www.theverge.com/a/sundars-google/project-tango-google-io-2015. Retrieved April 10, 2017.
- ↑ Michael Mehling (February 26, 2006). "Implementation of a Low Cost Marker Based Infrared Optical Tracking System". https://publik.tuwien.ac.at/files/PubDat_210294.pdf.
- ↑ Tomasz Mazuryk and Michael Gervautz (1996). Virtual Reality History, Applications, Technology and Future. pp. 22–23. https://www.cg.tuwien.ac.at/research/publications/1996/mazuryk-1996-VRH/TR-186-2-96-06Paper.pdf.
- ↑ Tomasz Mazuryk and Michael Gervautz (1996). Virtual Reality History, Applications, Technology and Future. https://www.cg.tuwien.ac.at/research/publications/1996/mazuryk-1996-VRH/TR-186-2-96-06Paper.pdf.
Bibliography
- Jannick P. Rolland, Yohan Baillot, and Alexei A. Goon. A Survey of Tracking Technology for Virtual Environments. https://www.creol.ucf.edu/Research/Publications/1522.PDF.
- Vikas Kumar N. Integration of Inertial Navigation System and Global Positioning System Using Kalman Filtering. https://www.vikaskumar.org/ddp/vikas-ddp.pdf.
- J. D. Hol, T. B. Schon, F. Gustafsson, P. J. Slycke. Sensor Fusion for Augmented Reality. https://user.it.uu.se/~thosc112/pubpdf/holsgs2006.pdf.