VR has been a growing industry since the early 2020s, with the release of faster and more compact headsets that no longer require a powerful GPU and CPU to handle demanding software. Full-body tracking, however, remains as expensive as ever, often costing the user hundreds of dollars, though that is changing as companies look to make it more viable for the general public. The earliest example of full-body tracking for the public was the 2010 Xbox Kinect.
To achieve this, the Xbox Kinect relied on two important components: an IR depth sensor and an IR emitter. Together, these allowed the Xbox to recognize both the depth of the environment and the subject it needed to track.
The IR emitter, as its name suggests, projects a pattern of infrared light that the IR depth sensor can read. These tiny laser dots, as seen above, let the sensor perceive depth from a 2D image and generate a rough 3D model (ex. b, first picture). At the time, this technology was revolutionary; tracking like this had never been available to the general public before. However, it could only capture a limited plane of motion, its vertical tracking was unstable, and with little ongoing support the Kinect was eventually discontinued due to its unpopularity.
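To make the idea concrete, here is a minimal sketch of how a depth camera's output might be turned into a rough point cloud. It assumes a simple pinhole camera model; the depth_to_point_cloud helper and the intrinsics (fx, fy, cx, cy) are illustrative values, not the Kinect's actual calibration.

```python
import numpy as np

def depth_to_point_cloud(depth_m, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into a rough 3D point cloud
    using a pinhole camera model."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx          # horizontal offset from the optical center
    y = (v - cy) * z / fy          # vertical offset from the optical center
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no depth reading

# Example: a fake 480x640 depth frame with everything about 2 m away.
fake_depth = np.full((480, 640), 2.0)
cloud = depth_to_point_cloud(fake_depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(cloud.shape)  # (307200, 3)
```

A real Kinect pipeline would then fit a skeleton to this cloud, which is where the "rough 3D model" of the person comes from.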
However, this technology would soon see a resurgence with the advancement of the consumer virtual reality (VR) market.
CGI and movies have always shown us that full-body tracking was possible; however, their methods were incredibly difficult for a home user to replicate with basic technology.
Movie studios use a method called optical positional tracking: cameras are installed around the perimeter of the capture area, and markers (the white dots) on the performer let the cameras work out how each tracked point corresponds to the actor's movement.
The captured motion is then translated into the computer's software (as shown above), allowing CGI artists to do their jobs. Yet this setup was far too expensive and impractical to market to the public, so this type of tracking remained reserved for film studios and large software companies.
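For a rough sense of the geometry behind this, the sketch below triangulates a single marker seen by two calibrated cameras by finding the point closest to both viewing rays. The triangulate helper and the camera/marker coordinates are made up for illustration; a real mocap system solves this for dozens of markers and cameras at once.

```python
import numpy as np

def triangulate(origin_a, dir_a, origin_b, dir_b):
    """Return the midpoint of the shortest segment between two camera rays,
    i.e. the best single-point estimate of where both cameras are looking."""
    d_a = dir_a / np.linalg.norm(dir_a)
    d_b = dir_b / np.linalg.norm(dir_b)
    w = origin_a - origin_b
    a, b, c = d_a @ d_a, d_a @ d_b, d_b @ d_b
    d, e = d_a @ w, d_b @ w
    denom = a * c - b * b            # zero only if the rays are parallel
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    return (origin_a + t * d_a + origin_b + s * d_b) / 2

# Two cameras 4 m apart, both seeing a marker near (1, 1.5, 2).
# In practice the ray directions come from pixel coordinates plus calibration.
cam_a = np.array([0.0, 0.0, 0.0])
cam_b = np.array([4.0, 0.0, 0.0])
marker = np.array([1.0, 1.5, 2.0])
print(triangulate(cam_a, marker - cam_a, cam_b, marker - cam_b))
```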
From this same concept, however, a different and more accessible method was developed: lighthouse tracking. There are five core requirements that matter for virtual reality tracking systems: delay (how fast a system responds to changes), accuracy (how precise that response is), number of objects (how many objects in space the system can track), coverage area (how large an area can be tracked), and sensitivity to the environment (how easily the system is disturbed by things like external light or nearby magnetic field distortions). If all five are achieved, the system can create a convincing world of illusion; but even a slight failure in any one of these categories is sensed by our brain and can cause nausea and break the illusion of the world created around us. This is where lighthouse tracking comes in.
The lighthouse method is ingenious: it combines the methodology of optical positional tracking with the Kinect's IR emitters to achieve a level of accuracy and precision previously out of reach for an ordinary household. HTC (maker of the Vive) and Valve banded together to create one of the most remarkable compact tracking technologies to date.
The system works with two main items: the base station and the tracker. The concepts behind them are simple in nature. The base station (or "lighthouse", like the one pictured above) emits infrared signals that sweep across the room, which the trackers can detect and use to pin down the tracked object's position.
The trackers play the same role as the dots used in optical positioning, but the base station lets them be used to track your movement, acceleration, and position in 3D space. The great thing about this method is its modularity: you can increase its accuracy as much as you want by adding more trackers and more base stations for a finer grid across your body, or you can get by with the minimum needed to operate and still receive tracking results that are quite solid.
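As a rough illustration, here is how a sweep-based system might turn laser timing into a direction. This is a simplified model, not Valve's actual protocol: the 1/60 s rotor period, the assumption that the sweep sits at angle zero at the sync flash, and the sweep_angle/direction_from_station helpers are all illustrative.

```python
import math

ROTOR_PERIOD_S = 1.0 / 60.0  # assumed sweep period: one full rotation per 1/60 s

def sweep_angle(t_sync_s, t_hit_s):
    """Convert the delay between the sync flash and the laser hitting a
    photodiode into the sweep angle (radians) at which that sensor sits."""
    return 2.0 * math.pi * ((t_hit_s - t_sync_s) / ROTOR_PERIOD_S)

def direction_from_station(horiz_angle, vert_angle):
    """Turn one horizontal and one vertical sweep angle into a unit ray
    pointing from the base station toward the sensor (simplified model)."""
    x = math.tan(horiz_angle - math.pi / 2)  # assume "straight ahead" is a quarter turn
    y = math.tan(vert_angle - math.pi / 2)
    norm = math.sqrt(x * x + y * y + 1.0)
    return (x / norm, y / norm, 1.0 / norm)

# A photodiode hit 4.63 ms after sync on the horizontal sweep and
# 4.17 ms after sync on the vertical sweep (made-up timings).
h = sweep_angle(0.0, 0.00463)
v = sweep_angle(0.0, 0.00417)
print(direction_from_station(h, v))
```

With a direction like this from each of two base stations, the sensor's position can then be recovered with the same kind of ray triangulation sketched earlier.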
However, one downside is the cost of this hardware.
The base stations and headset together cost along the lines of a thousand dollars. Much of that may come down to the cost of making the base stations, as their laser technology remains expensive to manufacture.
This is where IMU trackers come in.
An IMU (inertial measurement unit) tracker can achieve the same goal: recognizing position in 3D space.
It tracks acceleration and estimates its x, y, and z position relative to an initial pose. With this, multiple trackers can be placed on a person to build a rough sample of their posture in 3D. Companies such as SlimeVR and Tundra have devoted themselves to creating more affordable options, and even the beta versions of these trackers carry price tags drastically lower than those of the big companies.
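To see why IMU tracking is cheap but prone to drift, here is a naive dead-reckoning sketch that integrates acceleration into position. The dead_reckon helper and the sample data are made up for illustration; real trackers such as SlimeVR fuse gyroscope and other sensor data with corrections rather than relying on raw double integration.

```python
import numpy as np

def dead_reckon(accel_samples, dt, v0=np.zeros(3), p0=np.zeros(3)):
    """Naive dead reckoning: integrate acceleration (m/s^2, assumed already in
    the world frame with gravity removed) once for velocity and again for
    position. Errors accumulate quickly, which is why real IMU trackers add
    sensor fusion and periodic corrections."""
    v, p = v0.astype(float).copy(), p0.astype(float).copy()
    path = [p.copy()]
    for a in accel_samples:
        v += np.asarray(a, dtype=float) * dt   # velocity update
        p += v * dt                             # position update
        path.append(p.copy())
    return np.array(path)

# One second of samples at 100 Hz: accelerate forward, then decelerate.
dt = 0.01
samples = [[1.0, 0.0, 0.0]] * 50 + [[-1.0, 0.0, 0.0]] * 50
print(dead_reckon(samples, dt)[-1])  # ends roughly 0.25 m forward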
In the future, full-body tracking will become more affordable, and related technology such as haptic feedback is already in motion. With Facebook's pledge to Meta, the VR industry may not be revolutionized overnight, but it may well be spearheaded forward, encouraging more industries to shift their research toward VR.