
History of Virtual Reality, Part 3: Welcome to the Metaverse

We’ve talked about many of the strides that have taken Virtual Reality to where it is today. The wheel of technology has spun from the 19th century to the 21st, and it will continue to spin for years to come.

In the realm of VR, this is great news because it brings us closer and closer to the metaverse.

I think there are some core technological advancements that must be made before we truly arrive at virtual reality. Obviously, the first requirement is rendering real-time 3D graphics, but that fast-maturing technique is not unique to VR. Beyond it, there are three key technologies that will elevate VR to where I see it going: optics, tracking, and haptics. Keep in mind, however, that amazing technical advancements are happening every day.

Some of them happen behind closed doors and some out in the open. I couldn’t possibly know about everything that’s happening, but I’ve got some ideas.

For more history of virtual reality, check out Part 1: Early Technology and Part 2: Digital Entertainment in this series.

Welcome to the Metaverse: Optics

Virtual Reality technology must first focus its advancements in the area of optics. Newer headsets like the Vive Pro show improved visual fidelity in consumer VR. Unfortunately, this headset still suffers from the limitations of its basic design: the user straps a small screen to their face and looks at it through lenses that keep the screen in focus.

There are a few problems with this. The screen sits at a single, fixed focal distance, yet our eyes try to focus on whatever we are looking at, whether it appears near or far in the virtual scene. To solve this, we need to track the user’s gaze and decide which part of the 3D-rendered scene to bring into focus. Solutions for this already exist, although eye tracking is currently used more as an input method than as a way to enhance the visual fidelity of VR.

Eye tracking has seen moderate adoption as an accessibility tool. But if we can shrink it down to fit inside a headset, it could make our virtual experiences that much more potent.
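
To make the idea concrete, here’s a minimal Python sketch of how a gaze ray could pick a focal distance: intersect the tracked gaze direction with the virtual scene and use the nearest hit as the distance to bring into focus. The toy sphere scene, the helper functions, and the numbers are all made up for illustration; a real engine would query its own scene graph instead.

```python
import math
from dataclasses import dataclass

@dataclass
class Sphere:
    center: tuple  # (x, y, z) in meters, in headset space
    radius: float

def ray_sphere_distance(origin, direction, sphere):
    """Distance along a unit-length ray to the sphere, or None if it misses."""
    ox, oy, oz = (origin[i] - sphere.center[i] for i in range(3))
    b = 2 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - sphere.radius ** 2
    disc = b * b - 4 * c  # quadratic with a == 1 because direction is unit length
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def focal_distance_from_gaze(gaze_origin, gaze_dir, scene, default=10.0):
    """Use the nearest object hit by the gaze ray as the focal distance."""
    hits = [d for d in (ray_sphere_distance(gaze_origin, gaze_dir, s) for s in scene)
            if d is not None]
    return min(hits) if hits else default

# Toy scene: one object half a meter away, another off to the side, 3 meters out.
scene = [Sphere((0.0, 0.0, -0.5), 0.1), Sphere((1.0, 0.0, -3.0), 0.5)]
# The eye tracker says the user is looking straight ahead.
print(focal_distance_from_gaze((0.0, 0.0, 0.0), (0.0, 0.0, -1.0), scene))  # ~0.4 m
```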

Prototyping: Varifocal Display

In the prototyping space, Facebook created what they call the Varifocal display. This display physically moves the screen further away from, or closer to, the user’s face based on what is determined to be the correct focal distance. There are no current plans to add the tech to any commercial product. But, according to Douglas Lanman, Director of Computational Imaging at Oculus Research, the display is “nearly vibration-free” and “nearly silent.”
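
To see why physically moving the screen changes the perceived focal distance, here’s a back-of-the-envelope sketch using the thin-lens equation. The 50 mm lens focal length is an assumption for illustration, not a spec of any real headset; the point is that only a few millimeters of screen travel span focus from arm’s length out to optical infinity.

```python
def screen_distance_for_focus(target_m, lens_focal_length_m=0.05):
    """Thin-lens approximation: how far behind the lens to place the display so
    its virtual image appears `target_m` in front of the viewer.
    1/f = 1/d_screen + 1/d_image, with a virtual image so d_image = -target_m."""
    return (lens_focal_length_m * target_m) / (target_m + lens_focal_length_m)

for target in (0.25, 0.5, 2.0, 100.0):  # desired focal distances in meters
    mm = screen_distance_for_focus(target) * 1000
    print(f"focus at {target:>6} m -> screen at {mm:.1f} mm behind the lens")
# Roughly 41.7 mm to 50.0 mm: about 8 mm of travel covers the whole range.
```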

Current VR head-mounted displays more or less take the same approach to optics: you look through lenses at a small screen inside the headset. This, of course, limits the user’s field of view to however widely those lenses can refract light into the user’s eyes. A more novel approach to optical design is Leap Motion’s Project North Star, where the display’s image is reflected into your field of view, augmenting your vision.

Source: LeapMotion Unveiling Project North Star

Combined with the depth tracking Leap Motion refined in its hand-tracking peripheral, Project North Star can selectively render or occlude geometry in the virtual scene depending on whether a physical object is nearer to the viewer than where the virtual object should appear.
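
Here’s a simplified sketch of that occlusion test: compare, per pixel, the depth the sensor reports for the physical world with the depth of the virtual surface, and only draw the virtual pixel where it is closer. The array shapes and the tiny example are illustrative; real depth maps are far denser and noisier.

```python
import numpy as np

def occlude_virtual_layer(virtual_rgb, virtual_depth, real_depth):
    """Keep a virtual pixel only where the virtual surface is closer to the viewer
    than the physical surface reported by the depth sensor; render black elsewhere,
    which an additive see-through display treats as transparent."""
    visible = (virtual_depth < real_depth)[..., np.newaxis]  # HxWx1 boolean mask
    return np.where(visible, virtual_rgb, 0).astype(virtual_rgb.dtype)

# Tiny 2x2 example: a hologram 1.0 m away, the user's hand at 0.6 m over one pixel.
virtual_rgb   = np.full((2, 2, 3), 255, dtype=np.uint8)  # solid white hologram
virtual_depth = np.full((2, 2), 1.0)                     # meters
real_depth    = np.array([[0.6, 5.0],                    # hand in one corner
                          [5.0, 5.0]])
print(occlude_virtual_layer(virtual_rgb, virtual_depth, real_depth)[..., 0])
# [[  0 255]
#  [255 255]]  -> the hologram is hidden where the real hand is in front of it
```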

By reflecting the light from the displays off another surface, the image can be projected around the viewer’s eyes, giving them a more realistic field of view. This potentially solves a number of problems with the current generation of VR optics. For one, it mitigates the “screen door” effect, where the viewer can discern the grid of pixels on the display generating the image, because the user no longer looks directly into the diodes of the display.

I find this a better solution than making screens with a higher pixel density. It doesn’t require waiting for further technical innovation in another industry, and it doesn’t require more processing power to fill those extra pixels. It also creates a model closer to the way light actually works (bouncing off objects into our eyes). This is definitely a clever approach to optics design, but it suffers from the pitfall of a bulkier headset.

I couldn’t tell you the precise solution to optics in VR, but I can tell you there are much smarter and more resourceful people on the case than I.

Tracking: Enter XR

Tracking is the next advancement required to usher in the age of XR (X, like the variable x, can stand for Virtual, Augmented, Mixed, or anything else). In the early days of modern VR, tracking was very limited: only 3 degrees of freedom, covering rotation around the x, y, and z axes centered in the user’s head.

Since then, we’ve entered the world of 6 degrees of freedom (6dof, as opposed to the earlier 3dof). 6dof adds positional tracking of the user’s head. The tracking technology was borrowed from the film and digital entertainment industries, where it powered motion capture suits. External sensors watch the user in VR and work out how far they have wandered from a central starting point.
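
A rough sketch of the difference, with hypothetical field names rather than any real runtime’s API: a 3dof pose only carries the three rotations, while a 6dof pose also records where the head is relative to that central starting point.

```python
from dataclasses import dataclass

@dataclass
class Pose3Dof:
    """Early modern VR: rotation only, typically from an IMU."""
    yaw: float    # rotation about the vertical axis, in radians
    pitch: float  # rotation about the side-to-side axis
    roll: float   # rotation about the front-to-back axis

@dataclass
class Pose6Dof(Pose3Dof):
    """Room-scale VR: the same rotations plus where the head actually is."""
    x: float = 0.0  # meters from the tracking origin (the central starting point)
    y: float = 0.0
    z: float = 0.0

# A 3dof headset only knows the user turned their head to the left...
looking_left = Pose3Dof(yaw=1.57, pitch=0.0, roll=0.0)
# ...a 6dof headset also knows they ducked and stepped half a meter forward.
ducking_forward = Pose6Dof(yaw=1.57, pitch=0.0, roll=0.0, x=0.0, y=-0.3, z=-0.5)
print(looking_left)
print(ducking_forward)
```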

As I mentioned in Part 2, this is called outside-in tracking.

With the advent of powerful mobile devices in (mostly) everyone’s pocket, XR has the opportunity to change the way we interact with the physical world. Google released Google Lens, an AR app with a feature that combines image recognition, Google Translate, and AR overlays to translate signage from a foreign language into one that’s natural to the user.

Of course, this has its own hangups. You’ve got to own an ARCore- or ARKit-capable device, and you’ve got to have access to a data plan somewhere people don’t speak the same language as you. But as time goes on, these requirements won’t be as expensive as they are today.
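
The basic loop is easy to sketch, even though the real feature leans on on-device OCR, a translation service, and ARCore/ARKit anchors. The stub functions below are hypothetical stand-ins for illustration, not the actual Google Lens APIs; they just show the recognize–translate–overlay flow.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class DetectedSign:
    text: str                       # text recognized in the camera frame
    corners: List[Tuple[int, int]]  # 2D corners of the sign in the image

# Hypothetical stand-ins for the heavy lifting: a real app would call on-device
# OCR, a translation service, and its AR framework's anchoring/rendering APIs.
def recognize_text(frame) -> List[DetectedSign]:
    return [DetectedSign("Ausgang", [(10, 10), (90, 10), (90, 40), (10, 40)])]

def translate(text: str, target_lang: str) -> str:
    return {"Ausgang": "Exit"}.get(text, text)

def overlay(frame, corners, text: str) -> None:
    print(f"drawing '{text}' over image region {corners}")

def translate_signage(frame, target_lang: str = "en") -> None:
    """The basic loop: find text, translate it, and draw the translation back
    over the original sign so it appears anchored to the real world."""
    for sign in recognize_text(frame):
        overlay(frame, sign.corners, translate(sign.text, target_lang))

translate_signage(frame=None)  # stub frame; prints: drawing 'Exit' over ...
```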

Haptics: True Immersion

Arguably one of the largest hurdles to truly immersive VR experiences is also the furthest behind.

While there is a great deal of research being done, consumer haptics devices are a long way off. Things in VR can look very nice, and the audio/visual feedback pulls you into the experience, but nothing yanks you out of the feeling of presence quite like your hand or body phasing through an object that your brain has decided exists.

Most consumer VR products that exist today focus on the force-feedback side of haptics, that is, the vibrating motors in the controllers. This form of feedback has been around for a number of years; fifth-generation gaming consoles like the Nintendo 64 and Sony PlayStation brought it into the mainstream. Over the years, the motors used in gaming controllers have shrunk and become much more dynamic than their original forms.

However, even more sophisticated forms of haptics are in the works. There are two approaches I find most interesting.

Disney Research published a paper in 2013 on their AIREAL project, which relies on air vortex generation. An actuated flexible nozzle lets the device deliver effective tactile feedback across a 75-degree field of view, with a resolution of about 8.5 cm at a distance of 1 meter. In a demonstration of the technology, they projected a butterfly onto the arm of a user while one of the AIREAL nozzles directed air at the same location. The collocation of the tactile sensation with the position of the projection gave the user the feeling that the butterfly was physically on their arm. The full setup involved a ceiling-mounted projector and a depth sensor pointing down at the user from above, allowing the air nozzle to enhance the light projections with tactile force.
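
As a toy illustration of the geometry involved (not the actual AIREAL control code), here’s how you might aim a nozzle at a point reported by the depth sensor and check whether it falls inside the stated 75-degree field of view. The axis convention and the reading of that figure as a 37.5-degree half-angle cone are my assumptions.

```python
import math

def aim_nozzle(nozzle_pos, target_pos, max_half_angle_deg=37.5):
    """Pan/tilt angles (in degrees) to point an air nozzle at a 3D target, plus
    whether the target falls inside an assumed 75-degree cone (half-angle 37.5).
    Axes: x to the right, y up, z straight out of the nozzle at rest."""
    dx, dy, dz = (t - n for n, t in zip(nozzle_pos, target_pos))
    pan = math.degrees(math.atan2(dx, dz))                   # left/right
    tilt = math.degrees(math.atan2(dy, math.hypot(dx, dz)))  # up/down
    off_axis = math.degrees(math.acos(dz / math.sqrt(dx*dx + dy*dy + dz*dz)))
    return pan, tilt, off_axis <= max_half_angle_deg

# Nozzle at the origin; the projected butterfly sits 0.3 m to the side, 1 m out.
pan, tilt, reachable = aim_nozzle((0.0, 0.0, 0.0), (0.3, 0.0, 1.0))
print(f"pan {pan:.1f} deg, tilt {tilt:.1f} deg, reachable: {reachable}")
# pan 16.7 deg, tilt 0.0 deg, reachable: True
```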

Can you keep up?

I’m sure as soon as I publish this, a hundred more VR companies will sprout from the ground and begin to shill their new, exciting inventions. We’re living in exciting times with such rapidly evolving technology.

This new digital frontier will hopefully continue to captivate audiences for decades to come, and we have front row seats to see how it changes the world. The mundanities of everyday life will soon be augmented into holographic oblivion, and people from all corners of the planet will be able to convene in the digital world as naturally as if they were meeting in person.

I, for one, am looking forward to the dawn of the metaverse, and I hope to see all of you on the other side.
