We’re Closer Than We Think To New VR Models

I think that pretty soon, we’ll be talking about virtual reality 2.0 and virtual reality 1.0.

VR 1.0 will be everything that we’ve seen to date – systems that mostly simulate sight and hearing, but don’t address the finer details of the rest of human physiology.

VR 2.0 will be systems that account for what scientists are calling the ‘sensory gap’ – the mismatch that produces the vertigo or dizziness accompanying systems that are maybe 90% there but, because of engineering limitations, not fully crafted to offer a true virtual experience.

We Experience a Full Range of Sensations

(Slide: Kim SeungJun)

(Photo: Close-up of the Oculus Go virtual reality headset on a light wooden surface, from Facebook Inc and the former Oculus Rift company, Dublin, California, August 23, 2018. Smith Collection/Gado/Getty Images)

First, we have to understand that the terminology around the industry is changing. People are using the term ‘extended reality’ to talk about all sorts of new platforms and functions.

As you can see on Nvidia‘s blog, extended reality is an umbrella term covering virtual reality, augmented reality, and related technologies. It’s a kind of catch-all for systems that may or may not obstruct our whole field of vision, may or may not have spatial boundaries, or may have different sets of criteria for replacing the physical world with a virtual one.

What We See, We Don’t Feel

(Slide: Kim SeungJun)

MIT researchers, for example, are talking about projects that “generate synthetic actions and sensations” – and that makes me think about the potential for a whole new way of thinking about reality.

What does that mean?

Part of what it means is that today’s engineers are trying to manage the human response to a virtual environment in a comprehensive way. They’re looking at things like vestibular response, sensory activity, and muscle responses, not just sight and sound. Presumably, they should also be looking at smell, but that’s a different challenge.

Another part of this is addressing the “sickness” that many people experience when there are discrepancies between their physical and virtual environments. A recent presentation by Kim SeungJun at CSAIL+IIA showed how this can work.
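To make that mismatch concrete, here is a minimal, hypothetical sketch of one comfort technique developers already use for this kind of visual-vestibular discrepancy: dynamic field-of-view vignetting, where the view narrows when the camera moves in a way the body doesn’t feel. The thresholds and names below are illustrative assumptions, not taken from Kim SeungJun’s presentation or any specific headset or engine.

```python
# Illustrative sketch: dynamic field-of-view "vignetting" as a
# cybersickness mitigation. When software-driven camera motion is not
# matched by real head motion, the app dims/narrows the periphery.
# All thresholds here are hypothetical.

def vignette_strength(virtual_speed_dps: float,
                      head_speed_dps: float,
                      onset_dps: float = 30.0,
                      full_dps: float = 120.0) -> float:
    """Return 0.0 (no vignette) to 1.0 (strong vignette).

    virtual_speed_dps: angular speed of camera rotation driven by
        software (degrees/second) -- the motion the eyes see.
    head_speed_dps: angular speed reported by head tracking -- the
        motion the vestibular system actually feels.
    """
    # Only the *unfelt* portion of the motion contributes to discomfort.
    mismatch = max(0.0, virtual_speed_dps - head_speed_dps)
    if mismatch <= onset_dps:
        return 0.0
    # Ramp linearly up to full vignetting at `full_dps` of mismatch.
    return min(1.0, (mismatch - onset_dps) / (full_dps - onset_dps))


if __name__ == "__main__":
    # Smooth joystick turn at 90 deg/s while the head is nearly still:
    print(vignette_strength(90.0, 5.0))   # ~0.61 -> noticeable vignette
    # Head and camera turning together (a natural head turn):
    print(vignette_strength(90.0, 88.0))  # 0.0 -> no vignette needed
```

The point of the sketch is simply that only the unfelt portion of the motion drives the effect, so natural head turns never dim the view.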

How Can We Handle Sickness Between What We See and Feel?

(Slide: Kim SeungJun)

In any case, people are paying a lot of attention to this. MIT has a program centering on extended reality where you can see some of the goals laid out:

“Companies increasingly recognize that extended reality (XR) has the capability to reinvent the way we communicate, experience gaming and other forms of entertainment, and transform industries such as healthcare, real estate, retail, and e-commerce. According to Forbes, XR technologies, including virtual reality (VR) and augmented reality (AR), will be “one of the most transformative tech trends of the next five years.” Organizations of all types will be looking for technology professionals with the knowledge base, vision, and skills to implement XR applications that provide a competitive edge. MIT xPRO’s Virtual Reality and Augmented Reality program is designed to give you a foundational understanding and conversational fluency in XR technologies, along with the ability to consider users’ needs when refining applications or developing new ones.”


To me, this represents a genuinely groundbreaking approach to the idea that we’re going to be living in virtual worlds. If we’re going to replace the physical world with virtual ones, those other worlds have to be authentic – they have to seem real. They can’t just be virtual reality 1.0 – in many ways, we’ve seen the limits of that technology, and now we’re going to the next level.

