The Next 5 Years: Predictions for VR/MR

The world of Virtual (VR) and Mixed Reality (MR) is one marked by constant change — fast-forward one year and you’ll find yourself surrounded by new devices, startups, platforms and innovative (and often conflicting) visions of the future.

While immersive mediums will undoubtedly play a huge role in modern society, trying to understand their future today amidst this instability can be as challenging as it is disorienting. Thankfully, my last two years in the field have given me some strong opinions about the near term future of these platforms. Here’s my take on the next 5 years for VR/MR:


[1] Hands Will Be Standard VR/MR Controllers

The best VR/MR controllers in the world are free, already exist, and are an active part of your body. Every single human innately knows how to use their hands, and their structural complexity allows for a virtually unlimited number of gestures and interactions to be created and detected by software.

Human arms are also a great place to put user interfaces — not only are they easy to reach, they can also provide users with real-time haptic feedback… courtesy of your own nervous system — partially solving one of the big issues in the world of VR. Push a button on your arm, and it will push back with equal force.

Live demo of the first version of LeapMotion in VR

Thankfully, hand tracking technology has come a long way in recent years: Leap Motion is releasing v2 in 2017, offering accurate 180° FoV hand tracking for mobile VR (an industry first). Additionally, Mixed Reality devices like Microsoft’s HoloLens and Magic Leap’s project are both being built from the ground up with hand tracking in mind, pointing to a promising future for hands as the primary means of interaction in immersive mediums.

But this doesn’t mean that controllers will be going away for good — hardware will still have a big role to play in immersive media, but I believe it’ll be a different one (as discussed below). Hands are, for all intents and purposes, standardized controllers — almost all human hands work the same way (good interaction design will be critical to address individual differences). They’re also extremely flexible, intuitive and free, and tracking can be gradually improved through software updates and interaction design alone — leading me to believe it’s only a matter of a few years before they’re adopted as the industry standard.


[2] Eye Tracking Will Be Central to the Medium

For the past two years, hundreds of different VR headsets have flooded the market mirroring each other’s features — last year, however, one managed to stand out from the crowd: created by a Tokyo-based startup, FOVE was the first VR headset built with eye tracking in mind. Just a few months after FOVE’s devkit launch, Oculus acquired an eye-tracking startup of its own; eye-tracking devkits are already being made for the HTC Vive, and Google has also acquired several startups of this kind to use in its VR efforts.

There’s no doubt in my mind that this particular enhancement will play a big role in the second wave of major VR devices — for one thing, it solves a number of big technical challenges, allowing engineers to optimize performance by focusing system resources where the user is currently looking (a technique known as foveated rendering, which helps lower VR’s high barrier to entry). It can also simulate natural focus realistically and give way to new eye-tracking-centric gameplay elements that will open the door to much more immersive experiences.
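To make the performance idea concrete, here’s a minimal sketch of how a renderer might pick a quality tier per screen region based on its angular distance from the gaze point. The tier names and degree thresholds are purely illustrative, not taken from any real SDK:

```python
import math

def foveated_quality(gaze_deg: float) -> str:
    """Pick a render quality tier from a region's angular distance
    (in degrees) to the user's current gaze point.
    Tier names and thresholds are illustrative only."""
    if gaze_deg <= 5.0:      # foveal region: render at full resolution
        return "full"
    elif gaze_deg <= 20.0:   # near periphery: reduced resolution
        return "half"
    else:                    # far periphery: lowest resolution
        return "quarter"

def angular_distance(gaze_dir, pixel_dir) -> float:
    """Angle in degrees between two unit view vectors."""
    dot = sum(g * p for g, p in zip(gaze_dir, pixel_dir))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))

# A region straight ahead while the user looks straight ahead
# gets full quality; everything else is rendered more cheaply.
print(foveated_quality(angular_distance((0, 0, 1), (0, 0, 1))))  # full
```

The savings come from the fact that the fovea covers only a few degrees of the visual field, so the vast majority of pixels can be rendered at reduced quality without the user noticing.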

Virtual characters will finally be able to know when (and where) you’re looking at them. User interfaces designed around this will be incredibly smooth, completely replacing the need for cursors and crosshairs when selecting far-away objects. Data-wise, VR/MR creators will have access to an unprecedented level of detail in usage analytics, giving them a deeper understanding of how users behave in their experiences.

These are just some of the uses this technology will bring — eye tracking is a small feature that does wonders for the medium in terms of usability and immersion, and it will definitely be one of the key aspects in VR/MR that will help set immersive media apart from the rest.


[3] Physical Objects Will Be A Key Part of VR/MR Experiences and Brands

VR/MR is as much about creating digital worlds as it is about blending physical and digital worlds together. Be it from an immersion, interaction or branding standpoint, there are countless opportunities in bringing in custom physical objects you can manipulate and feel to immersive experiences.

The only missing link so far has been the technology: for VR, accurate, cheap and generalized object-tracking devices; for MR, accurate 3D tracking of objects through conventional cameras.

Thankfully, strides have been made on both fronts, and progress is picking up speed. The recently announced Vive Tracker streamlines the creation of VR peripherals with a (reasonably) small tracker you can attach to almost anything, setting the standard for what’s to come. On the MR side of things, computer vision companies such as Vuforia have been working on reliable 3D object tracking tools for years, some of which are already available today (with native support for AR devices and Microsoft’s HoloLens).

All these trends point to a near future where immersive experiences are not just about the creative development of software, but also the innovative use of custom hardware (putting the VR/MR industry on a direct collision course with the toy industry as immersive media goes mainstream). We should expect a lot of hardware innovation in the VR/MR space within the next few years — especially considering how accessible 3D printing has become, which in turn greatly lowers the production cost of custom peripherals.


[4] A.I. Will Bring Conversational UI into VR/MR

One of the most pressing questions about immersive mediums is in what ways they are going to converge with other rising technologies. I believe Artificial Intelligence (more specifically, conversational A.I.) will play a key role in VR/MR experiences, fundamentally changing how we interact within their boundaries.

Part of a project of mine — Convo, a customizable A.I. assistant designed for VR/MR, bringing conversational UIs to immersive worlds

Conversation-based user interfaces have been spreading through Apple’s Siri, Amazon’s Alexa, countless chatbots and many other alternatives. Design-wise, conversational A.I.s are considered by many to be the future of UI, and it’s not hard to see why.

The 2D menus and buttons we interact with on a daily basis are merely abstractions of an implicit conversation — clicking on an app icon essentially means “open this app” and swiping right on Tinder translates into “I like this person” (or simply “Yes!”). As simple as current gestures may feel individually, talking will always feel more intuitive — it is, after all, the user interface humanity functions on.
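The “implicit conversation” framing can be made literal with a toy lookup from UI gestures to the sentences they stand for. All gesture, target and utterance names below are invented for illustration:

```python
# Toy mapping from (gesture, target) pairs to the implicit "sentence"
# each gesture encodes. Every name here is hypothetical.
GESTURE_INTENTS = {
    ("tap", "app_icon"): "open this app",
    ("swipe_right", "profile_card"): "I like this person",
    ("swipe_left", "profile_card"): "not interested",
    ("long_press", "message"): "show me more options for this",
}

def gesture_to_utterance(gesture: str, target: str) -> str:
    """Translate a (gesture, target) pair into the conversational
    request it abstracts, with a generic fallback phrasing."""
    return GESTURE_INTENTS.get((gesture, target), f"do something with {target}")

print(gesture_to_utterance("swipe_right", "profile_card"))  # I like this person
```

A conversational UI simply removes the middle layer: instead of memorizing which gesture maps to which request, the user states the request directly.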

Combine these ideas with the recent advancements in text-to-speech + voice-to-text + deep learning and A.I. just might have enough momentum to become a powerhouse in immersive environments, turning customized A.I. assistants into an industry standard in any piece of software.

In fact, it might come sooner than you think — Microsoft has recently announced it’ll be opening up Cortana to the developer community in 2017, allowing them to integrate the virtual assistant into anything — be it custom hardware or Virtual and Mixed Reality experiences.


[5] Untethered MR Devices Will Slim Down & Finally Reach The Consumer Market

Mixed Reality is a very promising immersive medium, but it’s still not ready for prime time. The technology is expensive, and certain aspects of the UX still lack refinement. It also still looks fairly bulky, which is bad news for a device that aspires to be tomorrow’s everyday computer (and so much more).

But that’s all bound to change within the next 5 years. The HoloLens v3 is scheduled for a 2019 release, and it will pack a number of hardware and software improvements (and probably a new name to go with them). I believe Magic Leap will follow a similar path, finally announcing its consumer device within the next two years (especially with all the pressure it’s been getting). Samsung has publicly stated it’s working on a flagship MR device, and Apple, Google & Snapchat have all been laying the groundwork for a push in that direction through patents and acquisitions.

Although it’s fair to be skeptical of any timelines when it comes to untethered MR, I think 3–5 years is a fair assumption for the first big consumer devices hitting the stores. It’ll be a shaky start, but I’d say things look pretty optimistic in the MR landscape. In the meantime, you can get used to the technology with Microsoft’s new VR/MR hybrid headsets, coming out later this year for $299/$699.


[6] VR/MR will become a singular immersive medium

Virtual Reality, Augmented Virtuality, Mixed Reality… if you’re here, chances are you’ve heard these terms (and many more) used to describe differences in immersive media. This is mostly a symptom of a new medium trying to figure itself out, and of the underlying struggle to understand the characteristics that truly set it apart.

But upon close inspection of today’s landscape and what’s coming next, one could say that the technology and design approaches in VR/MR are bound to become increasingly similar, to the point where they merge into a singular medium — and you can see symptoms of this shift today.

New VR devices like Intel’s Project Alloy employ some MR features that allow you to see your immediate surroundings with smart use of sensors and cameras. Microsoft’s new VR/MR hybrids allow you to seamlessly switch between both modes with the press of a button, all in the same device. And while the HoloLens has a limited Field of View right now, the HoloTour VR-like demo is living proof that MR can fully block out the world around you if it so desires. VR devices that can do MR and MR devices that can do VR — the collision course has never been more imminent.

VR and MR already share the same development tools, and soon the design and interaction language of both will be pretty much identical (especially as hand tracking becomes the norm). Future immersive experiences won’t live in separate mediums or under distinct labels, but rather on a spectrum — experiences with a few virtual elements superimposed on your real world fall to the left, and experiences that replace most (or all) of your environment fall to the right.
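One way to picture that spectrum is as a single 0-to-1 score for how much of the user’s view is virtual. The thresholds and labels in this sketch are purely illustrative, not an industry standard:

```python
# Toy model of the immersion spectrum: place an experience on a
# 0.0-1.0 scale by the fraction of the user's view that is virtual.
# Cutoffs and labels are invented for illustration.
def classify(virtual_fraction: float) -> str:
    """Label an experience by how much of the view it replaces."""
    if virtual_fraction < 0.25:
        return "lightly augmented (MR-leaning)"
    elif virtual_fraction < 0.75:
        return "blended"
    else:
        return "fully immersive (VR-leaning)"

print(classify(0.10))  # lightly augmented (MR-leaning)
print(classify(0.95))  # fully immersive (VR-leaning)
```

On a hybrid device, sliding along this scale could be as simple as dialing the pass-through cameras up or down, rather than switching products.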

This concept deserves an article of its own, but overall, this is a small taste of what’s coming. If there’s one thing we can be certain about, it’s that the VR/MR world will look fundamentally different in 5 years — and that’s exactly what makes this space so much fun.


Lucas Rizzotto is an Immersive Experience Designer, Developer and Founder of a Virtual & Mixed Reality Studio based in New York City. You can follow him on Instagram, Twitter, Facebook or contact him through his website.

Also, sign up to my mailing list so you can see more of my stuff (:
