Navigation in virtual reality has challenged developers across every industry since they first felt the urge to explore beyond simple head rotation. The solutions so far have been workable but imperfect: controllers and keyboards can cause nausea from the lack of perceived inertia, and gaze-based movement can feel unnatural. Fortunately, the Vive VR headset being developed by HTC and Valve lets developers take a literal step forward when it comes to moving around their scenes.
Using Valve’s Lighthouse technology, the user’s physical location is continuously tracked by small base stations positioned around the room. This allows users to actually walk around their designs in VR for the first time, which means that not only can they see how their new kitchen will look, but they’ll know how it will feel to walk from the kitchen into the living room.
The user’s position isn’t the only thing that’s tracked, either; Valve’s handheld controllers for the Vive are tracked by the same Lighthouse system, so users can see where their hands are in virtual reality and use them to move furniture, point out key areas, and write notes on any surface of the design. The precision of the tracking makes the system feel completely uninhibited and intuitive: move the controllers closer together in real space, and the moment their virtual models touch you’ll feel them click together in your hands.
As we get closer to “real” interaction instead of symbolic interaction in virtual reality, experiences get more immersive and less nauseating. The only shortcoming of the Vive’s positional tracking paradigm is that it can’t take you up or down (unless you happen to have a jetpack, of course). Fortunately, preliminary informal tests have shown us that smooth, linear, vertical motion in VR is very unlikely to cause nausea, so elevator-like systems are a great way to let users experience multiple floors without having to load them separately.
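The elevator idea above boils down to moving the camera rig upward at a constant speed until it reaches the next floor, with no acceleration curve that could trigger nausea. Here is a minimal, engine-agnostic sketch in Python; every name in it (`elevator_step`, `ride_elevator`, the 90 Hz frame rate, the 1.5 units/sec speed) is a hypothetical illustration, not part of any real VR SDK.

```python
# Hypothetical sketch of smooth, constant-speed vertical "elevator" motion.
# The per-frame rule: move toward the target height at a fixed speed,
# clamping the final step so the rig never overshoots the floor.

def elevator_step(current_y: float, target_y: float, speed: float, dt: float) -> float:
    """Advance the rig's height one frame toward target_y at `speed` units/sec."""
    max_delta = speed * dt
    delta = target_y - current_y
    if abs(delta) <= max_delta:
        return target_y  # arrived: snap exactly to the floor height
    return current_y + (max_delta if delta > 0 else -max_delta)

def ride_elevator(start_y: float, target_y: float,
                  speed: float = 1.5, dt: float = 1 / 90) -> list:
    """Simulate a full ride at a 90 Hz frame rate; returns per-frame heights."""
    heights = [start_y]
    y = start_y
    while y != target_y:
        y = elevator_step(y, target_y, speed, dt)
        heights.append(y)
    return heights
```

In a real engine you would call the equivalent of `elevator_step` once per frame on the camera rig's transform; the key design choice is the linear (unaccelerated) profile, which matches the kind of motion the informal tests above found comfortable.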