Prospect 1.5: The Release You’ve Requested

While we’re in the business of building software, our ultimate goal is to help you communicate powerful ideas spatially through virtual reality. Every new feature is meant to make your designs easy to understand for all parties involved, putting the unbuilt at the user’s fingertips so that, virtually, it’s almost tangible.

Our team has been hard at work making improvements both visible and invisible to users; the feedback we get from our user base and industry experts is a vital part of our development process. With that in mind, here are new features we've developed for the release of Prospect 1.5:

Connecting Prospect and Scope 

Our two software offerings - Prospect and Scope - have always allowed spatial designers to communicate through either virtual reality walkthroughs or mobile panorama viewings. While these tools are great for communicating ideas to others, the two offerings had never communicated with each other - until now.

Our new 360° Capture tool for Prospect connects our desktop VR experience and our mobile VR experience for the first time. This new tool allows you to quickly and easily capture stereoscopic panoramas from within Prospect, and these images can be uploaded and viewed within Scope. Not only does this tool help enhance client communications on a particular project, but it helps you build a VR portfolio that can win new business.

We’ve added the 360° Capture tool within the same menu interface as the standard image capture tool. Users can simply toggle between the two capture methods using the menu on the controllers. These captures will automatically be saved to the desktop of your computer. Scope users can upload these captures into the Scope library to sync to their mobile devices. Hovering over image thumbnails in Prospect will show which captures are panoramas with a 360° icon indicator.

Layer Updates in Scale Model Mode


We’ve added the ability to toggle layers on and off while in Scale Model Mode. Users can also reset changes made to layer selections with a single click.

Edit Your Brush Size in the Markup Tool


We’ve added the ability to adjust the size of the brush tool.

In-VR Viewpoints Menu Added


We’ve added an interface within Prospect that lets you browse and quickly jump between all viewpoints present in the model - no keyboard required! If you prefer to use keyboard controls, they can still be used as before. 

Display Settings in Scale Model Mode


Another Prospect feature previously accessible only through the keyboard has found its way to the in-VR menu. We’ve added the ability to turn materials and outlines on and off directly from Prospect’s in-VR menu. These Display Settings can now be toggled without touching the keyboard (though keyboard controls still work as before).

 

Have a suggestion or feature request you'd like to send our way? Fill in this survey to help inform our future releases. 

Employee Spotlight: Jack Donovan

Jack Donovan has been with IrisVR since the beginning - and a lot’s happened since his last Employee Spotlight in August 2014. Jack shares his thoughts on what’s changed at IrisVR over the past three years, what it’s been like to watch the company scale, and what he’s excited for in the coming months.


 

What is your position here at IrisVR?

I’m a software engineer on the game engine side, and I’m working primarily on our desktop software, Prospect. Prospect grew out of the early tech we started building when I joined in June 2014, but since then we’ve expanded our app offerings to include Scope, our mobile app.

What’s changed over the past three years and what has it been like watching the company scale?

So much has changed, it’s hard to pick where to begin. The office has evolved from a cozy corner in a co-working space in Vermont to a sizable office in New York City with plenty of workspace and conference rooms (which are essential, now that we’re a team of 20+ people).

The technology we’re working with has changed so much over the past three years, too. When I started at IrisVR, the only headset available to developers was the very first development kit of the Oculus Rift (DK1), and its relatively low resolution and refresh rate made it feel more like a preview of virtual reality than VR itself.

Since then, we’ve seen the advent of room-scale positional tracking in VR, controllers that enable us to see and use our hands to interact with virtual worlds, and HMDs with such high resolutions that pixels are barely perceptible; early VR displays had such a low resolution that the pixels were clearly visible, creating a blocky visual artifact dubbed the “screen door effect.” What’s most exciting is that even with so many improvements in a short three years, we’re still just in the beginning. I’m sure we’ll look back at what we’re working with now and see it as comically low-tech in a not-too-distant future.

What is your favorite part of your job?

My favorite part of this job is challenging preconceptions about what can and can’t be done in virtual reality, especially my own. Our software contains at least two or three features that I once thought too difficult or even impossible; VR development has historically been a strenuous, manual process because there are so many specifications to make and cases to consider, and automating at that scale can seem daunting at times.

Questioning the tenets of 3D development in light of the VR revolution keeps me excited and never too comfortable. It’s awesome to see others in the VR space redefining old understandings, too; Valve’s announcement last year of a new renderer that can process 18 lights in a single forward rendering pass—as opposed to the existing one-pass-per-light standard in forward rendering—struck me as a lesson that stale assumptions are the antithesis of progress.
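The renderer change Jack mentions can be illustrated with a toy sketch. This is a hedged Python stand-in, not Valve’s actual code - the `lambert`, `multi_pass`, and `single_pass` names are invented for illustration. Both approaches accumulate the same lit color; what changes is how many times the scene geometry must be drawn.

```python
def lambert(normal, light_dir, light_color, albedo):
    # Lambertian (diffuse) contribution of one light: albedo * color * max(0, N·L)
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(a * c * n_dot_l for a, c in zip(albedo, light_color))

def add_colors(c1, c2):
    return tuple(a + b for a, b in zip(c1, c2))

def multi_pass(normal, albedo, lights):
    # Classic forward rendering: one full draw pass per light, additively blended.
    color, draw_calls = (0.0, 0.0, 0.0), 0
    for light_dir, light_color in lights:
        draw_calls += 1  # geometry is resubmitted once for every light
        color = add_colors(color, lambert(normal, light_dir, light_color, albedo))
    return color, draw_calls

def single_pass(normal, albedo, lights):
    # Single-pass forward rendering: the light loop runs inside one shader invocation.
    color = (0.0, 0.0, 0.0)
    for light_dir, light_color in lights:
        color = add_colors(color, lambert(normal, light_dir, light_color, albedo))
    return color, 1  # geometry is drawn exactly once, regardless of light count

# 18 hypothetical lights, echoing the count in Valve's announcement
lights = [((0.0, 1.0, 0.0), (0.05, 0.05, 0.05))] * 18
normal, albedo = (0.0, 1.0, 0.0), (0.5, 0.5, 0.5)

assert multi_pass(normal, albedo, lights)[0] == single_pass(normal, albedo, lights)[0]
assert multi_pass(normal, albedo, lights)[1] == 18  # 18 draw passes
assert single_pass(normal, albedo, lights)[1] == 1  # one draw pass
```

The lit result is identical either way; the win is eliminating the repeated geometry submission that makes per-light passes expensive.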

What excites you the most about VR?

When I talk to anyone about VR, they’ve always got an idea for what they’d like to experience in it. Unfortunately, it usually remains just an idea; VR development is prohibitively difficult and nuanced, and even with the perfect vision of how something should look and feel in VR, it takes hundreds of hours of learning to understand each component that goes into a full experience.

IrisVR is exciting to me because it enables creators in spatial industries - architecture, construction, design, etc. - to experience their concepts in virtual reality without having to become VR developers themselves. Automation in general is interesting to me as a programmer, but automation at a scale that opens new doors for people is extremely exciting to be a part of.

What recent VR developments have caught your eye, and why?  

I’m really interested in the recent developments toward making desktop VR wireless. Every headset out right now requires the user to be constantly tethered to their computer by the cables that connect the two, and the possibility of tripping over a cord or feeling a tug when you step too far is a substantial detriment to immersion.

This is a tough problem to solve because of the relative slowness of wireless transmission, and ironically it will only get tougher as headsets get better. Higher resolutions and faster refresh rates mean even more data to send wirelessly, and as our current methods of wireless transmission reach their bandwidth limits, we’ll need to find new ways to keep up with the immense amount of information contained in our virtual worlds.

Testing Prospect: Is it True to Scale?

In a previous blog post, we presented two experiments that sought to answer one question: is virtual reality true to scale? In this post, we’ll expand on that idea by demonstrating methods of measurement between VR and real-world objects in Prospect. Because our software is used by architects, construction firms, and other spatial designers, the accuracy of in-VR scale is critical to the success of a project.

Why Scale Matters in VR

When exploring a VR environment, users expect that their model is sized precisely to match how the space would feel in reality. VR makes it possible to determine if a hallway is too narrow or a doorway too low, but it also means that inaccurate scale can lead to misinformed design decisions. We conducted this test to verify that Prospect presents spaces at a physically accurate scale and to ensure that you can make precise spatial decisions with a well-calibrated VR headset.

Room Calibration Aligns the Virtual to the Real

Positional tracking is a big part of the magic behind high-quality VR experiences. As the headset moves through space, the view updates many times per second (90 times per second on the Vive and Rift) to reflect that movement. This makes the experience in an HTC Vive or Oculus Rift feel much more realistic than that of a mobile VR solution such as Google’s Cardboard and Daydream headsets or Samsung’s Gear VR (mobile VR devices only track rotation, not position).

The Vive and Rift use different methods for tracking position, but both rely on external powered devices to determine where the headset is in space. These tracking devices have a limited field of view, and tracking quality is usually better when multiple trackers have a view of a single tracked object.

To determine relative distance between the virtual and real worlds, especially the distance to the floor, the Rift and Vive leverage a user-defined room calibration. If this calibration is wrong, the virtual world may not align with the real world. For example, when the worlds are misaligned, a headset or controller placed on the ground in the real world will appear to float above or sink through the ground in VR. In the picture below, the VR ground plane is about an inch too high. If a measurement is taken with a ruler from the real ground to a VR shelf, it will be off by an inch. It is important to re-run room calibration if you believe a base station or sensor may have moved since the last calibration, or if you find the virtual floor is not where it should be.
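The arithmetic behind that misreading is simple enough to sketch. This is a minimal illustration with hypothetical numbers matching the one-inch example (`ruler_reading` is an invented name, not part of any SDK): a floor-calibration offset shifts every measurement taken between the real floor and a virtual object by exactly that offset.

```python
INCH_M = 0.0254  # one inch in meters

def ruler_reading(vr_height_m, floor_offset_m):
    # Height of a virtual object above the REAL floor when the VR floor
    # plane sits floor_offset_m above the real floor.
    return vr_height_m + floor_offset_m

# A VR shelf modeled 1.0 m above the virtual floor; the VR floor is one inch too high.
reading = ruler_reading(1.0, INCH_M)
error = reading - 1.0  # every vertical measurement is off by the calibration offset
assert abs(error - INCH_M) < 1e-12
```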

 

Test Process

For this test we used SketchUp and Revit to create 3D models of boxes of known sizes. Our expectation was that the size of a box in VR should match its real-world size. This can be verified by placing a tracked motion controller at either end of the box and measuring the distance between the controllers.
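The check itself reduces to a distance comparison. Here is a minimal sketch (the controller coordinates are hypothetical, not actual tracking data): compute the straight-line distance between the two tracked controller positions and compare it with the known box size.

```python
import math

def controller_distance(p1, p2):
    # Euclidean distance between two tracked controller positions, in meters.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p1, p2)))

def scale_error(measured_m, expected_m):
    # Absolute deviation and percent deviation of the VR measurement.
    abs_err = abs(measured_m - expected_m)
    return abs_err, 100.0 * abs_err / expected_m

# Hypothetical positions for controllers held at either end of a 1 m box:
left = (0.002, 0.850, 0.001)
right = (0.998, 0.853, -0.004)

measured = controller_distance(left, right)
abs_err, pct_err = scale_error(measured, 1.0)
assert abs_err < 0.007  # within the ~0.7 cm deviation reported in the results
```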

In our test, the Vive base stations pointed down toward the center of the tracking area from opposite corners. The Rift sensors for the headset and Oculus Touch were placed on a table looking out parallel to the ground and to each other. Since the Vive’s base stations “view” the floor from above, they generally track the floor better than the Rift, which may have blind spots, especially as you move close to the sensors. For this reason we were able to get good distance measurements with the Vive close to the ground, whereas with the Rift we had to place the Touch controllers higher up or lower the sensors.

Results

In our tests, we found scale in Prospect to be generally accurate. That said, we found some differences between the Vive and Rift headsets in cases when the tracking sensors do not have a full view of the controllers.

In an ideal tracking configuration, scale in the center of the tracking space generally deviates by less than 0.7 cm (about ¼ inch). This was measured with a 1 m box and a 4 ft box. With a 2 m box it is difficult to keep both controllers within the best tracking area in the center of the space, and the error increases to as much as 2 cm.

The Vive and the Rift use different methods for determining the position of the floor. The Vive uses the position of controllers placed on the floor during room calibration to determine floor height, while the Rift asks the user to enter their height and tracks a controller placed at eye level. The Vive’s floor height may drift noticeably from session to session; this may be the result of a base station’s position changing slightly. With both headsets it is important to run room calibration if there is any question that the sensors or base stations have moved.

In conclusion, our tests show that it is possible to have a physically accurate VR experience in Prospect. For the most accurate experience, pay close attention to your headset and tracker configuration during room calibration. The Vive will give you better accuracy out of the box than the Rift, though the Rift is quicker to set up.

For more information on setting up the Oculus Rift or HTC Vive, see these guides or feel free to reach out to our team:

Scale Model Mode Update: Section, Scale, and Rotate

Here at IrisVR, we’re regularly prototyping the in-VR user experience as we design our software. In order to create useful and intuitive virtual tools, we often think about the end-to-end process of how a spatial concept develops into a completed building project. This includes the process of making physical scale models, which are effective at communicating complex space but time-consuming to keep up to date with iterative design changes.

In our latest update to Prospect, we reimagined our virtual equivalent of the physical model - Scale Model Mode - and enhanced its capabilities.

What is Scale Model Mode?

Scale Model Mode lets you view a virtual rendering of your model in the same way that you would view a traditional, physical model.

You can lift and move the scale model around space...

...zoom in on your model...

...and rotate.

Additionally, as part of our latest release, you can now section your model! This allows you to view your model from new perspectives and teleport directly into interior spaces (especially useful for large, multi-story projects).

You can section the model vertically...


...horizontally...

...or however you’d like!

It’s our goal to create tools that are easy to use, fit into your existing workflow, and make it easy for you to iterate upon your ideas. Want to learn more? Read our release notes for full details on our latest update.

Behind the Scenes at IrisVR: Designing VR Software

With our latest update to Prospect, I wanted to share our design process with you, our users, because your involvement is so essential to the evolution of our software.

Designing virtual reality software often requires approaches, tools and ways of testing that are different from traditional software development. At IrisVR we have an incredible team of researchers, designers, prototypers and developers to meet these challenges. As we tackle new problems, we use the steps outlined below to guide our design process and to find the most effective and powerful solutions.

Our users come from all types of backgrounds. There are people using VR to look at houses, skyscrapers, furniture, cars, shoes, toys, construction sites, data visualization, and so much more. We have 3D software professionals and amateurs, VR veterans and first timers. They work all over the world and work in companies of all sizes. It’s crucial that we consider all of them and set up a process that listens to their needs first.

1. Listen

Concentrate on the Users.
What are the tools our customers are requesting? What would make their workflow smoother and save them time? What will help them to design, create and communicate their vision? Recently, users expressed interest in ‘zooming in’ to their model, but what do they actually want? Why would this be useful and where would it be the most helpful?


2. Learn

Try Everything. 
Familiarity with the VR space is critical. We play video games in VR, on mobile, and on console platforms. We test out design, productivity, and creative software suites. We believe that every experience, from a clunky interface to a simple mobile app, can be helpful in improving our way of thinking. We observe the world around us to see what people do and how they do it. We have fun, document it, and refer to the lessons we learned throughout our design and development process. We learned that ‘zooming in’ on a screen may mean something very different in virtual reality.

3. Imagine

Consider a Variety of Unique Solutions.
We make sure to document and sketch every possible solution we can come up with that relates to the problem. It’s especially important to document solutions that may seem overly ambitious, or even impossible.


4. Experiment

Make Something.
We have to jump in and start somewhere. It’s time to dive in and make something simple and quick. If we are going to be designing an object, an interaction, a menu, a scene, a control, we have to start by seeing some element of it in virtual reality. It’s already time to add more ideas to the ‘Imagine’ list. We can also break down some of our early assumptions. It seemed obvious that objects in virtual reality should have gravity. But we were surprised what happened when they did. It turns out people really like to throw stuff in VR...

5. Iterate

Build It.
This is my favorite part. A lot of quick variations are made and tried. These are not full solutions, but we start to find the puzzle pieces that can be assembled later. There are easily more than 20 ways to simply ‘pick up’ an object in virtual reality. Some of them restrict direction, others allow rotation or remove gravity. We made a virtual room full of options and dove into virtual reality to try them out.

6. Prototype

Clean It Up.
This is where we consolidate the most successful aspects of our ideas so far. Based on tests, feedback, and observations, we narrow down the iterations to a few refined prototypes. These are prototypes that can be shown to people outside the company and that test more than one interaction at a time. We often find that the prototypes end up having their own advantages for different users. Designers who use Rhino, construction professionals who use Revit, and architects who use SketchUp may each prefer slightly different tools and features. It’s our job to refine the prototypes even further to accommodate each user.


7. Test

Let people break it.
There is a constant stream of users in our office who test out different features and give us helpful feedback. They try existing options and new prototypes, and we let them do the talking. We provide instructions and ask questions, but at the end of the day our guests usually have more than enough to say on their own. We push them to be honest and to make suggestions. More often than not, they think of things that didn’t even occur to us. Throwing things in virtual reality may be fun, but people usually want their stuff back.

8. All Done!

Just kidding.
Based on the feedback, the prototypes and our own expertise, it is time to make some important decisions and move forward. The feedback will determine if we explore a new set of ideas or refine and re-test. Maybe, just maybe, there is a feature that is ready to be built and shared. 

With the release of our new tools that allow you to translate, rotate and scale your model, we found the solution was a combination of existing solutions and new approaches. With every feature we look forward to the user feedback that helps make each tool even more valuable.

In the end, this is just a piece of the process that allows us to design and develop great software for designers, architects, engineers, construction professionals, and more. User feedback is the catalyst for each improvement to our software, and we ask you to send any ideas, feedback, requests, or questions directly to info@irisvr.com.