A variety of VR prototypes built for the Windows Mixed Reality platform.
Microsoft
2 Years • Aug 2016 - Aug 2018
Prototyping, Gameplay Programming, Technical Art
In October 2017, the Windows 10 Fall Creators Update introduced the new Windows Mixed Reality (WMR) immersive headsets and added virtual reality (VR) support to Windows. This was Microsoft's first foray into the VR space after shipping the HoloLens a year prior.
I was part of the Mixed Reality design team that had just finished work on the HoloLens and shortly thereafter transitioned into pre-production efforts for VR.
Within the Mixed Reality design team, there was an even smaller prototyping team. As a member of that team, I was responsible for creating prototypes that explored new designs for the Mixed Reality platform. One of our challenges was to adapt the Holographic Shell from the HoloLens to VR headsets running Windows.
I've selected an assortment of prototypes from my time on the Mixed Reality design team that best represent my work.
We needed to create our own version of 6DoF (six degrees of freedom) controller teleportation for the WMR shell and all of our prototypes. Our intent was to mimic Valve's VR locomotion methods and build a Unity demo that would run on SteamVR.
🎯The Goal: Create a functioning teleportation system and start to explore different ideas for locomotion in VR.
This was an essential prototype because we needed teleportation mechanics to traverse space more easily in VR. While smooth movement (walking) was possible, it could cause motion sickness. The prototype features:
The teleportation system created for this prototype was reused in all of our future VR prototypes, and the core functionality made its way into the shell. After this prototype, there were several others that pushed the teleportation mechanics further, and the final version received a great deal of visual design polish.
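Arc-based controller teleportation like this is commonly implemented by sampling a ballistic path from the controller and checking where it lands. The sketch below is my own illustration of that general technique, not the shell's actual implementation; all names and step sizes are assumptions.

```python
# Hypothetical sketch of arc teleportation targeting: sample a parabolic
# path from the controller and return the first point that reaches the floor.
GRAVITY = -9.81  # m/s^2, applied to the arc's vertical velocity

def arc_teleport_target(origin, direction, speed=8.0, step=0.05,
                        max_steps=200, floor_y=0.0):
    """Sample a ballistic arc; return the landing point, or None if no hit."""
    x, y, z = origin
    dx, dy, dz = (c * speed for c in direction)
    for _ in range(max_steps):
        # Advance one time step (simple Euler integration), then apply gravity.
        x += dx * step
        y += dy * step
        z += dz * step
        dy += GRAVITY * step
        if y <= floor_y:          # crossed the floor plane: valid teleport target
            return (x, floor_y, z)
    return None                   # arc never reached a walkable surface
```

In a real engine the per-step check would be a raycast against scene geometry rather than a flat floor plane, so the arc can land on platforms and reject walls.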
When users first launch the Windows Mixed Reality Portal app, they are taken through a device setup process and an introductory tutorial, called the "First Experience" or the out-of-box experience (OOBE).
The First Experience was designed to have a final portal that would transport users to their home environment (known as the Cliffhouse). I was asked to work on pre-visualization for this final portal.
🎯The Goal: Create several different VFX explorations for the First Experience's final portal.
I collaborated with Scott Petill, who designed the entire user flow of the First Experience. It was meant to be a self-contained experience that teaches users how to interact and get around in VR.
I created a few different visual effects (VFX) concepts that experimented with the look of the First Experience's final portal. I also worked on the interaction and animation that occurs when a user walks into the portal.
The First Experience that's currently accessible in the Mixed Reality Portal has a final portal that's fairly reminiscent of the ones that I created for the prototype. You can see this final result in the video below:
The purpose of this prototype was to evaluate the use of articulated hand tracking to touch and interact with virtual objects. We wanted to figure out how much feedback, or assistance, the shell should give when placing objects, and what level of precision we could expect from hand tracking.
🎯The Goal: Create a 3-game hand tracking prototype that has difficulty modes with varying amounts of feedback and assistance.
The design for this prototype was created by Scott Petill. It consists of three games with easy, normal, and hard modes that respectively adjust the amount of feedback and assistance given to users. Each game was designed to support grabbing, holding, releasing, and nudging (pushing) objects.
Interacting with objects, or holding them near their objective, triggers both visual and audio feedback. To assist users, objects that are released within proximity of their target destination automatically snap to it.
We used the Leap Motion for hand tracking in this prototype because it already had support for Unity. I modeled and set up all of the assets to provide an adjustable level of feedback (color and audio) and assistance (object snapping) when manipulating objects.
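The snap-to-target assistance described above can be reduced to a distance check on release, with the snap radius acting as the difficulty tunable. This is a minimal sketch of that idea; the radii and mode names are illustrative values of mine, not the prototype's actual tunables.

```python
import math

# Hypothetical sketch of the release-snap assist: on release, an object
# within a difficulty-dependent radius of its target snaps into place.
SNAP_RADIUS = {"easy": 0.20, "normal": 0.10, "hard": 0.04}  # meters

def on_release(obj_pos, target_pos, difficulty):
    """Return the object's final position: snapped if close enough to target."""
    dist = math.dist(obj_pos, target_pos)
    if dist <= SNAP_RADIUS[difficulty]:
        return target_pos        # assist: snap to the objective
    return obj_pos               # no assist: object stays where released
```

The same distance value can also drive the visual and audio feedback, fading in a highlight color as the object approaches its target.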
The prototype is made up of three mini-games that require increasing levels of hand precision:
The prototype was handed over to user research, and the learnings from that carried over to the hand tracking and instinctual interactions developed for the HoloLens 2. Researchers found that:
This project was started during a time when the team was working on multi-user experiences. There was a desire to revisit the shell's UI, which just showed apps, and expand it to include contacts (people or friends), events (meetings), places, groups, and invites. We were asked to take the Start Menu and push it in a new direction, where people, or social interactions, were the primary focus.
🎯The Goal: Design and prototype a new 3D Start Menu that expands upon its base functionality and puts people first.
Before designing the new menu, I spent some time gathering references and looking around to see what concepts other teams had already come up with.
We created a number of sketches and mockups trying to figure out what the menu should look like. Special attention was paid to the layout of the contacts list. We also began to lean towards encapsulating the UI in volumetric (3D) bubbles.
The People First Menu was designed around the friend invite scenario, where a user can invite their contacts into a group, change the group's current activity (app or place), and send notifications to group members.
I was responsible for designing and modeling the menu. I wanted to embrace VR by creating a more volumetric UI that supported direct interactions. Additional effort went into simplifying the menu and aligning it with the visual style of the shell.
I worked with Jonathan Palmer on this prototype. It was built as a proof of concept, where there's no real data behind its functionality. Our goal was to see how the volumetric UI looks and feels to interact with. Users can send invites to people by dragging and dropping them into an activity, meeting, or group (the physical tray).
We hoped to use our learnings from this project to guide the shell's UI redesign. Unfortunately, most of our team's multi-user efforts never went anywhere, but I did eventually move over to the AltspaceVR (a social VR app) team, where I was able to have an impact on their UI instead.
Another team was working on a new mixed reality app that would be capable of streaming a user's desktop to a window (i.e. portal or slate) in VR. They wanted to see how it felt to move mixed reality apps between the 3D environment and the 2D desktop window. I was asked to turn their concepts into a working prototype.
🎯The Goal: Create a prototype that allows users to move apps between the mixed world and the virtual desktop portal.
I collaborated with Noe Barragan, the designer on this project. In the designs, if an app is dragged out of the desktop window, it will be converted into a 3D app slate. If an app has no VR capabilities, it will remain within the virtual desktop and provide negative feedback. The opposite is true for VR apps that don't support the desktop.
This prototype imitates some of the virtual desktop's functionality. Users can spawn apps into the mixed world and freely move them in and out of the desktop window. Certain apps were set up as edge cases, where they're unable to enter or leave the desktop.
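The edge cases above boil down to a capability check when an app is dragged across the desktop boundary. Here is a minimal sketch of that decision; the capability flags and feedback values are my own invention for illustration, not the prototype's actual data model.

```python
# Hypothetical sketch of the drag-and-drop capability check: an app may only
# cross between the desktop window and the 3D world if it supports the
# destination; otherwise the move is refused with negative feedback.
def try_move(app, to_world):
    """Return (moved, feedback) for a drag across the desktop boundary."""
    if to_world and not app["supports_vr"]:
        return False, "shake"      # desktop-only app stays in the window
    if not to_world and not app["supports_desktop"]:
        return False, "shake"      # VR-only app stays in the mixed world
    return True, "ok"              # app converts to the destination's form
```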
The team working on the virtual desktop app was able to make use of my prototype, but they never ended up shipping the product. Fortunately, other solutions already exist that let you see your desktop in VR, like the Virtual Desktop app.
We needed to build a chaperone system to prevent users from bumping into real-world obstacles as they navigate virtual environments. The chaperone is a boundary around the area that's safe for a user to walk around in. I was asked to build a prototype that explores the visualization of this chaperone.
🎯The Goal: Create a prototype that implements the initial design of the chaperone and includes tunables for adjusting it.
The chaperone's visual design was created by Rudy Vessup. He provided me with the textures and animations seen in the clip below. The video demonstrates an animation that plays whenever a user's hand gets too close to the chaperone.
In this prototype, the chaperone is activated as the user approaches the edge of their space, fading in as they get closer to it. If the user is moving quickly, the chaperone responds in turn, rapidly fading in to help them avoid any obstacles.
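The fade behavior described above can be expressed as an opacity function of distance to the boundary, with the user's speed widening the fade-in range so fast movers see the chaperone sooner. This is a hedged sketch of that idea; the distances and speed scaling are illustrative, not the shell's actual tunables.

```python
# Hypothetical sketch of the chaperone fade: opacity ramps up as the user
# nears the boundary, and a higher approach speed pushes the fade-in
# threshold outward so the warning appears earlier.
def chaperone_alpha(dist_to_boundary, speed, fade_start=1.0, speed_scale=0.5):
    """Return boundary opacity in [0, 1] from distance (m) and speed (m/s)."""
    start = fade_start + speed * speed_scale   # fade begins farther out when fast
    if dist_to_boundary >= start:
        return 0.0                             # safely inside: chaperone hidden
    return min(1.0, 1.0 - dist_to_boundary / start)
```

In practice this value would be smoothed over a few frames and fed to the boundary material's alpha so the grid fades rather than pops.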
The chaperone became an important safety tool in our shell, allowing users to enjoy their VR experience without worrying about running into objects around them. It received a couple of iterations after this prototype before making its way into the shell.
Another team asked me to create a quick menu prototype that would help them test and validate their ideas. The quick menu was designed to pause a user's VR experience and overlay on top of it, providing a simple set of options and system controls.
🎯The Goal: Implement different quick menu designs to help the team get a feel for which designs they like the most.
I worked with Noe Barragan on the prototype: he created the different quick menu designs, and I set them up in Unity. This prototype features several different interactive volume sliders and alternative layouts for the quick menu (not shown in the video). I used a 360° video to demonstrate what happens when a user opens the quick menu, pausing their current VR app.
While I don't know what the other team did with this prototype, the current shell does feature a quick actions menu that lets users record videos, take pictures, change the volume, access SteamVR, and travel to their Mixed Reality Home.
During the development of Windows Mixed Reality, the team held a week-long level jam to brainstorm ideas for new VR environments. We were asked to concept, model, or prototype these environments and present them to the team.
🎯The Goal: Create a prototype that demonstrates a unique environment concept.
For my environment, I wanted to build a home space that lives on a small planet, looking out into space. One half of this planet would sit in sunlight, and the other would be encased by darkness. The idea was predominantly inspired by Super Mario Galaxy and Kaiba (an anime).
I created some simple models and threw them together in Unity with an earth backdrop borrowed from the Unity Asset Store. Since there wasn't much time, I just wanted to convey the basic idea as quickly and simply as possible.
For the prototype, a lot of my focus went into reconfiguring our VR locomotion system to support both arc and straight-line teleportation. I also modified the teleportation to align with any surface normal, mimicking Mario's movement in Super Mario Galaxy.
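Aligning the player to an arbitrary surface normal is usually done by computing the rotation that takes the current up vector onto the normal. The sketch below shows that math in axis-angle form; it is my own illustration of the general technique, not the prototype's code (which would use the engine's quaternion utilities).

```python
import math

# Hypothetical sketch of surface-aligned teleportation: after landing, rotate
# the player's "up" onto the surface normal, Super Mario Galaxy style.
def axis_angle_to_align_up(up, normal):
    """Return (axis, angle) rotating unit vector `up` onto unit `normal`."""
    # The rotation axis is perpendicular to both vectors (cross product),
    # and the angle comes from their dot product.
    axis = (up[1] * normal[2] - up[2] * normal[1],
            up[2] * normal[0] - up[0] * normal[2],
            up[0] * normal[1] - up[1] * normal[0])
    dot = max(-1.0, min(1.0, sum(u * n for u, n in zip(up, normal))))
    return axis, math.acos(dot)
```

In Unity this whole function collapses to a single `Quaternion.FromToRotation(up, normal)` call applied to the player rig after the teleport.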
Nothing became of this prototype, but my team had fun with it and I was happy with what I created in just a week.
When the design team started working on multi-user experiences, they also began research into user avatars (how people are represented in VR). One of these research proposals was for abstracting a user's state and representing it through their avatar's animations.
🎯The Goal: Create a prototype that communicates a user's state through their avatar's animations.
For the prototype, I rigged an Xbox avatar and gave it IK controls that would allow users to puppet it. I also created an additional avatar that would play different animations based on its current state. Along the wall, I added buttons for changing the avatar's state and toggling parts of their body on or off.
We wanted to see if the animations would be useful for covering up certain situations, like when a user goes idle and their IK tracking puts their avatar into an awkward position.
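At its core, this is a small state machine that decides whether the avatar is driven by live IK tracking or by a canned animation that covers for it. A minimal sketch under my own assumptions (the state names and pose representation are illustrative):

```python
# Hypothetical sketch of the state-driven avatar: while the user is active,
# IK tracking puppets the avatar; for idle or other states, a looping
# animation takes over so the avatar never holds an awkward frozen pose.
def choose_avatar_pose(state, ik_pose, animations):
    """Return the pose source that should drive the avatar for this state."""
    if state == "active":
        return ik_pose                          # live IK from the headset/controllers
    # Idle, away, etc.: fall back to that state's animation, or a default idle.
    return animations.get(state, animations["idle"])
```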
Unfortunately, our team split up, so nothing really became of this prototype, but some of us moved over to the AltspaceVR team, where we were able to continue the work on avatars.
Being a part of the Windows Mixed Reality design team was a great experience. It was nice to work on VR again after spending two years on the HoloLens. I enjoyed having the opportunity to create a variety of prototypes, each exploring some sort of new interaction.
Ultimately, I was able to help the team ship the WMR shell as a part of the Windows 10 Fall Creators Update.