Experience Designer
scene 8.JPG

Mixed Reality

AR music.jpg

Mixed Reality Music case study

Duration: 10 weeks

Role: 3D modeling and animation

Members: Matt Imus, Derrick Ho, Nirawit Jittipairoj

Tools: HoloLens, After Effects, Cinema 4D, Premiere Pro, Microsoft Maquette

 

Overview

This project focused on the future potential of Mixed Reality for interaction design. Our design explores the possibility of interacting with a product that has multiple levels of complexity. The end result is a solution that lets the user quickly change a song, but also summon more complicated controls when they want to dive deeper into the experience.

 

 

Research

As interactions become more complex, we’ve moved away from physical interfaces (control panels with knobs and buttons) to software interfaces. Software allows users to engage more deeply with a product and provides an opportunity to simplify a complex interaction into manageable layers on a digital screen. Physical interfaces, on the other hand, are approachable, tactile, and give immediate feedback. Users can turn a knob on a radio and the needle moves precisely.

With the rise of the Internet of Things, our homes are filled with physical objects that carry more complicated interactions. But these physical objects are no longer interacted with directly; users turn to their phones as the middleman. Although the phone can simplify the complexity associated with the product, it pulls users away from the familiarity of physical engagement.

By studying Mixed Reality, we looked to bring the familiarity of physical interaction to a digital system. Our guiding principle was that although the UI elements may be made of light, they must behave like real objects.
By extending the metaphor of physical objects to the user interface, users intuitively understand how to interact. A UI based on objects is native to the medium and superior in use: no more clumsy air taps in an abstracted 2D interface.

braun.jpg
touch.jpg
 

Prototype

Prototyping for AR is incredibly hard. Common prototyping tools are designed for 2D screens, while the technology available for AR is complex and prevents the quick iteration that rapid prototyping requires.

To mitigate this problem, we went low fidelity and sketched on paper.
Paper prototyping allowed us to get our ideas out quickly and simulate an AR experience by moving paper through space with our hands.

Through our paper sketches and endless discussions, we were able to narrow down our ideas. Once we established a plan, we each broke off into our own disciplines to push the final design forward.

paper1.jpg
 

Virtual Reality Prototype

My role was to focus on the 3D model and animation of the solution. Before diving into the final model in Cinema 4D, I wanted to try a new prototyping tool called Microsoft Maquette. Maquette is a virtual reality tool designed to quickly mock up content in a virtual space, and I pushed it to see whether it could also mock up Augmented Reality. It let me turn our paper sketches into higher fidelity visuals and make design decisions quickly before jumping into a full 3D program. From there, we created 3D models to bring into the HoloLens.

HoloLens Prototype

Although extremely rough, bringing a model onto the device helped drive design decisions. We simply dropped a model into Unity and exported it to the HoloLens. Initially, we had the UI raised above the user, similar to the VR prototype above. We quickly realized that keeping an arm raised is exhausting. Bringing the design lower means users no longer have to raise their arms; they can simply interact within their own arm space.
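To make the "bring the design lower" decision concrete, here is a minimal Unity sketch of how a hologram could be kept in front of the user at arm height rather than overhead. The script name and offset values are hypothetical, chosen only to illustrate the idea, not the values used in our prototype.

```csharp
using UnityEngine;

// Illustrative sketch: the hologram follows the user's head position but
// sits at roughly arm height instead of floating overhead.
public class ArmSpaceFollower : MonoBehaviour
{
    [SerializeField] private float forwardOffset = 0.45f; // metres in front of the user (assumed)
    [SerializeField] private float heightOffset = -0.35f; // metres below eye level (assumed)

    private void LateUpdate()
    {
        Transform head = Camera.main.transform; // on HoloLens, the camera tracks the user's head

        // Flatten the gaze direction so the panel doesn't dip when the user looks down at it.
        Vector3 flatForward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;

        transform.position = head.position + flatForward * forwardOffset + Vector3.up * heightOffset;

        // Keep the panel facing the user.
        transform.rotation = Quaternion.LookRotation(transform.position - head.position, Vector3.up);
    }
}
```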

6.JPG
 

 

Final Design

The final solution involves gestural inputs with a ‘pull-out’ dashboard.
Users can look at the speaker and summon a larger set of controls that lets them see their queue and library. Our UI is designed to present music as physical albums. By adding dimension, we hope to evoke grab-and-drop interactions commonly found with real physical items. When users have finished with the more complicated actions associated with their music, they can push the UI away, letting it return to their device.
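A rough sketch of the summon/dismiss behaviour described above is shown below. It uses a simple head-gaze raycast in Unity; the script, field names, and distance are assumptions for illustration, and a production build would rely on the platform’s own gaze and gesture APIs rather than this hand-rolled check.

```csharp
using UnityEngine;

// Illustrative sketch: when the user's gaze lands on the speaker, the expanded
// dashboard appears; Dismiss() (wired to a "push away" gesture) hides it again.
public class GazeSummonedDashboard : MonoBehaviour
{
    [SerializeField] private Collider speakerCollider;   // collider on the speaker's hologram proxy (assumed)
    [SerializeField] private GameObject dashboard;       // queue + library panel, hidden by default (assumed)
    [SerializeField] private float maxGazeDistance = 3f; // metres (assumed)

    private void Update()
    {
        Transform head = Camera.main.transform;

        // Simple head-gaze raycast toward whatever the user is looking at.
        if (Physics.Raycast(head.position, head.forward, out RaycastHit hit, maxGazeDistance)
            && hit.collider == speakerCollider)
        {
            dashboard.SetActive(true);
        }
    }

    // Called by the gesture handler when the user pushes the UI away.
    public void Dismiss()
    {
        dashboard.SetActive(false);
    }
}
```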

Findings

This project proved to be challenging for all of us. None of us had designed for Augmented Reality before, so we struggled to identify new metaphors that would align with the technology. Given more time, we would love to improve our final video, as the motion currently doesn’t align with the action. For me, this project sparked a lot of excitement and love for AR. AR provides an opportunity to combine my industrial design background with my love of digital design.

AR music.jpg