Desktop 2.0
Information Organization + Augmented Reality
Project Overview
A six-week concept project where I explored using Augmented Reality to extend the desktop computer experience. How can we let AR supplement, rather than replace, the rich tactile experiences of computing?
#AugmentedReality    #InteractionDesign
Breaking Down the Problem...
We have an incredible amount of information. We use numerous files, folders, and applications to organize this "stuff."
“Final Project V9”
We are terrible at naming, sorting, and organizing said "stuff." This can lead to many problems down the road—namely not being able to find things!
I’m Searching for...
Our computers have search functions to help us locate our files, but they rely heavily on human memory... which can be... poor.
The Essence of Desktop 2.0
What if our computers didn't rely so heavily on human memory?

Desktop 2.0 imagines an interface where your virtual desktop space adapts to the state of your current activity.
Project Initiatives
Concept Demo Video
A quick introduction to Desktop 2.0.
Design Process
In the very early stages of the project, I was unsure how I wanted to test and develop an AR experience. At that point, I had plenty of sketches of the kinds of experiences I wanted to experiment with...
Early Testing
I have some experience developing in Unity, so I decided to make my first prototype in VR with Google Daydream. I would pretend that virtual reality is REALLY REALITY so I could play with augmented reality properly. I thought I was being really clever, only to find out that this is how most folks prototype AR 😅 😂 🤣
Mistakes and Lessons Learned
I spent a lot of time setting up my first user testing session. My goal was to run participants through a short series of tasks related to finding files and navigating the augmented desktop experience. I ended up wasting a lot of time on mundane file-finding tasks when I should have focused more on initial reactions and impressions.
The Biggest Takeaway
The study didn't end up being a total waste: through this experience, I gained a bit more insight into the physical interactions that come with an AR system supplementing the traditional computing experience.

Mainly—people don't want to use a mouse and keyboard AND a game controller.
Final Prototype: More Mistakes and Lessons Learned
I decided to animate my final demo in Maya. Google Daydream was great for testing and getting a sense of scale, but it worked poorly as a long-term solution for sharing my project. Teaching myself animation software just for this project ended up being a huge mistake, but a useful learning experience overall.
Ambient Awareness
A core functionality of Desktop 2.0 is its ability to automatically resurface files that are relevant to your current workflow.
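To make the idea concrete, here is a minimal sketch of how that resurfacing could be scored, assuming relevance is just a blend of how recently a file was opened and whether it usually accompanies the app currently in focus. The fields, signals, and weights below are hypothetical placeholders, not part of the actual Desktop 2.0 design.

```python
# Hypothetical sketch of the "ambient awareness" idea: score files by how
# relevant they are to the current activity and surface the top few.
from dataclasses import dataclass, field
from datetime import datetime, timedelta


@dataclass
class FileRecord:
    path: str
    last_opened: datetime
    # Apps this file has historically been open alongside (illustrative signal).
    co_opened_apps: set[str] = field(default_factory=set)


def relevance(file: FileRecord, active_app: str, now: datetime) -> float:
    """Blend recency with how often the file accompanies the active app."""
    days_old = (now - file.last_opened) / timedelta(days=1)
    recency = 1.0 / (1.0 + days_old)          # decays toward 0 as the file ages
    context = 1.0 if active_app in file.co_opened_apps else 0.0
    return 0.6 * recency + 0.4 * context      # arbitrary example weights


def resurface(files: list[FileRecord], active_app: str, top_n: int = 3) -> list[str]:
    """Return the paths most likely to matter for the current workflow."""
    now = datetime.now()
    ranked = sorted(files, key=lambda f: relevance(f, active_app, now), reverse=True)
    return [f.path for f in ranked[:top_n]]
```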
What does search look like?
Even though Desktop 2.0 claims a magical ambient awareness of what you might be looking for, it was also really important to consider what the search experience might look like. I'd like to make a case here for voice search on the desktop. I know, most people find the Siris, Cortanas, and Assistants on their computers kind of annoying, but for the specific purpose of finding files, they could be our next best friend.
To Put it Simply...
Compared to traditional search, conversational voice search has so many more opportunities to understand which results to show the user, with minimal typing.
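As a rough illustration of why, here is a sketch of how a single conversational query could be unpacked into several search filters at once, something that would take deliberate operators or a lot of typing in a traditional search box. The parsing is deliberately naive and the filter names are assumptions, not a real system.

```python
# Illustrative only: one spoken sentence carries several filters at once.
# A real system would use proper natural-language understanding.
import re
from dataclasses import dataclass


@dataclass
class SearchFilters:
    file_type: str | None = None
    edited_within_days: int | None = None
    shared_with: str | None = None


def parse_query(query: str) -> SearchFilters:
    """Pull simple file-search filters out of a conversational query."""
    q = query.lower()
    filters = SearchFilters()
    if "spreadsheet" in q:
        filters.file_type = "spreadsheet"
    elif "presentation" in q or "slides" in q:
        filters.file_type = "presentation"
    if "last week" in q:
        filters.edited_within_days = 7
    elif "yesterday" in q:
        filters.edited_within_days = 1
    match = re.search(r"shared with (\w+)", q)
    if match:
        filters.shared_with = match.group(1)
    return filters


# parse_query("find the spreadsheet I edited last week that I shared with Sam")
# -> SearchFilters(file_type='spreadsheet', edited_within_days=7, shared_with='sam')
```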
Streamlining the Physical Experience
Let pre-existing user inputs, the computer's trackpad and the user's gaze, be the modes of input for interacting with virtual objects. My hypothesis is that, using a combination of trackpad gestures and the eye-tracking technology included in the AR headset, the user will be able to seamlessly transition between interacting with things in their tangible and intangible workspaces.
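Here is a minimal sketch of that hypothesis, assuming the headset reports a gaze point in the same view space as the virtual objects and the trackpad reports named gestures. Every object, gesture, and action name below is invented for illustration.

```python
# Sketch of the interaction hypothesis: eye tracking picks the target,
# and a familiar trackpad gesture acts on it. Names are assumptions.
from dataclasses import dataclass


@dataclass
class VirtualObject:
    name: str
    bounds: tuple[float, float, float, float]  # x, y, width, height in view space

    def contains(self, x: float, y: float) -> bool:
        bx, by, bw, bh = self.bounds
        return bx <= x <= bx + bw and by <= y <= by + bh


def handle_gesture(gesture: str, gaze: tuple[float, float],
                   objects: list[VirtualObject]) -> str:
    """Route a trackpad gesture to whichever virtual object the user is looking at."""
    target = next((o for o in objects if o.contains(*gaze)), None)
    if target is None:
        return "gesture ignored: no virtual object under gaze"
    actions = {"tap": "select", "two_finger_swipe": "dismiss", "pinch": "pull to desktop"}
    return f"{actions.get(gesture, 'unknown gesture')} -> {target.name}"


# Example: looking at a floating document while tapping the trackpad selects it.
doc = VirtualObject("Final Project V9", (0.2, 0.3, 0.1, 0.1))
print(handle_gesture("tap", (0.25, 0.35), [doc]))  # "select -> Final Project V9"
```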