How can we build a multi-modal tool for people with limited mobility in their arms to create art in Virtual Reality?
I focused on rapid prototyping and interaction design in this project:
- I prototyped the eye-tracking tool experience in Virtual Reality using a laser pen attached to a baseball cap.
- I researched users' interactions and emotions to design interactions tailored to users' goals and emotional needs, creating a consistent end-to-end drawing experience.
We started with users who have cerebral palsy (a group of disorders affecting muscle coordination and body movement) and limited mobility in their arms.
Throughout the research, we found that drawing is, universally, an emotional experience. Users tend to use simple vocabulary to express their feelings, such as when they finish a drawing or need someone's help. On the other hand, new technology can be intimidating and requires extra attention to human ergonomics and the environment. Technology adoption is fairly low: some users rely on mobile phones mainly for communication. In the interviews, we also showed users a clip of a virtual reality film; throughout the process, most users asked for a specialist to be present. In terms of ergonomics, some users have limited arm mobility and tire quickly.
- How can we bridge the gap between the traditional pen-and-paper mindset and drawing in space?
- How can we create a safe and comfortable environment, within an isolated experience, that accommodates human ergonomics?
2D Rapid Prototype
Our rapid video prototype helped us understand that more research was needed: we still had open questions about how to adapt users' 2D drawings to drawing in space while keeping them feeling safe and comfortable.
User Testing I
We decided to build a paper prototype and conduct user testing. The idea was to build something simple that translated our design. To simulate a realistic Virtual Reality testing environment, we used a laser pen as a stand-in for eye-tracking technology.
Post User Testing I - Persona
We created personas to help us recognize our core users and evaluate whether our solution scales to help users across the spectrum. Below is an example persona.
Post User Testing I - Interaction Journey
In order to create a safe experience, we found that users felt comfortable trying new things when they were in a secure environment. We wanted to understand the relationship between building trust and human interaction.
Based on our research data, we mapped out the user's interaction journey to understand their emotional triggers and how they interact with other people.
Post User Testing I - Emotion Journey
To dive deeper into how users felt while drawing, we asked them to describe their feelings at every step of the process.
To go from 2D to 3D, we had to consider human ergonomics, motion sickness, and peripheral perspective in the immersive environment. Below are some examples of our Virtual Reality sketches.
We mapped out each tool's interaction within the system. We also prototyped the drawing experience in Unity. The video below is a recording of the drawing experience in Virtual Reality.
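As a rough illustration of the drawing mechanic we prototyped, each frame the viewer's gaze can be cast as a ray and its intersection with a drawing surface appended to the current stroke. The sketch below is a minimal, engine-agnostic version in Python; all names are illustrative and not taken from the actual Unity project.

```python
# Minimal sketch of gaze-based stroke drawing (hypothetical names;
# the real prototype was built in Unity).

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Return the point where the gaze ray hits the drawing plane, or None."""
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:  # ray is parallel to the plane
        return None
    diff = [p - o for p, o in zip(plane_point, origin)]
    t = sum(d * n for d, n in zip(diff, plane_normal)) / denom
    if t < 0:  # the plane is behind the viewer
        return None
    return tuple(o + t * d for o, d in zip(origin, direction))

class Stroke:
    """Accumulates gaze samples into a polyline while drawing is active."""
    def __init__(self):
        self.points = []

    def sample(self, head_position, gaze_direction, plane_point, plane_normal):
        hit = ray_plane_intersection(head_position, gaze_direction,
                                     plane_point, plane_normal)
        if hit is not None:
            self.points.append(hit)

# Example: viewer at the origin looking down +z at a plane two meters away
stroke = Stroke()
stroke.sample((0, 0, 0), (0, 0, 1), (0, 0, 2), (0, 0, -1))
# stroke.points now holds one point on the drawing plane: (0.0, 0.0, 2.0)
```

In the Unity prototype the same idea maps to casting a ray from the headset's forward vector and feeding the hit points into a line renderer; the version above only captures the geometry of that interaction.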
User Testing II
After we created an interactive prototype, we conducted a second round of user testing. Here are some of the user testing insights:
- When designing any interaction in immersive space, we should account for the fatigue caused by a heavy headset.
- The app should direct users' attention to where it is needed.