Note: As an undergrad in 2014, I successfully proposed a custom curriculum for my senior year, comprising academic and field research on interaction design for virtual reality, with design and prototype creation based on that research. After publishing in 2015, students across several disciplines cited my work as a reference in their own research; enough interest to justify the creation of a new, interdisciplinary program of study for up-and-coming VR/AR designers.
Thinking with your hands

Understanding users' expectations for task-oriented gestures

Virtual reality headsets from many competing manufacturers produce dazzling visuals and make interactive, immersive experiences accessible to consumers. There are myriad choices for user input and control in virtual reality, ranging from traditional keyboards, mice, and video game controllers, to purpose-built hand-held controllers, to voice and gestural interactions.

Gestural controls based on 3D imaging technologies, such as Microsoft Kinect and Leap Motion, free users from tracking devices or complex controllers that they may struggle to operate with their view obstructed by the headset. Gestural interaction in space is the natural evolution of the contact gestures commonly used on touchscreen devices. Actions such as a tap or swipe make sense in virtual reality for interactions that are analogous to those performed on touchscreens. Even novice users can intuitively adapt to gestural interactions in a 3D environment. Science fiction and popular media have provided a pervasive historical context for the foundations of meaning in a gestural language.

This study is designed to determine whether potential users of virtual reality share an understanding of gestural interaction. To establish a baseline for what gestures mean, users are asked to choose and perform gestures to accomplish specific tasks.

My role on the project:

As a Student: UX Researcher; UX Designer; VR Prototype Developer

Other roles on the project:

Academic Director; Faculty Mentor; Independent-Study Advisor

Video interviews are conducted both in person and remotely

Though not selected for this trait, most participants enjoy the science fiction entertainment genre, indicating the possibility of shared prior exposure to its visual metaphors. Most participants have little to no prior VR experience, but a few report significant experience.

Participants are given a scene and scenario, then asked to imagine and demonstrate the gestures they would use to accomplish tasks.  Responses are analyzed and grouped qualitatively to reveal trends and preferences.

This study comprised 26 volunteer participants, ranging in age from 18 to 50+ years, with a median age of 28 years. Participants were 69% male and 31% female. Only 2 participants own VR headsets, both for recreational purposes.

Responses for each task are grouped by similarity and tallied to reveal trends

As participants perform hand gestures in the air, their choices are recorded for later review.

Variants of similar gestures are grouped together, and participants’ actions are tallied.

Some tasks show clear trends, while others show the need for an additional A/B test.
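The grouping-and-tallying step described above can be sketched in a few lines of Python. The task names, gesture labels, and the 60% agreement cutoff below are all hypothetical placeholders for illustration; the actual study grouped responses qualitatively rather than with a fixed threshold.

```python
from collections import Counter

# Hypothetical grouped responses: task -> gesture group chosen by each
# participant (task names, labels, and counts are illustrative only).
responses = {
    "select item":  ["tap", "tap", "tap", "point", "tap", "grab"],
    "rotate item":  ["twist", "two-hand turn", "twist",
                     "swipe", "two-hand turn", "twist"],
    "dismiss item": ["swipe away", "push", "swipe away",
                     "grab-and-drop", "push", "swipe away"],
}

AB_THRESHOLD = 0.6  # assumed cutoff: below this, run a follow-up A/B test

for task, gestures in responses.items():
    tally = Counter(gestures)
    top, count = tally.most_common(1)[0]
    share = count / len(gestures)
    verdict = "clear trend" if share >= AB_THRESHOLD else "candidate for A/B test"
    print(f"{task}: {top} ({share:.0%}) -> {verdict}")
```

With this toy data, "select item" shows a clear leading gesture, while the other two tasks split between two variants and would be flagged for the follow-up A/B test.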

Follow-up A/B testing allows a new set of users to choose between leading options

A new group of 12 participants with no prior knowledge of this study are shown sketches of the same interaction goals used in the previous test.

Gesture choices are illustrated for users, and then demonstrated to ensure that responses are based on a correct understanding of the given options.

A simple tally of participants' responses clearly reveals preferences that were not apparent in the more open-ended conceptual format of the first test.

Study results

Intuitive gestures for direct content interaction

This animation demonstrates the tasks users performed, and the gesture that most participants favor as an intuitive means of achieving the desired content interaction.

Participants agree on gestures for a significant portion of the assigned tasks. Some of the chosen gestures could have been predicted, such as a tapping gesture to select a piece of content. Gestures for more complex tasks show less agreement among participants and are selected through subsequent A/B testing.

In some cases, participants demonstrated similar gestures using either one or both hands.  In these cases, the one-handed version is preferred, allowing it to be performed with either hand.

VR: The good and the bad

VR's capabilities come with limitations that can be overcome by conscientious design

Everest VR. © 2015 Solfar Studios

Virtual reality headsets can deliver spectacular visual experiences.

Under The Sea. © Shin Hosku, Gim Heyongin

Surroundings can be explored naturally with head and motion tracking.


In such immersive environments, user interactions designed around conventional input devices feel restrictive and disconnected.

Users expect to interact with virtual content as they would with physical objects

Xbox controller. Image © Scott Akerman

Although game controllers are familiar to many VR users, controller-based interactions prevent a complete sense of immersion in a virtual world.

STEM by Sixense.

Hand-held controllers with motion tracking allow users to interact as they would in the real world, to a point: these systems still confine most interactions to conventional buttons and switches.

Leap Motion VR. © Leap Motion

Systems that track hand movements from the headset’s perspective let users move and gesture naturally.  Hand motions are mapped into the simulation and user interactions with virtual content can mimic real-world interactions.

Health and safety

User comfort and safety must be design priorities in VR

© 2015 Cedar Point VR

Motion Sickness

Motion sickness is easy to induce in VR and can produce unpleasant effects for users. The risk of motion sickness can be reduced by eliminating unnecessary motion and permitting user locomotion only in direct response to user-controlled interactions.

© 2013 Virtuix Omni

User Locomotion

VR headsets obscure physical surroundings, presenting hazards to users who walk around virtual environments. User locomotion within VR should be discouraged by design unless specialized equipment is available.

© 2015 Frooxius

Central Focus of Interaction

Whenever possible, interactive content should be located centrally in the scene, within the user's grasp. Locomotion should occur only in direct response to user input, never initiated by the simulation.

Early conceptualization

Environment and interaction design is informed by prior research

A table provides a point of focus above which interactive objects can simultaneously feel contained and accessible to users.

Users may freely look around an environment, and without moving, can see most of the virtual content in a single view.

Interactive objects are sized and placed such that a user will be able to grasp and interact with them from a fixed position.

Gestures are chosen for task-based interactions

Basic tasks may be performed by hand gestures, quickly and intuitively.

Multiple gestures can be combined to achieve more complex tasks.

A set of gestures that builds on itself promotes deeper exploration.

For proof-of-concept prototyping, simple objects are sufficient to begin testing user interactions
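A proof-of-concept gesture check can start from nothing more than tracked fingertip positions. The sketch below detects a "pinch to grab" from the distance between thumb and index fingertips; the coordinates, the 3 cm threshold, and the frame format are assumptions for illustration, not any particular SDK's API. A real prototype would read these positions from the hand-tracking system each frame.

```python
import math

# Assumed threshold: fingertips closer than 3 cm count as a pinch.
PINCH_DISTANCE_M = 0.03

def distance(a, b):
    """Euclidean distance between two 3D points (x, y, z) in meters."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def is_pinching(thumb_tip, index_tip):
    """True when thumb and index fingertips are close enough to grab."""
    return distance(thumb_tip, index_tip) < PINCH_DISTANCE_M

# Example frame (hypothetical data): tips 1 cm apart -> pinch detected.
print(is_pinching((0.10, 1.20, 0.30), (0.10, 1.21, 0.30)))  # True
```

Composed checks like this one are how a small gesture set can build on itself: a pinch plus hand movement becomes a drag, and two simultaneous pinches moving apart become a scale gesture.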