LEAP into Action

Deploying immersive virtual reality experiences to mobile devices mounted in headsets like Google Cardboard and the Samsung Gear VR (or to dedicated headsets like the Oculus Rift) comes with its own set of challenges around user interaction. These days, tapping the device screen is second nature in mobile user interfaces. But once the phone is mounted inside a VR headset, tapping the screen is out of the question, which leaves the problem of inviting user interaction that is intuitive and consistent without pulling the user out of the experience and causing frustration.

One potential solution we have been dabbling in recently pairs the LEAP Motion with a VR headset. The LEAP Motion is a small sensor that detects hand movements using infrared light. When mounted on the outside of the headset, the input it detects manifests as 3D-modelled hands within the virtual reality experience. This gives users a real sense that their body is actually inside the virtual world and that they can interact with their environment. Something as second nature as hand gestures opens up a wide range of interaction within these experiences, like opening doors to transition between scenes or tapping buttons to explore the virtual world they are placed in.
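
To make that concrete, here is a minimal sketch of reading hand data with the Leap Motion SDK v2 Python bindings. This is not code from our demo: the grab-strength threshold and the on_grab handler are hypothetical stand-ins for whatever interaction an experience actually wires up.

```python
import sys

import Leap  # Leap Motion SDK v2 Python bindings


class GrabListener(Leap.Listener):
    """Fires an interaction callback when a tracked hand closes into a grab."""

    GRAB_THRESHOLD = 0.9  # grab_strength runs from 0.0 (open hand) to 1.0 (fist)

    def on_connect(self, controller):
        # Tell the tracking service the sensor is mounted on a headset,
        # so it optimizes for hands seen from the head-mounted vantage point.
        controller.set_policy(Leap.Controller.POLICY_OPTIMIZE_HMD)

    def on_frame(self, controller):
        frame = controller.frame()
        for hand in frame.hands:
            if hand.grab_strength > self.GRAB_THRESHOLD:
                self.on_grab(hand)

    def on_grab(self, hand):
        # Hypothetical stand-in for an in-world interaction,
        # e.g. grabbing a door handle to transition between scenes.
        side = "left" if hand.is_left else "right"
        print("Grab detected: %s hand at %s" % (side, hand.palm_position))


def main():
    listener = GrabListener()
    controller = Leap.Controller()
    controller.add_listener(listener)

    print("Press Enter to quit...")
    try:
        sys.stdin.readline()
    finally:
        controller.remove_listener(listener)


if __name__ == "__main__":
    main()
```

Using a listener means the tracking service pushes frames to us as they arrive; polling controller.frame() from the render loop is the usual alternative when the experience needs to control frame timing itself.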

Is it perfect? No, not yet. Some frustration can arise when the LEAP is unable to detect the particular hand motion you are giving it. Since the infrared light is emitted outward from the LEAP Motion, certain hand poses confuse it: when a user's hand is closed into a fist in front of the sensor, no fingers are visible, so the tracker can't tell what it is looking at. Still, this is definitely an area worth exploring further as we look to bridge the gap and suspend disbelief within these virtual worlds. Stay tuned for more advancements.
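
One mitigation worth sketching: the v2 API exposes a per-hand confidence score, so an experience can simply ignore poorly tracked hands rather than acting on bad data. Again a rough illustration against the SDK v2 Python bindings; the cutoff value and the handle_hand hook are hypothetical.

```python
import Leap  # Leap Motion SDK v2 Python bindings

MIN_CONFIDENCE = 0.5  # hypothetical cutoff; tune per experience


class ConfidenceGatedListener(Leap.Listener):
    """Ignores hands the tracker itself reports as poorly resolved."""

    def on_frame(self, controller):
        frame = controller.frame()
        for hand in frame.hands:
            # hand.confidence is the tracker's 0.0-1.0 estimate of how well
            # the raw sensor data matches its internal hand model; a closed
            # fist held toward the sensor tends to score low.
            if hand.confidence < MIN_CONFIDENCE:
                continue  # skip acting on this hand rather than misfiring
            self.handle_hand(hand)

    def handle_hand(self, hand):
        # Hypothetical hook for whatever the experience does with a
        # well-tracked hand (move the 3D hand model, hit-test buttons, ...)
        print("Tracking hand, confidence %.2f" % hand.confidence)
```

This listener is wired up to a Leap.Controller exactly as in the previous sketch.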


Here is a quick video showing a demo of the LEAP Motion and VR headset working together. The hand models used in this demo iteration are very rudimentary, but they can be swapped for more life-like models.