Principles of Interaction Design

6 Principles of Leap Motion Interaction Design

Interaction design can be a delicate balancing act, especially when developing for VR.

In the process of building applications and various UX experiments at Leap Motion, we’ve come up with a useful set of heuristics to help us critically evaluate our gesture and interaction designs.

You can see these lenses in action in our Planetarium series, where we experimented with bringing together several different UI widgets. It’s important to note that these heuristics exist as lenses through which to critique and examine an interaction, not as hard and fast rules.

#1. Tracking consistency

The team at Leap Motion is constantly working to improve the accuracy and consistency of our tracking technology. That being said, there will always be limitations to any sensor technology, and the Leap Motion Controller is no exception. Spending the time early to make sure tracking is consistent for your particular interactions will save you headaches down the road.

When developing a motion or gesture, take the time to have multiple people perform the action while you watch the resulting data in the diagnostic visualizer. (Be sure to check out our quick guide to human-driven UX design.) Take note of inconsistencies between multiple people and multiple attempts at the motion by a single person. For instance, hands near the edge of the device’s field of view are harder to track, as is the side of the hand (versus the palm or back of the hand).

The Leap Motion API also exposes a “tracking confidence” level to show how accurate the tracking thinks it is at that moment. How consistent is tracking for your motions, given many people performing the motions many times in a real-world environment?
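
One way an application might use such a confidence signal is to trust tracking only after a short run of confident frames. The sketch below is illustrative only: the `ConfidenceGate` class, threshold, and window size are our own assumptions, not part of the Leap Motion API (which exposes only the per-frame confidence value itself).

```python
from collections import deque

# Illustrative sketch: gate interactions behind several consecutive
# confident frames, so one noisy reading never triggers an action.
class ConfidenceGate:
    """Trust tracking only after `window` consecutive confident frames."""

    def __init__(self, threshold=0.7, window=5):
        self.threshold = threshold
        self.recent = deque(maxlen=window)

    def update(self, confidence):
        """Record one frame's confidence; return whether tracking is trusted."""
        self.recent.append(confidence)
        return (len(self.recent) == self.recent.maxlen
                and min(self.recent) >= self.threshold)

gate = ConfidenceGate()
readings = [0.9, 0.95, 0.4, 0.9, 0.92, 0.91, 0.93, 0.94]
results = [gate.update(c) for c in readings]
# The single dip to 0.4 keeps the gate closed until five good frames
# accumulate again, so only the final frame is trusted.
```

A real application would feed this from the tracking loop and fall back to neutral UI feedback whenever the gate reports untrusted frames.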

#2. Ease of detection

Once you know the motion you’ve created has relatively consistent tracking, you’ll want to have a concept of how easy it is to detect. Are there obvious conditions that define the motion? How well is it separated from other things you might want to detect? Is it obvious when the motion has begun and ended?

On the surface, ease of detection might seem like a primarily technical concern, rather than being within the purview of design. In reality it, like many things, bleeds well into the space of design. For one, the easier the motions you’ve designed are to detect, the less time you’ll spend optimizing the detection code, and the more time can be spent improving the overall experience. Easier-to-detect motions will also have lower rates of false positive and false negative detections, making the application experience more usable.

Secondly, and more concretely, the sooner you can accurately detect the beginnings of a motion or gesture, the sooner your interface can provide the proper feedback and behaviors. This will lead to an application that feels more responsive, and makes people feel more in control. It also means you can provide more ways for people to adjust for errors and subtly modify their interactions to fit their particular use patterns.
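
A common way to make beginnings and endings unambiguous is hysteresis: the gesture starts above a high threshold and ends only below a lower one, so a signal hovering near a single boundary cannot flicker on and off. This sketch uses a made-up "pinch strength" signal; the function and threshold values are assumptions, not Leap Motion API calls.

```python
# Illustrative only: two-threshold (hysteresis) gesture detection.
def detect_gesture(samples, start=0.8, end=0.4):
    """Return (begin_index, end_index) pairs for detected gestures.

    A gesture begins when the signal rises to `start` or above, and ends
    only when it falls to `end` or below, so values hovering between the
    two thresholds never toggle the state frame-to-frame.
    """
    events, begin, active = [], None, False
    for i, s in enumerate(samples):
        if not active and s >= start:
            active, begin = True, i
        elif active and s <= end:
            events.append((begin, i))
            active = False
    return events

pinch = [0.1, 0.5, 0.85, 0.9, 0.6, 0.75, 0.3, 0.2]
print(detect_gesture(pinch))  # [(2, 6)]: the mid-gesture dip to 0.6 does not end it
```

Because the begin index is known as soon as the signal crosses the high threshold, the interface can start providing feedback immediately rather than waiting for the gesture to finish.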

#3. Occlusion

Occlusion from various motions commonly comes in two forms. The first, and simplest, is when something about the motion physically covers the sensor. When a person has to reach across their body and the sensor, their sleeve, arm, or jewelry (say, a large watch or a loose bracelet) can prevent the controller from getting a clear view of their hands — reducing tracking accuracy or preventing it entirely. If these sorts of actions are common in your application, you may consider changing your real-world-to-screen-space mapping, as discussed in Introduction to Motion Control.

The second form of occlusion is more subtle and can be particularly troublesome. When the controller can’t visibly see a part of the hand, it makes assumptions based on the data it has available and an understanding of how the human hand works. Often these assumptions prove quite accurate, but there are times when the system cannot reasonably provide highly accurate responses. This means that if your motion or interaction commonly involves occluded parts of the hand, the accuracy of tracking will be significantly reduced.

One hand covering another, movements of the fingers when the hand is upside down, movements of the fingers when the hand is sideways and off to one extreme side of the field of view, some motions when multiple fingers curl or come together — these can all result in this second type of occlusion. This also comes into play when the hand is presented side-on to the device, as a relatively small surface area is visible to the controller. This is also something that our tracking team is working to improve all the time.

In many cases, this comes down to testing your actions with the diagnostic visualizer in a variety of areas around the detectable field of view, and watching for inaccuracies caused by occlusion. The more that the gestures and motions used in your design can avoid situations that cause significant occlusion over an extended period of time, the more accurate and responsive your application will be.

#4. Ergonomics

As society has adopted computers more and more, we’ve come to understand that human bodies aren’t necessarily well-designed to be sitting at desks, typing on keyboards, and using mice for hours every day. Some companies have responded by making input devices which can be used in much more relaxed positions, and there are large research efforts underway to continually improve our posture and working environments.

Since we’re not designing a physical interface, our task as motion-controlled application makers is slightly different. As we are creating affordances and gestures, we have to consider how we’re asking users to move their bodies to perform interactions, and figure out whether those movements have the possibility of causing long-term harm or strain. Furthermore, we also need to see how tiring our interactions are, and whether they can be performed from comfortable positions.

As we saw in 4 Design Problems for VR Tracking (And How to Solve Them), the Hovercast VR menu uses splayed hands to ensure high tracking reliability and limit occlusion.

The most comfortable position for people to use most applications is with their elbows resting on the table or the arms of a chair. From this position, each hand moves in a sphere around its elbow. The wrist provides some radial range, but it’s extremely limited; with the elbow on the table, the wrist’s range of motion is also severely constrained. In particular, we must avoid repetitive wrist motions to prevent RSIs (repetitive strain injuries) in the carpals, such as carpal tunnel syndrome. Certain actions (such as rolling the right hand counterclockwise) are particularly difficult from this position, and may require users to lift their elbow.
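
As a rough illustration of that "sphere around the elbow" model, a hypothetical comfort check might treat reachable targets as points near the spherical shell a resting forearm sweeps. The function name, units (meters), and tolerance below are all illustrative assumptions, not measured ergonomic data.

```python
import math

# Hypothetical sketch: a target is "comfortable" if it lies near the
# spherical shell swept by a forearm pivoting around a resting elbow.
def within_comfortable_reach(target, elbow, forearm_length, tolerance=0.1):
    """True if `target` (x, y, z in meters) is near the forearm's sweep sphere."""
    return abs(math.dist(target, elbow) - forearm_length) <= tolerance

# A point roughly one forearm-length from the elbow passes the check.
print(within_comfortable_reach((0.3, 0.0, 0.1), (0.0, 0.0, 0.0), 0.35))  # True
```

A real design process would replace this with user testing, but even a crude model like this can flag UI targets that force users to lift their elbows.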

#5. Transitions

When considering transitions between motions, make sure you have a clear concept of what other motions someone is likely to perform using your application at any given moment. Knowing that set of motions, you’ll be able to better assess whether any of the transitions has a high probability of being problematic. There are two primary ways in which a transition can be an issue for your application experience.

Interaction overlap. The first is a situation in which two possible actions are too similar to each other. This can cause issues, both for people using the application and for gesture detection algorithms. Actions that are overly similar are difficult to remember and have a good chance of reducing the learnability of your application. Reserve actions that are similar to each other for situations in which the two actions have highly similar results.
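
One hypothetical way to catch interaction overlap early is to describe each gesture as a small feature vector and flag pairs whose similarity is high. The features, values, and gesture names below are purely illustrative, not any established gesture taxonomy.

```python
import math

# Illustrative sketch: compare gestures as feature vectors and flag
# pairs that may be too similar for users (or detectors) to separate.
def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical features: (palm speed, finger curl, lateral motion).
swipe = [0.9, 0.1, 0.8]
wave  = [0.8, 0.2, 0.9]
grab  = [0.2, 0.9, 0.1]

print(round(cosine_similarity(swipe, wave), 2))  # 0.99: likely to overlap
print(round(cosine_similarity(swipe, grab), 2))  # 0.31: clearly distinct
```

A pair scoring near 1.0 is a candidate for redesign, unless, as noted above, the two actions produce highly similar results.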

Awkward switching. The more subtle transition issue is awkward switching. In-air motions are highly subject to interpretation by the person making the gesture. Where a motion or gesture begins can have a lot of influence on how people tend to perform that motion. Without any hardware to provide direct feedback, people may perform actions differently. This can wreak havoc with your motion detection code.

Awkward switches can also cause ergonomic issues where people have to move in uncomfortable or overly exaggerated manners. An “initialization” or “resting” pose from which many actions begin can be a good way to reduce the challenge of dealing with awkward switching. Make a point to analyze and test the various interaction and motion transitions in your application. Look for places where people are confused, see where you and your testers are uncomfortable, and be aware of how different transitions impact how people perform particular motions.

The Arm HUD is triggered by flipping your arm so that the palm is facing towards the user, while buttons and sliders have distinct trigger states. Read more in Build-a-Button Workshop: VR Interaction Design from the Ground Up.

#6. Feedback

A lack of proper UI feedback can sink an otherwise well-designed interaction. When developing a new interaction, consider how you will provide feedback from the application to the person performing the gesture. The lack of hardware-based physical feedback in motion-based interactions leaves all the onus for communicating the state of the application (and the performance of the person using it) completely on the application’s user interface.

Consider how your interface will communicate whether an action can be done at all, what it will do, and how someone will know what caused a false positive or a false negative detection (so the person using your app can adjust their behavior to avoid it). At a minimum, the visual feedback for a motion interaction should communicate three things: Where am I now in terms of performing this interaction? Where do I need to be to complete this interaction? How far, and in what way, do I need to move in physical space to complete or cancel this interaction?
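
Those three questions can be sketched as a tiny feedback model for a one-dimensional interaction (say, pinch strength or reach depth). The function, field names, and completion threshold below are illustrative assumptions, not an established API.

```python
# Illustrative sketch: answer the three minimum feedback questions
# for a one-dimensional interaction signal.
def interaction_feedback(current, target, threshold=0.05):
    """Summarize interaction state for the UI layer."""
    remaining = target - current
    progress = min(max(current / target, 0.0), 1.0) if target else 1.0
    return {
        "progress": progress,                     # where am I now?
        "complete": abs(remaining) <= threshold,  # am I where I need to be?
        "remaining": remaining,                   # how far do I still need to move?
    }

state = interaction_feedback(current=0.6, target=1.0)
print(state["progress"], state["complete"])  # 0.6 False
```

Each field maps directly to a visual element: a progress ring, a target highlight, and a directional hint for the remaining motion.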

Further Reading

• What Do VR Interfaces and Teapots Have in Common?
• Designing VR Tools: The Good, the Bad, and the Ugly
• Build-a-Button Workshop: VR Interaction Design from the Ground Up
• 4 Design Problems for VR Tracking (And How to Solve Them)
• 5 Experiments on the Bleeding Edge of VR Locomotion

This post brings together elements from Daniel Plemmons and Paul Mandel’s articles Introduction to Motion Control and Designing Intuitive Applications. Be sure to check them out for more insights on these heuristics, intuitive app design, and cool real-world examples.

In addition to these heuristics, our designers also make good use of classic tools for critical interaction design analysis. There are more than a few copies of Nielsen’s “10 Usability Heuristics” taped to our office walls. If you’re not familiar with these critical lenses, or the strategy of heuristic analysis, check out these links.

Originally published at blog.leapmotion.com on May 17, 2015.