User Centered Design and Evaluation
(lecture slide transcript, posted 21-Dec-2015)

1

User Centered Design and Evaluation

2

Overview

• My evaluation experience
• Why involve users at all?
• What is a user-centered approach?
• Evaluation strategies
• Examples from “Snap-Together Visualization” paper

3

Empirical comparison of 2D, 3D, and 2D/3D combinations for spatial data

4

Development and evaluation of a volume visualization interface


5

Collaborative visualization on a tabletop


6

Why involve users?


7

Why involve users?

• Understand the users and their problems
  – Visualization users are experts
  – We do not understand their tasks and information needs
  – Intuition is not good enough

• Expectation management & ownership
  – Ensure users have realistic expectations
  – Make the users active stakeholders


8

What is a user-centered approach?

• Early focus on users and tasks
• Empirical measurement: users’ reactions and performance with prototypes
• Iterative design


9

Focus on Tasks

• Users’ tasks / goals are the driving force
  – Different tasks require very different visualizations
  – Lists of common visualization tasks can help
    • Shneiderman’s “Task by Data Type Taxonomy”
    • Amar, Eagan, and Stasko (InfoVis05)
  – But user-specific tasks are still the best


10

Focus on Users

• Users’ characteristics and context of use need to be supported

• Users have varied needs and experience – E.g. radiologists vs. GPs vs. patients


11

Understanding users’ work

• Field studies
  – May involve observation, interviewing
  – At user’s workplace
• Surveys
• Meetings / collaboration


12

Design cycle

• Design should be iterative
  – Prototype, test, prototype, test, …
  – Test with users!
• Design may be participatory


13

Key point

• Visualizations must support specific users doing specific tasks

• “Showing the data” is not enough!


14

Evaluation


15

How to evaluate with users?

• Quantitative experiments
  – Clear conclusions, but limited realism
• Qualitative methods
  – Observations
  – Contextual inquiry
  – Field studies
  – More realistic, but conclusions less precise


16

How to evaluate without users?

• Heuristic evaluation
• Cognitive walkthrough
  – Hard – tasks ill-defined & may be accomplished many ways
    • Allendoerfer et al. (InfoVis05) address this issue
• GOMS / User Modeling?
  – Hard – designed to test repetitive behaviour


17

Types of Evaluation (Plaisant)

• Compare design elements
  – E.g., coordination vs. no coordination (North & Shneiderman)
• Compare systems
  – E.g., Spotfire vs. TableLens
• Usability evaluation of a system
  – E.g., Snap system (N & S)
• Case studies
  – Real users in real settings
  – E.g., bioinformatics, e-commerce, security


18

Snap-Together Vis

Custom coordinated views


19

Questions

• Is this system usable?
  – Usability testing
• Is coordination important? Does it improve performance?
  – Experiment to compare coordination vs. no coordination


20

Usability testing vs. Experiment

Usability testing
• Aim: improve products
• Few participants
• Results inform design
• Not perfectly replicable
• Partially controlled conditions
• Results reported to developers

Quantitative experiment
• Aim: discover knowledge
• Many participants
• Results validated statistically
• Replicable
• Strongly controlled conditions
• Scientific paper reports results to community


21

Usability of Snap-Together Vis

• Can people use the Snap system to construct a coordinated visualization?
• Not really a research question
• But necessary if we want to use the system to answer research questions
• How would you test this?


22

Critique of Snap-Together Vis Usability Testing

+ Focus on qualitative results
+ Report problems in detail
+ Suggest design changes
- Did not evaluate how much training is needed (one of their objectives)
• Results useful mainly to developers


23

Summary: Usability testing

• Goals focus on how well users perform tasks with the prototype
• May compare products or prototypes
• Techniques:
  – Time to complete task & number & type of errors (quantitative performance data)
  – Qualitative methods (questionnaires, observations, interviews)
  – Video/audio for record keeping
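The quantitative performance data listed above (task time plus number and type of errors) can be captured with a small logging helper. This is a minimal sketch — the `TaskLog` class and its field names are invented for illustration, not taken from the slides or the paper:

```python
import time

class TaskLog:
    """Record one participant's performance on one usability-test task."""
    def __init__(self, participant, task):
        self.participant = participant
        self.task = task
        self.errors = []      # description of each error as it is observed
        self._start = None
        self.elapsed = None   # task completion time in seconds

    def begin(self):
        self._start = time.perf_counter()

    def error(self, description):
        self.errors.append(description)   # e.g. "opened wrong menu"

    def finish(self):
        self.elapsed = time.perf_counter() - self._start

# Usage: one (trivially short) task for a hypothetical participant.
log = TaskLog("P01", "construct coordinated view")
log.begin()
log.error("opened wrong menu")
log.finish()
print(log.elapsed >= 0, len(log.errors))  # → True 1
```

The qualitative techniques (questionnaires, observations, interviews) would be recorded separately; this only covers the timed, countable part.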


24

Controlled experiments

• Strives for
  – Testable hypothesis
  – Control of variables and conditions
  – Generalizable results
  – Confidence in results (statistics)


25

Testable hypothesis

• State a testable hypothesis
  – This is a precise problem statement
• Example:
  – (BAD) 2D is better than 3D
  – (GOOD) Searching for a graphic item among 100 randomly placed similar items will take longer with a 3D perspective display than with a 2D display.


26

Controlled conditions

• Purpose: knowing the cause of a difference found in an experiment
  – No difference between conditions except the ideas being studied
• Trade-off between control and generalizable results


27

Confounding Factors (1)

• Group 1: Visualization A in a room with windows
• Group 2: Visualization B in a room without windows

What can you conclude if Group 2 performs the task faster?


28

Confounding Factors (2)

• Participants perform tasks with Visualization A followed by Visualization B.

What can we conclude if task time is faster with Visualization A?


29

Confounding Factors (3)

• Do people remember information better with 3D or 2D displays?

• Participants randomly assigned to 2D or 3D

• Instructions and experimental conditions the same for all participants

Tavanti and Lind (Infovis 2001)


30

What are the confounding factors?

[Screenshots: the 2D visualization and the 3D visualization]


31

What is controlled

• Who gets what condition
  – Subjects randomly assigned to groups
• When & where each condition is given
• How the condition is given
  – Consistent instructions
  – Avoid actions that bias results (e.g., “Here is the system I developed. I think you’ll find it much better than the one you just tried.”)
• Order effects
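The first control above — random assignment of subjects to groups — can be sketched with the standard library. The function name and the round-robin balancing scheme are my own illustration, not from the slides:

```python
import random

def assign_between_subjects(participants, conditions, seed=None):
    """Randomly assign participants to conditions, keeping group
    sizes as equal as possible (shuffle, then deal round-robin)."""
    rng = random.Random(seed)
    shuffled = list(participants)
    rng.shuffle(shuffled)
    groups = {c: [] for c in conditions}
    for i, p in enumerate(shuffled):
        groups[conditions[i % len(conditions)]].append(p)
    return groups

# Usage: 12 participants split randomly between a 2D and a 3D condition.
groups = assign_between_subjects([f"P{i:02d}" for i in range(12)],
                                 ["2D", "3D"], seed=1)
print([len(g) for g in groups.values()])  # → [6, 6]
```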


32

Order Effects

Example: Search for circles among squares and triangles in Visualizations A and B

1. Randomization
   • E.g., number of distractors: 3, 15, 6, 12, 9, 6, 3, 15, 9, 12, …
2. Counter-balancing
   • E.g., half use Vis A first, half use Vis B first
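Both order-effect controls can be sketched in a few lines. The distractor levels and the Vis A / Vis B naming follow the slide's example; the function names are invented for the illustration:

```python
import random

def randomized_distractors(levels, repeats, seed=None):
    """Randomization: shuffle the distractor-count levels within each
    block of trials so no fixed order can create a learning effect."""
    rng = random.Random(seed)
    order = []
    for _ in range(repeats):
        block = list(levels)
        rng.shuffle(block)
        order.extend(block)
    return order

def counterbalance(participants):
    """Counterbalancing: alternate which visualization comes first,
    so half the participants use Vis A first and half use Vis B first."""
    return {p: ("A", "B") if i % 2 == 0 else ("B", "A")
            for i, p in enumerate(participants)}

print(randomized_distractors([3, 6, 9, 12, 15], repeats=2, seed=0))
print(counterbalance(["P1", "P2", "P3", "P4"]))
```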


33

Experimental Designs

                                       Between-subjects   Within-subjects
No order effects?                            +                  -
Participants can compare conditions?         -                  +
Number of participants                      Many               Few


34

Statistical analysis

• Apply statistical methods to data analysis
  – Confidence limits: the confidence that your conclusion is correct
  – “p = 0.05” means:
    • only a 5% probability that a difference this large would occur by chance if there were no true difference
    • so we can be reasonably confident the difference is real
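One concrete way to see what that probability means is a permutation test (not on the slide; shown here purely as an illustration): shuffle the condition labels many times and count how often a mean difference at least as large as the observed one appears by chance.

```python
import random
import statistics

def permutation_p(sample_a, sample_b, n=10_000, seed=0):
    """Permutation test: shuffle the condition labels n times and count
    how often the mean difference is at least as large as observed.
    That fraction estimates the probability that the observed
    difference could arise by chance alone."""
    rng = random.Random(seed)
    observed = abs(statistics.mean(sample_a) - statistics.mean(sample_b))
    pooled = list(sample_a) + list(sample_b)
    k = len(sample_a)
    hits = 0
    for _ in range(n):
        rng.shuffle(pooled)
        diff = abs(statistics.mean(pooled[:k]) - statistics.mean(pooled[k:]))
        if diff >= observed:
            hits += 1
    return hits / n

# Hypothetical task times (seconds) for two conditions — illustrative only.
p = permutation_p([41, 38, 45, 40], [52, 49, 55, 50])
print(p < 0.05)  # → True
```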


35

Types of statistical tests

• t-tests (compare 2 conditions)
• ANOVA (compare >2 conditions)
• Correlation and regression
• Many others
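A minimal sketch of the first two tests, assuming SciPy is available (`scipy.stats.ttest_ind` for two conditions, `scipy.stats.f_oneway` for more than two); the task-time numbers are made up for illustration:

```python
from scipy import stats

# Hypothetical task-completion times (seconds) per condition.
vis_a = [41, 38, 45, 40, 39, 43]
vis_b = [52, 49, 55, 50, 48, 53]
vis_c = [44, 47, 43, 46, 45, 48]

# t-test: compare 2 conditions.
t, p = stats.ttest_ind(vis_a, vis_b)
print(f"t-test: t = {t:.2f}, p = {p:.4f}")

# One-way ANOVA: compare >2 conditions.
f, p = stats.f_oneway(vis_a, vis_b, vis_c)
print(f"ANOVA:  F = {f:.2f}, p = {p:.4f}")
```

A p-value below the chosen threshold (commonly 0.05) lets you reject the hypothesis that the conditions do not differ.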


36

Snap-Together Vis Experiment

• Are both coordination AND visual overview important in overview + detail displays?

• How would you test this?


37

Critique of Snap-Together Vis Experiment

+ Carefully designed to focus on factors of interest

- Limited generalizability. Would we get the same result with non-text data? Expert users? Other types of coordination? Complex displays?

- Unexciting hypothesis – we were fairly sure what the answer would be


38

How should evaluation change?

• Better experimental design
  – Especially more meaningful tasks
• Fewer “compare time on two systems” experiments
• Qualitative methods
• Field studies with real users


39

Take home messages

• Talk to real users!

• Learn more about HCI!