Multimedia Specification Design and Production

2013 / Semester 1 / week 9
Lecturer: Dr. Nikos Gazepidis (gazepidis@ist.edu.gr)


Learning outcomes

• Evaluation (more about this topic)

• Evaluation methods

• Empirical evaluation (necessary steps)

• Analytical evaluation

More about Evaluation


Reading List:

1. Notes for Lecture week_8: introduction to evaluation

2. Faulkner, Chapter 6, pp. 137–146 (stop before 6.5.1) and 6.6–6.16, pp. 156–173, and Chapter 7, pp. 177–196


Evaluation

1. Central to user-centred iterative development and carried out throughout the development process (although developers often feel they can undertake it by themselves)

2. Linked to every other activity in the design cycle

3. Developers are often tempted to skip it because it adds to the development time and costs money and effort


EVALUATION provides the opportunity to:

• help to ensure that the system is usable

• help to ensure that the final system provides what users want

• catch problems early, which ends up cheaper than fixing problems identified later on

Evaluation is the process of gathering data so that we can answer these questions! Before you evaluate, be clear about:

• why you are doing it

• what you hope to achieve within the inevitable practical constraints: availability of facilities/equipment, ease of access to users, expertise, time, budget and ethical issues


An introduction to Evaluation

Terminology

Evaluation

… the process of systematically gathering data at various stages within the development process, which can be used to improve the designers’ understanding of the users’ requirements and amend the design to meet users’ needs. It can employ a range of techniques, some involving users directly, at different stages, to examine different aspects of the design.


Evaluation methods

1. Empirical … based on user experience

2. Analytical … based on the expert view

3. Heuristic


1. Empirical evaluation: necessary steps

1. It should start with a clear understanding of what questions need to be answered – i.e. what the evaluation aims to find out – and set appropriate targets.

2. Developing the evaluation activity:

   - selecting participants to perform tasks

   - developing tasks for participants to perform (benchmark and representative tasks) – give the participants focused activities

   - determining the protocol and procedures for the evaluation sessions

   - pilot testing may be necessary to improve the experiment

3. Directing the evaluation sessions.


4. Generating data by:

   - quantitative: benchmark tasks, user questionnaires

   - qualitative: concurrent verbal protocol, retrospective verbal protocol, critical incident reporting, structured interviews

5. Collecting data, in order to have evidence on which to base the evaluation:

   - real-time note-taking

   - audiotaping

   - videotaping

   - internal instrumentation of the interface


6. Analyzing the data, comparing the results with the targets in the usability specifications (see the sketch after this list).

7. Drawing conclusions to form a resolution of each design problem.

8. Depending on the outcome of the evaluation, it may be necessary to redesign and implement the revisions.

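To make steps 4–6 concrete, below is a minimal sketch, in Python, of how quantitative benchmark-task results might be compared against the targets in a usability specification. The task names, target values and timings are invented for illustration; the slides do not prescribe any particular format or tool.

```python
from statistics import mean

# Hypothetical usability specification: targets agreed in step 1.
# "max_mean_seconds" = worst acceptable mean completion time for the benchmark task,
# "min_success_rate" = proportion of participants who must complete it unaided.
usability_spec = {
    "find_product": {"max_mean_seconds": 30.0, "min_success_rate": 0.90},
    "checkout":     {"max_mean_seconds": 90.0, "min_success_rate": 0.80},
}

# Invented raw data from the evaluation sessions: one (seconds, completed) pair
# per participant, gathered e.g. by note-taking or instrumentation (step 5).
results = {
    "find_product": [(22.4, True), (35.1, True), (28.9, True), (41.0, False)],
    "checkout":     [(75.2, True), (102.7, True), (88.3, True), (79.9, True)],
}

def evaluate(task, observations):
    """Compare one benchmark task's results with its usability-spec targets (step 6)."""
    times = [seconds for seconds, _ in observations]
    successes = [completed for _, completed in observations]
    mean_time = mean(times)
    success_rate = sum(successes) / len(successes)
    spec = usability_spec[task]
    passed = (mean_time <= spec["max_mean_seconds"]
              and success_rate >= spec["min_success_rate"])
    return mean_time, success_rate, passed

for task, observations in results.items():
    mean_time, success_rate, passed = evaluate(task, observations)
    verdict = "meets target" if passed else "redesign candidate (steps 7-8)"
    print(f"{task}: mean {mean_time:.1f}s, success {success_rate:.0%} -> {verdict}")
```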


2. Analytical evaluation


The following section of notes has been compiled from papers on Jakob Nielsen’s website http://www.useit.com. Faulkner Chapter 7 provides more detail about the various methods briefly outlined below.

Usability inspection is a set of methods that are all based on having evaluators inspect and analyse a user interface. Typically, usability inspection is aimed at finding usability problems in the design, though some methods can evaluate the overall usability of an entire system. Inspection methods can be applied early in the interaction development lifecycle, allowing feedback, iteration and improvement.


• Heuristic evaluation is the most informal method and involves having usability specialists judge whether each dialogue element follows established usability principles.

• Heuristic estimation is a variant in which the inspectors are asked to estimate the relative usability of two (or more) designs in quantitative terms (typically expected user performance); a small sketch of aggregating such estimates follows this list.


• Cognitive walkthrough uses a more detailed procedure to simulate a user's problem-solving process at each step through the dialogue, checking whether the simulated user's goals and memory content can be assumed to lead to the next correct action.

• Pluralistic walkthrough uses group meetings where users, developers, and human factors people step through a scenario, discussing each dialogue element.


• Feature inspection lists the sequences of features used to accomplish typical tasks and checks for long sequences, cumbersome steps, steps that would not be natural for users to try, and steps that require extensive knowledge/experience, in order to assess a proposed feature set.

• Consistency inspection has designers who represent multiple other projects inspect an interface to see whether it does things in the same way as their own designs.


• Standards inspection has an expert on an interface standard inspect the interface for compliance.

• Formal usability inspection combines individual and group inspections in a six-step procedure with strictly defined roles, bringing together elements of both heuristic evaluation and a simplified form of cognitive walkthrough.
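As noted above, here is a minimal sketch of heuristic estimation in practice: each inspector independently supplies a quantitative estimate (here, expected task-completion time) for each competing design, and the estimates are averaged. The design names, inspector labels and numbers are invented purely for illustration.

```python
from statistics import mean

# Hypothetical estimates of expected task-completion time (seconds) for two
# alternative designs, supplied independently by each inspector.
estimates = {
    "inspector_A": {"design_1": 45, "design_2": 60},
    "inspector_B": {"design_1": 50, "design_2": 55},
    "inspector_C": {"design_1": 40, "design_2": 70},
}

designs = ["design_1", "design_2"]

# Average the inspectors' estimates per design.
averages = {
    design: mean(per_inspector[design] for per_inspector in estimates.values())
    for design in designs
}

best = min(averages, key=averages.get)
for design, avg in averages.items():
    print(f"{design}: estimated mean task time {avg:.1f}s")
print(f"Relative usability estimate favours {best} "
      f"({averages[best]:.1f}s vs {max(averages.values()):.1f}s)")
```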


Heuristic evaluation, heuristic estimation, cognitive walkthrough, feature inspection, and standards inspection normally have the interface inspected by a single evaluator at a time. In contrast, pluralistic walkthrough and consistency inspection are group inspection methods. Many usability inspection methods are so easy to apply that it is possible to have regular developers serve as evaluators, though better results are normally achieved when using usability specialists.


3. Heuristic evaluation


• Heuristics – broad-based rules or principles derived from theoretical knowledge (e.g. cognitive psychology) and practical experience

• Heuristic evaluation is the most popular usability inspection method

• Heuristics can be used to inform the design, as well as providing a checklist for evaluation

• Heuristic evaluation allows quick, cheap and easy evaluation of a user interface design, hence it is known as a ‘discount usability engineering’ method

• Heuristic evaluation aims to identify usability problems which can then be fixed within the iterative design process

• Heuristic evaluation involves a small set of evaluators examining an interface and judging its compliance with recognized usability principles (a minimal sketch of recording such judgements follows)
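To make the last point concrete, here is a minimal sketch of how a small set of evaluators' findings might be recorded and pooled. The checklist borrows a few of Nielsen's well-known heuristics purely as example principles, and the 0–4 severity scale and the findings themselves are illustrative assumptions; the slides do not prescribe any particular format.

```python
from collections import defaultdict

# Example checklist of recognized usability principles (a subset of Nielsen's
# heuristics, used here purely as an illustrative checklist).
HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "Consistency and standards",
    "Error prevention",
]

# Invented findings: each evaluator independently records (heuristic violated,
# short description of the problem, severity 0-4 where 4 is the most serious).
findings = {
    "evaluator_1": [
        ("Visibility of system status", "No progress feedback during upload", 3),
        ("Consistency and standards", "'OK'/'Cancel' order differs between dialogs", 2),
    ],
    "evaluator_2": [
        ("Visibility of system status", "No progress feedback during upload", 4),
        ("Error prevention", "Delete has no confirmation step", 3),
    ],
}

# Pool the individual reports: group identical problems and average their severity,
# so the team can prioritise fixes within the iterative design process.
pooled = defaultdict(list)
for reports in findings.values():
    for heuristic, problem, severity in reports:
        pooled[(heuristic, problem)].append(severity)

for (heuristic, problem), severities in sorted(
        pooled.items(), key=lambda item: -max(item[1])):
    avg = sum(severities) / len(severities)
    print(f"[{heuristic}] {problem} "
          f"(reported by {len(severities)} evaluator(s), mean severity {avg:.1f})")
```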