Usability in the Contact Center

Posted on 17-Dec-2014


Description

We will discuss how to measure and improve long-term user performance in contact centers, best practices in designing contact center UIs, and key considerations when launching new contact center applications. This presentation is for anyone who works with high-volume contact centers and organizations with substantial back-office transaction processing.

Transcript of Usability in the Contact Center

1 © User Centric, Inc., October 2012 We Believe Experiences Matter™ Webinar Series

Usability In The Contact Center

GfK User Centric, November 2012

PRESENTERS:

Robert Schumacher, Ph.D., Executive Vice President

#uxlunch @UserCentricInc

2

3

I’d like the number for Mail Boxes Etc in Elgin, Illinois

4

Results

Old design vs. new design: 600 ms faster, $2.94M in annual savings.
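Headline figures like these usually come from multiplying a small per-call time saving across a very large annual call volume. A back-of-envelope sketch; every input below is an illustrative assumption, not the actual case data:

```python
# Illustrative back-of-envelope only: all inputs are assumptions chosen to
# show the mechanics, not the real figures behind the case study.
seconds_saved_per_call = 0.6          # the 600 ms per-call improvement
calls_per_year = 500_000_000          # assumed annual call volume
loaded_cost_per_hour = 35.0           # assumed fully loaded agent cost (USD)

agent_hours_saved = seconds_saved_per_call * calls_per_year / 3600
annual_savings = agent_hours_saved * loaded_cost_per_hour
print(f"{agent_hours_saved:,.0f} agent-hours, ~${annual_savings:,.0f}/year")
```

With these assumed inputs the result lands near the $2.9M range reported on the slide; the point is that even sub-second savings compound at contact-center volumes.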

5

Another Example…

6

Yes or No?

• "The circle is above the star": a positive statement, with a Yes (affirmative) response
• "The circle is not above the star": a negative statement, with a No (negative) response

7

• "The circle is above the star": positive statement, with Yes (affirmative) response
• "The circle is not above the star": negative statement, with No (negative) response

• Positive statements evoke faster responses than negative statements
• Yes responses are much faster than No responses (~500 msec)

8

Different phrasings are used when you want different responses:
• Large variability in cognitive processing times
• Significant impact on errors

                      Affirmative response                 Negative response
Positive phrasing:    Place all service orders?            Cancel all service orders?
Negative phrasing:    Do not cancel all service orders?    Do not place all service orders?

9

This illustrates two key things:

• The importance of measurement
• The impact of design (even small changes have BIG effects)

10

Usability is about Measuring & Changing Behavior

11

The highest cost in the call center is human capital.

12

When to Measure

13

14

Measuring the Contact Center UI

[Chart: Measure (Time, Errors, etc.), Low to High, against Phase of Deployment, Early to Later. Curves show the Current Performance Level (baseline) and the Learning Curve of the New Application.]

1. So the question often asked is: how much of a hit is there when we deploy this application? This is the initial performance decrement.
2. When does the center's performance return to where it used to be? The performance lag time is the time to restore performance to previous levels.
3. What benefit do I get long term? The long-term benefit of the new interface is the amount of eventual improvement.

15

Measuring the Contact Center UI

[Chart: Measure (Time, Errors, etc.) against Phase of Deployment (Early to Later), annotated with (1) the initial performance decrement, (2) the lag time to restore performance, and (3) the long-term benefit. Alternative curves at (3) show an even better design and a poorly executed design.]
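A practical question raised by this curve is how to project the lag time before performance has been measured all the way out. A minimal sketch, assuming weekly handle times follow the power law of practice, T(n) = T1 * n**(-a); all numbers are made up for illustration:

```python
import math

# Minimal sketch with made-up numbers: estimate the performance lag time
# (the week when average handle time returns to the pre-launch baseline)
# by fitting the power law of practice to the first weeks of data.
baseline_aht = 240.0                         # assumed baseline handle time (s)
weekly_aht = [310.0, 285.0, 271.0, 262.0]    # assumed weeks 1-4 post-launch

# Log-log least squares: log T = log T1 - a * log n
xs = [math.log(week) for week in range(1, len(weekly_aht) + 1)]
ys = [math.log(t) for t in weekly_aht]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
a, log_t1 = -slope, my - slope * mx

# Solve T1 * n**(-a) <= baseline for n: the projected restore week.
restore_week = math.ceil(math.exp((log_t1 - math.log(baseline_aht)) / a))
print(f"learning rate a={a:.2f}, projected return to baseline: week {restore_week}")
```

Real deployments are noisier than a clean power law, so a projection like this is a planning aid, not a substitute for the repeated measurement the process below calls for.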

16

Measurement & Design Process for the Contact Center

17

Incorporating Measurement into the Design Process


[Overview diagram: Discovery, Design, Evaluate Performance Repeatedly, Benchmark Performance.]

18

Incorporating Measurement into the Design Process


[Process diagram: Benchmark Performance → Observe Use of System → Conduct Expert Evaluation → Conduct Design Workshops → Make UI Changes → Model Performance → Approve Design? If No, return to Make UI Changes; if Yes, Evaluate Performance Repeatedly.]

19

Benchmark Performance

• Improvement = Performance after design – Baseline performance

• Often existing measures can be used.

• Any outcome measured after the redesign should be measured before the redesign.

• Don’t rely only on indirect measures (e.g., customer satisfaction); it is worth the time and money to collect baseline measures of time, errors, and other metrics before the redesign is launched.

• You can’t collect baseline measures after the redesign is deployed!
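The improvement formula above is trivial to compute once both sides exist; the trap is the last bullet, since the baseline can only be captured pre-launch. A sketch with hypothetical metric names and values:

```python
# Hypothetical baseline and post-redesign measures. Lower is better for all
# three metrics here, so improvement is baseline minus after.
baseline = {"handle_time_s": 312.0, "errors_per_100_calls": 4.8, "screens_visited": 9.0}
after    = {"handle_time_s": 287.0, "errors_per_100_calls": 3.1, "screens_visited": 6.0}

improvement = {metric: baseline[metric] - after[metric] for metric in baseline}
print(improvement)
```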


20

Benchmark Performance

• Detailed, controlled measurement of user behavior (summative, quantitative)
  • In actual use
  • In simulated use

• What to measure?
  • Time to complete
  • Errors (many kinds!)
  • Screens visited (efficiency)
  • Satisfaction/preference
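The measures listed above are easiest to compare across benchmark rounds if every task is recorded in the same shape. A hypothetical per-task record; field names and values are assumptions, not a standard schema:

```python
from dataclasses import dataclass

# Hypothetical per-task benchmark record covering the measures above.
@dataclass
class TaskBenchmark:
    task: str
    seconds_to_complete: float   # time to complete
    errors: int                  # errors of any kind observed
    screens_visited: int         # efficiency: screens touched for the task
    satisfaction: int            # e.g., a 1-7 post-task rating

# Assumed example record from a simulated-use session.
rec = TaskBenchmark("change service address", 184.5, errors=1,
                    screens_visited=7, satisfaction=5)
print(rec)
```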


21

Incorporating Measurement into the Design Process



22

Observe Use of System and Direct User Feedback

• Side-by-side or remote observations need to be grounded in an understanding of what representatives actually do


23

SAY ≠ DO

24

Observe Use of System and Direct User Feedback

• Side-by-side or remote observations help us understand what representatives do

• Need to see how the screen flow matches the conversation flow (with the customer) and the business flow

• If observation is not possible, focus groups or interviews with representatives or managers may be an option

• Are incentives aligned?


25

Conduct Expert Evaluation

Uncover things users will never notice or say (e.g., UPPERCASE font)

With the context of observation, identify weaknesses of the design:

• Inefficient screen flow
• Poor screen layout
• Missing functionality


26

Incorporating Measurement into the Design Process



27

Conduct Design Workshop

• Design workshops combine the wisdom of the domain with the experience and expertise of the UI team

• Efficient way to understand technical, regulatory, and business constraints that are not evident from observations

• Makes buy-in and acceptance easier


28

Make UI Changes

Changes are documented through information design/architecture, then sketches, then wireframes, and often prototypes.

Each company has its own software development process into which the design phase is integrated.


29

Model UI Performance

• If some of the measures concern speed or efficiency, it may be worth modeling how fast the redesign will be.

• Can use a cognitive model (GOMS) that estimates how long specific tasks will take on different interfaces


30

Model Human Processor & GOMS

• Allows estimation of performance on certain tasks without taking field measurements

• Purpose is to estimate performance at the end of the learning curve


31

Measuring the Contact Center UI

[Chart: Measure (Time, Errors, etc.) against Phase of Deployment (Early to Later), annotated with (1) the initial performance decrement, (2) the lag time to restore performance, and (3) the long-term benefit. Modeling can help understand performance here during design.]

32

Model Human Processor & GOMS

• Allows estimation of performance on certain tasks without taking field measurements

• Purpose is to estimate performance at the end of the learning curve

• Goals, Operators, Methods, and Selection rules (GOMS) models interface actions

• Based on empirically validated results from cognitive psychology


33

Model UI Performance

GOMS modeling involves:
• Breaking down common tasks into components of mouse movements, widget use, and keyboard entries
• Assigning average values to each step based on prior research
• Sum of steps = time for task

Multiple systems exist for doing GOMS:
• CogTool (Bonnie John)
• Composite KLM (Jeff Sauro)
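For instance, a Keystroke-Level Model (KLM, the simplest member of the GOMS family) can be sketched in a few lines. The operator durations below are the classic published averages from Card, Moran & Newell; the task breakdown is an assumed example, not the webinar's data:

```python
# Keystroke-Level Model sketch. Operator durations (seconds) are the
# classic published averages; the example task below is an assumption.
KLM_SECONDS = {
    "K": 0.28,  # keystroke (average skilled typist)
    "P": 1.10,  # point the mouse at a target
    "B": 0.10,  # press or release a mouse button
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_estimate(operators: str) -> float:
    """Sum operator times for a task encoded as a string of KLM codes."""
    return sum(KLM_SECONDS[op] for op in operators)

# Assumed task: think (M), point (P), click (B), home to keyboard (H),
# then type a five-character entry (KKKKK).
print(f"{klm_estimate('MPBH' + 'K' * 5):.2f} s")  # prints 4.35 s
```

Summing per-operator averages like this is exactly why such models predict end-of-learning-curve (expert) performance rather than a novice's first week.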


34

Review Design Changes

• When the design is complete, the business needs to approve it

• Strongly recommend field reviews

• Requirements are completed – often before code is written


35

Incorporating Measurement into the Design Process



36

Measure Performance Repeatedly

Once the design is complete and the system is deployed, performance is measured.

The same measures are used after the redesign as were used before it.

Measuring performance once is never enough, because the benefits of the design change are initially masked by the challenges of learning a new system (see the following chart).
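A toy illustration of why a single measurement misleads, using assumed weekly average handle times against an assumed baseline:

```python
# Assumed weekly average handle times (seconds) after launch, versus an
# assumed pre-launch baseline. Week 1 alone looks like a pure regression;
# only repeated measurement reveals when the baseline is restored.
baseline = 240.0
weekly_aht = [310.0, 275.0, 252.0, 243.0, 238.0, 231.0]

restored_week = next((week for week, t in enumerate(weekly_aht, start=1)
                      if t <= baseline), None)
print(f"week 1: +{weekly_aht[0] - baseline:.0f} s vs. baseline; "
      f"restored in week {restored_week}")
```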


37

Measuring the Contact Center UI

[Chart repeated: Measure (Time, Errors, etc.) against Phase of Deployment, annotated with (1) the initial performance decrement, (2) the lag time to restore performance, and (3) the long-term benefit.]

38

Usability in the Contact Center

Questions?

Robert Schumacher, Ph.D., Executive Vice President, GfK User Centric
bob@usercentric.com, +1.630.320.3900