Transcript of Introduction Fundamentals of GUIs - FH...
DISPLAY LAB, PFORZHEIM UNIVERSITY
Graphical User Interface: Evaluation
Blankenbach / Pforzheim Univ. / www.displaylabor.de / IT / GUI / Nov 2016
Overview
• Introduction
• Fundamentals of GUIs
• Screen Design: Navigation, Windows, Controls, Text, …
• Evaluating GUI Performance
  - Methods
  - Comparison
Example: Automotive HMI (CAR IT 03/2013) 64, 68, 69
[image-only example slides]
Common Usability Problems 65, 64, 68, 69
• Ambiguous menus and icons
• Confusing and inefficient navigation
(meaningless graphics, useless elements, …)
• Inefficient operation, information overload
• Excessive or inefficient page scrolling
• Design inconsistency
• Input manipulation limits
• Unclear step sequences
• More steps to manage the interface than to perform tasks
• Inadequate feedback and confirmation
• Inadequate error messages, help, tutorials and documentation
Usability Requirements 64, 68, 69
5 quality components by Nielsen (2003):
• Learnability: How easy is it for users to accomplish basic tasks?
• Efficiency: How quickly can (experienced) users perform tasks?
• Memorability: How easily can infrequent users reestablish proficiency?
• Errors: How many errors do users make, and how easy is recovery?
• Satisfaction: How pleasant is it to use the design?
Some Practical Measures of Usability 65, 64, 68, 69
• Are people asking a lot of questions or often using the manual?
• Do users become aggressive when using the GUI?
• Are many irrelevant actions being performed?
• Are there many objects, elements, etc. on the screen that could be ignored?
• Do a lot of people want to use the product (GUI)?
Some Objective Criteria to Measure Usability (I) 65, 64, 68, 69
• Time to complete a task; percentage of tasks completed per unit time (speed)
• Ratio of successes to failures
• Time spent on errors
• Number of other software products performing the same tasks better
• Number of commands used
• Frequency and duration of help or manual use
• Ratio of favorable to unfavorable user comments
Acc. Tyldesley (1998)
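Several of the criteria above can be computed automatically from instrumented session logs. A minimal sketch in Python; the `Attempt` record and its field names are assumptions for illustration, not part of the slides:

```python
from dataclasses import dataclass

# Hypothetical log record of one task attempt; field names are assumptions.
@dataclass
class Attempt:
    duration_s: float     # time to complete (or abandon) the task
    succeeded: bool
    error_time_s: float   # time spent on errors and recovery
    help_opened: int      # how often help/manual was consulted

def usability_metrics(attempts):
    """Aggregate a few of the objective criteria listed above."""
    n = len(attempts)
    successes = sum(a.succeeded for a in attempts)
    return {
        "success_ratio": successes / n,
        "mean_time_s": sum(a.duration_s for a in attempts) / n,
        "mean_error_time_s": sum(a.error_time_s for a in attempts) / n,
        "help_uses_per_attempt": sum(a.help_opened for a in attempts) / n,
    }

# Invented example data: three logged attempts at the same task.
log = [Attempt(42.0, True, 3.0, 0), Attempt(65.5, False, 20.0, 2),
       Attempt(38.2, True, 0.0, 1)]
m = usability_metrics(log)
print(m["success_ratio"])  # 2 of 3 attempts succeeded
```

Such per-task aggregates make the speed and error criteria directly comparable across design iterations.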
Some Objective Criteria to Measure Usability (II) 65, 64, 68, 69
• Number of repetitions of failed commands
• Number of times the interface misleads the user
• Number of available commands which are rarely used
• Number of users preferring this system
• Number of times and duration the user works around a problem
• Number of times the user loses control of the system
• Number of times the user expresses frustration or satisfaction
Acc. Tyldesley (1998)
GUI Evaluation (Test) 767
• Why?
- Improvement of SW usability → user (customer) satisfaction, …
- GUI design rules are not sufficient
- Developers and users have different knowledge, …
- There is no average user, …
• When?
- In each step of the GUI design process
- Fixing problems early in the design process is significantly cheaper
than doing this after shipment
• How?
Different methods and strategies (this chapter)
GUI Evaluation: Test Measurements & Purposes 777
• Conformance with requirements
• Conformance with guidelines for good design
• Conformance with legal requirements
• Identification of GUI design problems
• Ease of learning
• Retention of learning over time
• Speed of task completion
• Error rates
• Subjective user satisfaction
• …
GUI Evaluation Steps 19-23
• Identify the purpose and scope of testing
• Develop a prototype (e.g. at each stage of the design process)
• Develop a test plan to yield the relevant data
• Select and schedule users to participate
• Provide a proper test facility (room, computer, …)
• Conduct the tests incl. a briefing on the SW purpose;
  no longer than 2 h per session in total
• Collect the data
• Analyze the test results and derive design update recommendations
• Evaluate the SW to ship
• Monitor the user experience after shipment
GUI Evaluation Test Methods 777
1. Heuristic evaluation by GUI experts
2. Cognitive walk-throughs
3. Classic usability tests with users
• …
(the methods are listed in order of increasing cost and time to test)
Other references:
- http://research.fit.edu/carl/documents/MHET.pdf
- http://en.wikipedia.org/wiki/Cognitive_walkthrough
Heuristic Evaluation 780
• Description: Evaluation by GUI experts to identify problems
• Advantages: Easy to do, low cost, doesn't waste users' time,
  can identify many problems
• Disadvantages: Experts may not possess an adequate understanding of the
  tasks and user issues; system-wide problems are difficult to identify;
  no systematic procedure
• Guidelines: Use three to five experts; experts should be familiar with the
  project and have a long-term relationship with the company.
Ratings: 0 = No usability problem at all
  1 = Cosmetic problem only, need not be fixed
  2 = Minor problem, fix with low priority
  3 = Major problem, important to fix
  4 = Usability catastrophe, fix before further tests
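Aggregating the independent expert ratings on this 0-4 scale might look like the following sketch; the problem names and scores are invented for illustration:

```python
from statistics import mean

# Hypothetical ratings: problem -> one 0-4 severity score per expert
# (scale as on the slide: 0 = no problem ... 4 = usability catastrophe).
ratings = {
    "label ambiguous on save button": [3, 2, 3],
    "no undo after delete":           [4, 4, 3],
    "low-contrast footer text":       [1, 0, 1],
}

# Average the independent expert ratings and fix the worst problems first.
by_severity = sorted(ratings.items(), key=lambda kv: mean(kv[1]), reverse=True)
for problem, scores in by_severity:
    print(f"{mean(scores):.2f}  {problem}")
```

Averaging over three to five experts smooths out individual judgment, which is why the guideline above recommends that team size.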
Heuristic Evaluation Checklist
Heuristic_Evaluation_Checklist_stcsig_org.pdf
10 Usability Heuristics for User Interface Design by J. Nielsen 780
1. Visibility of system status
The system should always keep users informed about what is going on,
through appropriate feedback within reasonable time.
2. Match between system and the real world
The system should speak the users' language, with words, phrases and
concepts familiar to the user, rather than system-oriented terms.
Follow real-world conventions, making information appear in a
natural and logical order.
3. User control and freedom
Users often choose system functions by mistake and will need a
clearly marked "emergency exit" to leave the unwanted state without
having to go through an extended dialogue. Support undo and redo.
4. Consistency and standards
Users should not have to wonder whether different words, situations,
or actions mean the same thing. Follow platform conventions.
5. Error prevention
Even better than good error messages is a careful design which
prevents a problem from occurring in the first place.
Either eliminate error-prone conditions or check for them and
present users with a confirmation option before they commit to the action.
6. Recognition rather than recall
Minimize the user's memory load by making objects, actions, and
options visible. The user should not have to remember information from
one part of the dialogue to another. Instructions for use of the system
should be visible or easily retrievable whenever appropriate.
7. Flexibility and efficiency of use
Accelerators -- unseen by the novice user -- may often speed up the
interaction for the expert user such that the system can cater to both
inexperienced and experienced users. Allow users to tailor frequent actions.
8. Aesthetic and minimalist design
Dialogues should not contain information which is irrelevant or rarely
needed. Every extra unit of information in a dialogue competes with the
relevant units of information and diminishes their relative visibility.
9. Help users recognize, diagnose, and recover from errors
Error messages should be expressed in plain language (no codes),
precisely indicate the problem, and constructively suggest a solution.
10. Help and documentation
Even though it is better if the system can be used without documentation,
it may be necessary to provide help and documentation.
Any such information should be easy to search, focused on the user's task,
list concrete steps to be carried out, and not be too large.
Cognitive Walk-Throughs 786
• Description: Reviews of the GUI in the context of user tasks
• Advantages: Allows a clear evaluation of the task flow early in the design
  process; does not require a functional prototype (paper, …);
  low cost; can be used to evaluate alternative solutions;
  more structured than heuristic evaluation
• Disadvantages: Fatiguing to perform; may miss inconsistencies and
  general problems
• Guidelines: - Use a clear description of the GUI and its tasks
  - Limit the test to several core or representative tasks
  - Limit sessions to 60 min.
  - Use about 10 GUI experts
Cognitive Walk-Throughs 786 (Source: http://en.wikipedia.org/wiki/Cognitive_walkthrough)
Typically four questions are asked:
1. Will the user try to achieve the effect that the subtask has?
Does the user understand that this subtask is needed to reach the user's goal?
2. Will the user notice that the correct action is available?
E.g. is the button visible?
3. Will the user understand that the wanted subtask can be achieved by
the action? E.g. the right button is visible but the user does not understand
the text and will therefore not click on it.
4. Does the user get appropriate feedback? Will the user know that they have
done the right thing after performing the action?
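One possible way to record the four questions per action in a walkthrough protocol is a small checklist structure; the action name and the answers below are invented for illustration:

```python
# Minimal sketch of a cognitive-walkthrough record: the four standard
# questions are asked for every action in the optimal action sequence.
QUESTIONS = (
    "tries to achieve the subtask's effect",
    "notices the correct action is available",
    "understands the action achieves the subtask",
    "gets appropriate feedback",
)

def walkthrough_problems(action, answers):
    """Return the questions answered 'no' for one action in the sequence."""
    assert len(answers) == len(QUESTIONS)
    return [(action, q) for q, ok in zip(QUESTIONS, answers) if not ok]

# Example: the button exists and is visible, but its label is not
# understood, so question 3 is answered 'no'.
problems = walkthrough_problems("press 'New Data'", (True, True, False, True))
print(problems)
```

Collecting one such record per action makes the failure story for each subtask explicit, which is the point of the method.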
By answering these questions for each subtask, usability problems are identified.
Common mistakes:
1. The evaluator doesn't know how to perform the task themselves, so they
stumble through the interface trying to discover the correct sequence of
actions, and then they evaluate the stumbling process.
(The user should identify and perform the optimal action sequence.)
2. The walkthrough does not test real users on the system.
The walkthrough will often identify many more problems than you would find
with a single, unique user in a single test session.
Classic Usability Tests 786
• Description: Evaluation under real-world conditions;
  measures performance; can be used to compare alternatives;
  major GUI problems are identified and fixed
• Advantages: Utilizes an actual work environment; objective measurement
  of performance; subjective measurement of user satisfaction;
  identifies serious or recurring problems
• Disadvantages: High cost; requires controlled and exactly planned tests;
  emphasizes first-time GUI usage
• Guidelines: - Use about 10 people per user group
  - Simulate real-world conditions as far as reasonable
  - Limit sessions to 90-120 min. (this also limits the test procedures)
  - Apply statistical methods for data analysis
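The last guideline can be sketched with a stdlib-only example: Welch's t statistic comparing task-completion times of two GUI variants. The variant names and times are invented, and a real analysis would use a statistics package to obtain the p-value as well:

```python
from statistics import mean, variance
from math import sqrt

def welch_t(a, b):
    """Welch's t statistic for two independent samples of task times."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / sqrt(va + vb)

# Hypothetical task-completion times (seconds) for two GUI variants,
# 10 participants per user group as the guideline suggests.
old_gui = [61.0, 70.5, 58.2, 66.1, 72.3, 64.0, 69.8, 60.4, 67.7, 63.5]
new_gui = [48.9, 55.1, 50.3, 46.7, 53.0, 49.5, 57.2, 51.8, 47.4, 52.6]

t = welch_t(old_gui, new_gui)
print(f"t = {t:.2f}")  # a large positive t suggests the new GUI is faster
```

With roughly 10 participants per group, simple two-sample comparisons like this are about as much statistics as the data supports.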
Heuristic vs. Usability Testing 794
• Addresses this question
  - Heuristic: Is the design optimized in terms of GUI rules?
  - Usability testing: Can the user find the information, and how does he perform tasks?
• Recommendations are derived from
  - Heuristic: Standards, guidelines, best practice, GUI experts
  - Usability testing: Direct observation of users performing tasks
• Complementary benefits
  - Heuristic: Focuses on what the design brings to users
  - Usability testing: Focuses on what the users bring to the design
• Benefits
  - Heuristic: Rapid results; comprehensive evaluation; low cost
  - Usability testing: Recommendations tied to specific objects, tasks and limitations reported by users
Some Myths about GUI Usability Testing (I) Weinschenk 107
• An expensive usability lab is needed
  You need at least a quiet room with a system running the GUI; this captures
  most of the GUI design issues.
  Tests should also be performed in the typical environment, incl. 'stress'.
• GUI tests should be performed at the end of development, before shipping
  One should definitely do this, but tests should be performed at every
  major step of the GUI design process.
• Every feature of the GUI should be tested
  Generally speaking yes, but it is impossible to get everything tested by a
  single person/subject (e.g. MS WORD, beta tests with high-profile users).
  Focus on clusters of tasks and test them separately.
Some Myths about GUI Usability Testing (II) Weinschenk 107
• 100+ people are needed
  You need at least 10 subjects. If they are truly representative, they will
  capture about 95% of the problems.
• A 'good' test means no changes are necessary
  The only criterion for a successful GUI test is that you get a lot of good
  data for improving the GUI. Negative feedback is very helpful as well.
• GUI usability testing is the same as system testing
  No; you are not trying to find bugs in the software. The GUI test addresses
  how easily the SW can be learned and how fast tasks can be performed.
• GUI usability testing is the same as a design review
  No; design reviews are valuable for your bosses. A design review means
  more presentation than testing.
Examples: Collect Data (I)
1) Create a test scenario
Task: Collect Data
Your situation:
You have six measurement datasets, each on a different sheet of paper,
and must type those data into the software.
Your job:
- Enter each dataset into the software
- Check that there is no typing error
- Store the data
- Display all data of the last four weeks in a graph
- Make a print-out of the graph
- Send an email with the plot to the QA manager
Examples: Collect Data - Example
What potential problems could come up while performing the job from the test scenario?
- Enter each dataset into the software
- Check that there is no typing error
- Store the data
- Display all data of the last 4 weeks in a graph
- Make a print-out
- Send an email with the plot to the QA manager
Examples: Collect Data (II)
2) Develop a usability test plan (test duration max. 2 h!)
- Point out the goals of the test (the goal is to test the performance of …)
- Schedule for the tests (date, participants, time, …)
- Testing method (what will be tested by which method?)
- Select and invite participants (they should represent the user groups)
- Test procedure (explain everything step by step)
3) Prepare test materials
- Prototype of the GUI and the system on which it runs
- Provide cards or a list with brief instructions for each test
  Good: "You should enter new measurement data"
  Bad: "Choose 'New Data' from the File menu"
- Evaluation form for the participant (see below)
- Observer instructions and observer evaluation form (see below)
- Other things like video recording equipment, …
Examples: Collect Data (III)
Evaluation form for the participant:
- Pretest
  - Age:
  - Gender:
  - PC experience:
- Post test
  - How do you rate this SW? good / fair / bad
  - Things you like:
  - Things you don't like:
Evaluation form for the observer:
- Pretest
  - Is the participant relaxed? yes / no / don't know
- During test
  - Problems that came up:
  - Errors that came up:
  - Time to perform the task: … min.
Note: Questions like these are intended for statistics, but with only about
10 participants they are often not really helpful. Comments are more suitable!
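For digital capture, the two forms can be modelled as simple records; a minimal sketch, in which the field types and defaults are assumptions beyond what the slide specifies:

```python
from dataclasses import dataclass, field

# Sketch of the two evaluation forms as records; the fields mirror the
# slide, everything else is an assumption for illustration.
@dataclass
class ParticipantForm:
    age: int
    gender: str
    pc_experience: str            # pretest
    rating: str = ""              # post test: good / fair / bad
    likes: list = field(default_factory=list)
    dislikes: list = field(default_factory=list)

@dataclass
class ObserverForm:
    participant_relaxed: str      # pretest: yes / no / don't know
    problems: list = field(default_factory=list)
    errors: list = field(default_factory=list)
    task_time_min: float = 0.0

# Filling out one participant form during a session:
p = ParticipantForm(age=34, gender="f", pc_experience="daily use")
p.rating = "fair"
p.likes.append("clear menu structure")
print(p.rating, len(p.likes))
```

Free-text fields (`likes`, `problems`, …) dominate here on purpose, matching the note above that comments are more useful than statistics at this sample size.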
Examples: Collect Data (IV)
ISONORM questionnaire for software ergonomics (DIN EN ISO 9241 part 10):
Ask 5 questions on each of the standard's 7 design principles:
- Suitability for the task
- Self-descriptiveness
- Error tolerance
- Suitability for learning
- Conformity with user expectations
- Suitability for individualization
- Controllability
Example: [questionnaire figure]
Examples: Collect Data (V)
4) Perform the test (max. 2 h!)
A. Greet the participant and introduce yourself
B. Briefing
C. Have the participant fill out the pretest form
D. Conduct the test while filling out the evaluator form
E. Have the participant fill out the posttest form
F. Hold the posttest interview
G. Thank the participant
5) Report of the evaluation
- Group, explain and prioritize the results
- Provide a recommendation on how to fix the major problems
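Grouping and prioritizing the results (step 5) can be sketched as a frequency-and-severity ranking; the findings below are invented, and the 0-4 scale reuses the heuristic-evaluation ratings from earlier in this chapter:

```python
# Sketch: findings collected over several sessions, one (problem, severity)
# pair per observation; severity uses the 0-4 heuristic-evaluation scale.
findings = [
    ("email dialog hidden in submenu", 3),
    ("email dialog hidden in submenu", 4),
    ("graph axis labels too small",    1),
    ("email dialog hidden in submenu", 3),
    ("graph axis labels too small",    2),
]

# Group identical problems, keeping how often they occurred and the worst
# severity observed.
grouped = {}
for problem, severity in findings:
    count, worst = grouped.get(problem, (0, 0))
    grouped[problem] = (count + 1, max(worst, severity))

# Highest frequency first, worst severity as the tie-breaker.
report = sorted(grouped.items(), key=lambda kv: kv[1], reverse=True)
for problem, (count, worst) in report:
    print(f"{count}x (worst severity {worst}): {problem}")
```

A ranked list like this makes the "prioritize" and "fix major problems" steps concrete for the written report.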
Group Work: Case Study VII (Master)
Topics for presentation, GUI prototyping and final documentation:
- Perform the evaluation tasks, heuristic evaluation by GUI experts and
  cognitive walk-throughs of your case study, performed by another team
- Update your GUI
Final (Exam) Presentation (~ 30 min):
- Presentation of the idea and type of GUI (see first case studies) (3 min)
- Presentation of 1-2 screens, "initial" vs. "final" (2 min)
- Presentation of the final GUI prototype (10 min)
- Classic usability tests with 3 users (not from your team) should be
  performed during the presentation (5 min subtask usability test each = 15 min)
Summary of GUI Design Evaluation
Summary of GUI Design: User vs. Project Expectations
Summary of GUI Design
• GUI design flow
• User objects
• Evaluation and test
[Figure: "XXX software is ranked …", YYY Magazine]