On Evaluating Serious Games


Page 1: On Evaluating Serious Games

On Evaluating The 80 Days Geography Game for School Students

Mahmoud Abdulwahed

Page 2

Pre-Evaluation

• Understand the utilized pedagogical theories.

• Understand the utilized cognitive theories, i.e. Competence-based Knowledge Space Theory (CbKST).

• Understand the game architecture, in particular:
  – The adaptive engine
  – The narration engine

Page 3

The DEGs' Evaluation Dimensions

• Pedagogical
• Usability
• Cognitive
• Affective
• Economic
• Others, such as neuroscience, the gaming factor, etc.

Page 4

The DEGs' Learning Outcomes

Two types of learning outcomes are associated with DEGs:

• Intended learning outcomes, such as:
  – Countries' capitals
  – Demographic information
  – Climate conditions
  – Any other geography learning outcomes intended by the school curricula
• Occasional, or unintended, learning outcomes (nevertheless, it is still learning!), such as:
  – Organizational skills
  – Decision-making skills
  – Strategic skills
  – Any other generic learning outcomes that result from playing digital games

Page 5

Pedagogical Variables of DGs

Generic Learning Types Associated with Digital Games (Prensky 2001)

Page 6

Some Important Pedagogical Measures

• Depth of the gained knowledge, i.e. deep vs. surface learning.
• Sustainability of learning (direct impact on life-long learning), i.e. do students seek to learn more about geography outside the game environment as a result of DEG-based learning?
• Breadth of learning outcomes for DEGs vs. classical pedagogies.
• Time spent on the learning objectives.
• Learning styles change, i.e. does it increase the experiential learning style?
• Social learning:
  – Proximal, i.e. is the DEG an optimal platform for implementing an effective social learning experience for students in the classroom or the lab?
  – Cyber, i.e. online collaborative learning with the DEG through the web.

Page 7

Some Proposed Methods and Tools

• Analysing the game's logged data
• Questionnaires
• Pre- and post-tests
• Comparative evaluation, i.e. classical teaching vs. DEG-based teaching
• Learning styles inventories
• Interviews
• Focus groups
• And many others
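As a minimal sketch of the first method, analysing the game's logged data: assuming the game records which learning objective each player is working on and for how long (the event tuples and field names below are hypothetical, not the actual 80 Days log format), time-on-task per objective can be aggregated like this:

```python
from collections import defaultdict

# Hypothetical log entries: (player_id, learning_objective, seconds_spent).
# A real DEG log would be parsed from a file; this format is assumed.
events = [
    ("p1", "capitals", 120), ("p1", "climate", 45),
    ("p2", "capitals", 80),  ("p2", "demographics", 200),
    ("p1", "capitals", 60),
]

def time_per_objective(events):
    """Total seconds each player spent on each learning objective."""
    totals = defaultdict(lambda: defaultdict(int))
    for player, objective, seconds in events:
        totals[player][objective] += seconds
    return {player: dict(objs) for player, objs in totals.items()}

print(time_per_objective(events))
# p1's two "capitals" sessions (120 s and 60 s) are summed into one total.
```

The same aggregation generalizes to other logged measures, e.g. error counts or hint requests per objective, which feed the pedagogical and cognitive analyses discussed above.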

Page 8

Usability Problems of Digital Games

Generic list of DGs usability problems (Pinelle et al. 2008):

• Unpredictable / inconsistent response to user's actions
• Does not allow enough customization
• Artificial intelligence problems
• Mismatch between camera/view and action
• Does not let the user skip non-playable content
• Clumsy input scheme
• Difficult to control actions in the game
• Does not provide enough information on game status
• Does not provide adequate training and help
• Command sequences are too complex
• Visual representations are difficult to interpret
• Response to user's actions not timely enough

Page 9

Setting up the Usability Problems of DEGs

• Many of the generic DGs usability problems may also apply to DEGs.

• Build upon the educational usability findings of educational software research.

• An extension towards the notion of DEGs' educational usability will be necessary when evaluating DEGs' usability, e.g.: does the combination of learning elements make the game boring? Is knowledge difficult to navigate or retrieve? Do schools lack the IT equipment needed to run 3D games (high-quality graphics cards)? Is too much time needed to learn basic knowledge? etc.

Page 10

Usability Evaluation Methods

• Coaching Method
• Co-discovery Learning
• Performance Measurement
• Question-asking Protocol
• Remote Testing
• Retrospective Testing
• Shadowing Method
• Teaching Method
• Thinking-Aloud Protocol
• Cognitive Walkthroughs
• Feature Inspection
• Heuristic Evaluation
• Pluralistic Walkthrough
• Perspective-based Inspection
• Focus Groups
• Interviews
• Logging Actual Use
• Questionnaires (QUIS, PUEU, NAU, NHE, CSUQ, ASQ, PHUE, PUTQ, USE)

From http://www.usabilityhome.com/

Page 11

Usability Evaluation of DEGs

Methodology: The Adoption, Adaptation, and Extension Principle

Page 12

Cognitive and Affective Variables

• Gender issues of the game perception?
• Cognitive overload?
  – The content dimension
  – The spent-time dimension
• Information retention:
  – Short term
  – Long term
• Like or dislike
• Engagement
• Addiction?
• Self-esteem
• Preference in comparison to commercial educational games during leisure time, i.e. is it an alternative? A supplement? Or no preference at all?
• Confidence in geography
• Motivation towards schooling

Page 13

Summary: Evaluating, How To?

Hybrid Evaluation Methods

• Data analysis of the game
• Pedagogical measurement methods and instruments, e.g. the Learning Style Inventory (LSI)
• Cognitive measurement instruments, e.g. memory tests, mind maps, IQ tests, etc.
• Quantitative methods
• Qualitative methods
• Usability instruments

Suggested Tools: Comparative Evaluations

[Diagram: a pre-test/post-test control-group design. Two equivalent groups take a pre-test; the experimental group receives the treatment while the control group does not; both then take a post-test, and the two groups' outcomes are compared for a different outcome.]

Page 14

Conclusions

• Objectives-based evaluation, i.e. identifying the objectives of the game from all perspectives and evaluating whether, and to what extent, they were met.
• In-game and out-of-game evaluation.
• Surveying the potential current tools and models of evaluation (pedagogical, usability, cognitive, etc.), then adapting and extending them; hence, developing novel models.
• Inventing new models from scratch when needed.
• Comparative evaluations when possible.
• Evaluation during the design process phase, as well as final-product evaluation.
• A hybrid evaluation approach, i.e. quantitative and qualitative.
• Embedding more science in explaining the evaluation findings, i.e. relating the findings to the pedagogical and cognitive theories, and explaining them from systems dynamics and game theory perspectives.
• There are many objectives and variables to be evaluated, and many evaluation experiments, methodologies, and models that can be designed or used: comprehensive work, but also a lot of high-quality publications!
• Robust and holistic evaluation models will be developed after gaining more insight into the project, contacting the partners, and getting a closer view of the associated multidisciplinary domains.