
Are We There Yet? Where Technological Innovation is Leading Research. Proceedings of the Association for Survey Computing, Volume 7. Edited by T. Macer et al. Compilation © 2016 Association for Survey Computing.


Gamifying. Not all fun and games

Phil Stubington, Charlotte Crichton

Abstract

There has been a lot of attention on engaging online participants in order to address declining response rates and the risk of poor-quality data. Participant availability via panels has plateaued, completion rates are falling, and many surveys appear poorly designed. This raises real concerns about the representativeness of participants recruited from online panels or client email lists, and about the quality of their data. This paper reviews evidence from experiments we have run using gamification, presents examples of the latest gamification techniques, and considers the practical challenges of implementing gamified surveys for agencies, clients and panel providers.

Keywords

Gamification; engagement; online research; response rates

1. Introduction

The challenge facing market research

Maximising the participant experience, and thus response rates and data quality, is not a new topic, and engaging participants has always been vital to market research. Furthermore, there is a strong body of evidence suggesting that this task is becoming harder, not easier, over time. The clearest indicator is that response rates are falling in most markets. According to analysis published by Pew Research, the response rate of a typical telephone survey in the United States was 36 percent in 1997 but just 9 percent in 2015. In the United Kingdom, a paper by Ipsos MORI as long ago as 2008 reported that the response rate for the National Readership Survey had fallen from 73.4 percent in 1974 to 51.6 percent (although the situation does now appear to have stabilised, with only a further two percent decline since then). Similar analysis of the UK Labour Force Survey (Barnes, Bright & Hewat, 2008) has shown a 21 percent decline in response rates over a fifteen-year period.

The move to online research should have altered the paradigm for response rates since, at least for panel providers such as Lightspeed GMI, Toluna, Research Now and SSI, participants are effectively pre-screened for their willingness to cooperate by the act of signing up for panel membership. However, online research still has to compete with the numerous other interesting things the internet has to offer; online samples often vary hugely in how they are recruited, and the approaches used to contact participants differ widely in efficacy. Furthermore, online survey invitations have to convince the participant to cooperate without the intervention of an interviewer who can answer questions about privacy, data protection or the purpose of the survey.


Data concerning online survey response rates is inconclusive, for many reasons including (but not limited to) commercial sensitivity amongst the major panel providers, difficulties in calculating a response rate for some types of recruitment such as river sampling¹, and the wide variety of sample sources. In a literature review of online response rates, Schonlau, Fricker & Elliott (2002) commented, “studies on the use of the Web as a response mode vary widely in terms of the nature of their target populations, how participants are recruited, and whether any attempts at statistical adjustment are made in the studies’ analyses.” The paper pointed out that even amongst broadly similar surveys conducted by the U.S. Census Bureau, response rates ranged from 27 percent to 75 percent. A 2008 paper by Baruch and Holtom reported an equally wide range of response rates in the UK: the average response rate for studies that utilised data collected from organisations was 35.7 percent, with a standard deviation of 18.8 percent.

An admittedly unscientific web search by the authors of this paper suggests that response rates between 15 percent and 25 percent are increasingly regarded as the norm for online surveys, implying that online surveys have joined their telephone and face-to-face counterparts in being challenged by poor response rates.
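
To put the arithmetic behind these figures in concrete terms, the sketch below shows a minimal response rate calculation in Python. The four-way breakdown of the denominator is our own simplifying assumption; formal schemes such as AAPOR's partition non-respondents more finely, and for recruitment methods like river sampling the denominator is often unknowable, which is precisely why published rates are hard to compare.

    def response_rate(completes, partials, refusals, non_contacts):
        """Completed interviews divided by all eligible sample units.
        The four-way denominator is a simplifying assumption; formal
        definitions (e.g. AAPOR's) partition non-respondents more finely."""
        eligible = completes + partials + refusals + non_contacts
        return completes / eligible

    # Hypothetical fieldwork counts: 180 completes out of 1,000 eligible
    # invitations gives 18 percent, within the 15-25 percent range above.
    print(f"{response_rate(180, 70, 450, 300):.1%}")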

The industry’s response

Because of this trend, and the need to ensure high-quality data more generally, a number of the major panel providers and various research agencies have invested significant time and effort in testing the impact of improving questioning techniques and/or enhancing the visual appearance of surveys.

The work of Jon Puleston of Lightspeed GMI is probably best known in this regard, but many conference papers have now been presented on this topic, which between them provide a considerable body of (not wholly consistent) evidence.

In industry parlance, these approaches have tended to be grouped under the umbrella heading of ‘gamification’. Mavletova (2014) summarised the main elements of a gamified survey as: (1) stating clear rules and goals for the participants; (2) involving participants with a relevant and entertaining narrative; (3) maintaining motivation by providing interesting and achievable tasks or quests; and (4) giving feedback on progress and rewards for accomplishing tasks and answering questions.
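
To make these four elements concrete, the sketch below models them as a simple survey specification in Python. It is purely illustrative: the field names and scoring scheme are our own assumptions, not drawn from Mavletova's paper.

    from dataclasses import dataclass, field

    @dataclass
    class GamifiedSurvey:
        """Toy mapping of Mavletova's four elements onto fields;
        all names and values here are illustrative assumptions."""
        rules_and_goals: str                         # (1) clear rules and goals
        narrative: str                               # (2) relevant, entertaining story
        quests: list = field(default_factory=list)   # (3) achievable tasks or quests
        points_per_quest: int = 10                   # (4) rewards for completion

        def complete_quest(self, quest):
            """Return progress feedback when a quest is finished."""
            done = self.quests.index(quest) + 1
            return (f"Quest '{quest}' complete! +{self.points_per_quest} "
                    f"points ({done}/{len(self.quests)} quests finished)")

    survey = GamifiedSurvey(
        rules_and_goals="Answer five quick challenges to reach expert level",
        narrative="You are a shopper hunting down the best-value cartridge",
        quests=["brand recall", "last purchase", "usage diary"],
    )
    print(survey.complete_quest("last purchase"))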

In practice, gamification is often conflated with the use of a variety of techniques to improve the aesthetics of a survey. However, this need not be the case, since many gamification techniques can be applied to plain text questions (Cape, 2016). Equally, it is possible to improve the visual appearance of a survey, for example by using images to replace answer lists, without drawing on game techniques. For the remainder of this paper we use, for convenience, the term gamification in its widest sense, covering both question wordings and the visual aesthetics of surveys.

¹ Where potential participants are recruited through pop-ups and promotions on various websites, with the survey normally undertaken immediately.


2. Our evidence to support gamifying surveys

Why we decided to run our own experiment

Gamification as a means of improving online surveys has been discussed for almost a decade, and numerous experiments have examined its impact on the participant experience.

As early as 2008, Puleston & Sleep reported the following benefits (a sketch of how the first might be measured follows this list):

1. Less straight-lining: up to 80 percent lower levels in some experiments.

2. Lower neutral scoring: on average 25 percent lower.

3. Lower dropout (if questions are designed ergonomically): reduced from 5 percent to 1 percent in test experiments.
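
Straight-lining means giving an identical answer to every item in a rating grid. The sketch below shows one simple way it might be operationalised in Python; the threshold (all answers identical) and the data layout are our own assumptions, as definitions vary between studies.

    def straightlining_rate(grid_responses):
        """Share of participants whose answers to a rating grid are all
        identical -- one simple operationalisation of straight-lining."""
        flat = [r for r in grid_responses if len(set(r)) == 1]
        return len(flat) / len(grid_responses)

    # Hypothetical 5-point grid answers from four participants:
    grids = [
        [3, 3, 3, 3, 3],  # straight-liner
        [4, 2, 5, 3, 4],
        [1, 1, 1, 1, 1],  # straight-liner
        [2, 3, 2, 4, 3],
    ]
    print(f"{straightlining_rate(grids):.0%} straight-lined")  # 50%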

Gamification has received considerable exposure at industry conferences and a significant number of papers have been presented concerning its effectiveness. Puleston (2012) described it as “the most powerful and effective means we have ever come across to encourage participants to put more thought and effort into taking part.”

Nevertheless, as researchers we should always be aware of publication bias, and at least one paper, by Koenig-Lewis, Marquet & Palmer (2013), has cast doubt on a number of the claimed benefits of gamification, including improved response rates and elements of the participant experience. Even its leading advocate has commented: “There is also the difficulty of squaring off the objectives of a piece of research with the objectives of a game. Often, when we have thought about putting some of these ideas into practice, we have found the two can lead you in different directions” (Puleston, 2012).

Additionally, survey context is important, and examination of the published case studies and conference papers led us to conclude that there are certain types of survey where the evidence base for gamification is weaker, specifically:

• B2B surveys (or mixed B2B and consumer surveys)

• Global surveys involving developing markets

• Surveys that need to capture accurate behavioural data rather than brand perceptions (which by their nature are less tangible).

Since these surveys constituted a significant proportion of our online research, we felt that the evidence base was still insufficient to recommend this approach to many of our largest clients.

Our experiment

Therefore, working with Lightspeed GMI, we commissioned an experiment based on an existing project that includes a mixture of consumer and B2B interviews, is global in nature, and whose main objective is to provide detailed behavioural information. In addition, we believed this survey presented a particular challenge, as the topic (purchase and usage of printer ink or toner cartridges) is of little intrinsic interest.