Contacting respondents for survey research: Is email a useful method?
Joanna d’Ardenne and Margaret Blake
November 2012
Background
Is email a useful way of contacting potential respondents about surveys?
• Might it help increase response?
• Might it reduce non-response bias?
• Might it save money?
• Might it allow more frequent contact on longitudinal studies?
The question
Background
Traditionally, advance letters have been used to contact respondents
But the way in which people communicate has changed:
• Use of letters is declining and nearly 50% of letters delivered are marketing
• Increasingly, public bodies communicate online
• Increase in postage costs
Internet and email use
2012 – 80% of the population use the internet
2010 – 90% of internet users use email
2012 – 67% of internet users use the internet every day
2011 – 45% connect to the internet using mobile devices
2011 – 27% submit forms online
2011 – 32% look at public authority websites
But…
Much email traffic is also marketing
Do people actually prefer to be contacted by letter?
Increasing use of social media:
60% of women in 2012
54% of men in 2012
Non-users are not random:
e.g. 36% of single people aged 65+ are internet users
2008 – highest-income households were more than three times as likely to have internet access as the lowest
2011 – 21% think they do not have sufficient knowledge to protect their personal data
Email addresses may be shared
Why contact respondents by email?
For follow-up study.
For next wave on a longitudinal study.
To provide feedback/ findings from study.
For panel maintenance – e.g. asking for address updates.
To inform respondents about appointments etc. mid-study.
Conducting quality checks.
Background
NatCen collected email addresses on some surveys
No standard question across surveys
Varying effectiveness in obtaining email addresses
Wanted to find out:
The best way of collecting email addresses
What respondents feel about it
The accuracy of email addresses
How respondents react when contacted by email
Three tests….
Test 1: Cognitive testing of email address requests
Test 2: Piloting email address requests on Omnibus
Test 3: Validation and response exercise
Cognitive testing of email requests
Background and aims…
The aims of the cognitive testing were to collect qualitative information on...
How respondents felt about being asked to provide an email address
Reasons why respondents may refuse to provide an email address
Methods
Cognitive testing of email requests 'piggy-backed' on to three different question-testing projects:
Understanding Society
Welsh Health Survey
European Social Survey
Requests administered in a 'real-life' context
Respondents thought requests were legitimate rather than part of the question testing
Afterwards respondents were asked a number of probes about their views on the email request
69 respondents took part in total
Sampling and recruitment varied by project
The questions tested…
EM1: From time to time NatCen contacts people for further research and may wish to contact you again about a future study. Would this be all right?
• Yes
• No

EM2: If we contact you to see if you are willing to help us again we may use email. Do you have an e-mail address we can contact you on? Your email address would only be used for research purposes and would not be passed on to anyone outside NatCen.
• Yes
• No – have not got email
• No – do not wish to give email address

EM3: What is your e-mail address?
Reasons for not providing an email…
All respondents understood the request:
Asking for an email address was not considered unusual or inappropriate
Reasons for reticence or refusal to provide an email address were:
Not having an email account or having limited access to a computer (generally from older respondents)
Concerns about the volumes of emails they might receive
Recommendation: Respondents could be frustrated if they receive too many emails on follow-up studies. Procedures need to detail the maximum number of requests we should send in a given time period. Interviewers could be provided with an instruction to reassure respondents about this if it is required.
Reasons for not responding to an email
Emails not being received at a convenient time
Emails being deleted from shared accounts:
Respondents described having joint email accounts with partners or family members. Other account users may delete emails, thinking they are unsolicited
Data security concerns
Recommendation: Ideally, follow-up emails should specify the name of the person to whom they are directed, to reduce the likelihood that they will be deleted or responded to by another person.
"How do I know that if you send me an email it has really come from you?"
Female, aged 36-59
Data security concerns
Respondents raised concerns about data security in relation to email requests to take part in surveys, i.e. that emails could be sent by anyone.
Recommendation: Other modes of contact should be considered alongside email to confirm the legitimacy of the email request.
Piloting email requests on the Omnibus
Background and aims…
The aims of this test were to pilot the email request in a real survey setting to ascertain…
what proportion of respondents provide an email address in a survey context
the characteristics of respondents who agree to give an email address:
Sex
Age group
Methods
Questions were piloted on the NatCen Omnibus Survey (2011)
Random probability PAF sample
Face-to-face CAPI interview with one adult in the household
Questions were asked at the end of the interview of 1,839 respondents
Questions tested were the same as those used in cognitive testing
Interviewers had to enter the email address twice as a form of check
If there was any difference between the two email address entries an error message appeared
Key findings 1
83% of respondents consented to some form of re-contact… Of these:
43% agreed to contact by email
34% did not have an email address
23% did not want to be contacted by email
Put another way, 35.5% of Omnibus respondents gave permission to be contacted by email
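The two ways of expressing the result reconcile arithmetically; a quick check using the percentages reported above (small differences are rounding):

```python
# Shares of all 1,839 Omnibus respondents, from the breakdown of responses.
permission_to_email = 0.355
no_email_address = 0.281
refused_email = 0.195

# Together these three groups are the ~83% who consented to re-contact.
recontact = permission_to_email + no_email_address + refused_email  # 0.831

# Expressed as shares of the re-contact group, they match the 43/34/23 split.
print(round(100 * permission_to_email / recontact))  # 43
print(round(100 * no_email_address / recontact))     # 34
print(round(100 * refused_email / recontact))        # 23
```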
Breakdown of responses to the email request (N=1,839, unweighted data):
Permission to email – 35.5%
Have not got email – 28.1%
Do not wish to give email address – 19.5%
No permission to re-contact – 16.9%
Don't know – 0.1%
Key findings 2
There was no clear pattern between sex and providing an email address
Amongst those willing to be re-contacted, older respondents were less likely to provide an email address:
56% of 16-24 year olds
27% of 65-74 year olds
16% of 75+ year olds
Reasons for not providing an email address varied by age…
Response to EM2 by age group
[Chart: proportions answering Yes, No email, and Not willing, by age group (16-24 through 75+). N=1,527, unweighted data.]
Validation and response exercise
Background and aims…
The aim of the validation and response exercise was to ascertain…
Percentage of emails given which were valid
The length of time that elapses between sending an email request and receiving replies
How many of the Omnibus respondents would read and respond to an email from NatCen
The characteristics of the respondents who respond to the email:
Sex
Age group
Methods
Omnibus respondents who provided an email address during their Omnibus interview were sent an email (n=544)
The email was sent 5-7 months after the Omnibus interview
The email asked the respondents to click on a link
Emails were sent via a bulk email distributor, 'Constant Contact'. We were able to monitor:
'Bounce-backs' and reasons for emails not being delivered
Whether respondents opened the email (even if they didn't respond)
Whether respondents clicked on the link
Results were collated after 30 days
Email content
From: NatCen Omnibus
Email heading: Thank you for taking part in our study

{Dear INSERT TITLE/SURNAME}
You recently took part in our 'Topical Issues' study. At the end we asked you for an email address we could contact you on. This was the first time we have done this and we would like to check whether our procedures for collecting and using email addresses work.
Please click on the link below to confirm you received this email:
http://www.natcen.ac.uk/emailtesting/index.htm
Clicking on the link confirms to us that the interviewer recorded your email address correctly. We are not asking you to take part in further research. We will not contact you again about this email address test.
Thank you for your help.
Jo d'Ardenne
Senior Researcher, National Centre for Social Research
Email: Joanna.d'[email protected]
For more information about the research we do please visit our website: www.natcen.ac.uk
Key findings 1
If emails were opened this tended to happen soon after the test email was sent:
53% of openings occurred on the same day the test email was sent
By the 7th day 89% of the openings had occurred
By the 14th day 98% of the openings had occurred
Respondents who clicked on the link typically did so on the same day they opened the email
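The day-by-day figures are cumulative shares of all openings. Given daily counts of first openings, they can be computed as below (the counts in the usage example are illustrative, not the study's data):

```python
def cumulative_open_pct(daily_opens):
    """Convert daily counts of first openings into the cumulative
    percentage of all openings observed by the end of each day."""
    total = sum(daily_opens)
    out, running = [], 0
    for count in daily_opens:
        running += count
        out.append(round(100 * running / total, 1))
    return out
```

For instance, `cumulative_open_pct([53, 27, 20])` yields `[53.0, 80.0, 100.0]`: over half of all openings on day one, then diminishing returns each day.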
Diminishing returns each day
[Chart: cumulative % of when each email was opened, plotted over days 1-29.]
Recommendation: Reminders, if used, should be sent 1-2 weeks after the initial email request. It would be useful to do further tests to establish the effect of reminders sent at different times using different modes.
Key findings 2
The majority of the emails sent were not opened…
Of the 544 emails sent:
10.5% were not delivered (bounce-backs)
51.8% were delivered but not opened
9.6% were opened but the respondent declined to click on the link
28.1% of validation respondents clicked on the link
Therefore, in total 10% of Omnibus respondents clicked on the link
Results of validation and response test (N=544):
Email not delivered – 10.5%
Email delivered but not opened – 51.8%
Email opened but link not clicked on – 9.6%
Email opened and link clicked on – 28.1%
Reasons for ‘bounce-backs’ 1
10.5% of emails were 'bounce-backs':
6% bounced due to non-existent email addresses
4% bounced due to other reasons (email was blocked, inbox was full, other unknown reasons)
Non-existent addresses could be due to two reasons:
Respondent error when giving the address (either accidentally or as a covert means of refusal)
Interviewer error in typing the address
Reasons for ‘bounce-backs’ 2
The 35 non-existent email addresses were checked by hand. Errors in commonly used domain names were spotted in 5 cases, e.g.
@otmail (meant to be @hotmail)
@couk (meant to be @co.uk)
Errors were entered twice, indicating the check question is not always working as intended
Recommendation: The procedures for briefing interviewers should be reviewed to stress the importance of accuracy. It could be worthwhile getting respondents to verify their own email addresses on the interviewer's laptop. Likewise it may be worth considering some form of incentive for interviewers collecting accurate email addresses.
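The hand-checking of domains could be partly automated by flagging keyed domains that are a single edit away from a commonly used one. A sketch of this idea — the list of common domains is illustrative, not the list the study used:

```python
def edit_distance(a, b):
    """Levenshtein distance via the classic dynamic programme."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                # delete from a
                           cur[j - 1] + 1,             # insert into a
                           prev[j - 1] + (ca != cb)))  # substitute
        prev = cur
    return prev[-1]

COMMON_DOMAINS = ("hotmail.com", "gmail.com", "yahoo.co.uk")  # illustrative

def suggest_domain(domain):
    """Return the likely intended domain when the keyed one is exactly
    one edit away from a common domain, else None."""
    best = min(COMMON_DOMAINS, key=lambda k: edit_distance(domain, k))
    return best if 0 < edit_distance(domain, best) <= 1 else None
```

A check like this could flag `@otmail` entries at the point of data entry rather than after bounce-backs.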
Emails not being opened
51.8% of emails were delivered but not opened…
We speculate that respondents either:
Did not see the email because it was sent to a junk email box via a spam filter; or
Ignored the email or deleted it without opening it.
Recommendation: As we have no control over respondents' spam filters, the strategy we should pursue is encouraging respondents to open the email in the first place. The email title will be important for this and could merit further investigation. It would be possible to run a split-ballot experiment where emails are sent with different titles to see what impact, if any, the title has on response.
Not clicking the link
The majority of respondents who opened the email went on to click on the link (approx. 75%).
9.6% of test respondents opened the email but did not click on the link.
We speculate reasons for not clicking the link could be:
Concern about the legitimacy of the request (i.e. whether the email was actually from NatCen)
Concern that responding could lead to additional email requests
Lack of interest
Opening the email at an inconvenient time (e.g. whilst working)
Recommendation: Further investigation could be done to look at whether email text can both reassure respondents and motivate them to take part.
Key findings 3
Males were more likely to open the email than females:
44% of males opened the email
34% of females opened the email
Although more younger people provided an email address, older people were more likely to open the email and reply to it…
27% of 16-24 year olds opened the email (with 16% replying)
51% of 65-74 year olds opened the email (with 49% replying)
There was a drop-off in opening and clicking in the 75+ age group…
Opening and clicking by age group
[Chart: percentage of respondents opening the email and clicking on the link, by age group (16-24 through 75+).]
Discussion…
Discussion 1
Could email be used in follow-up studies of PAF samples?
ONLY in combination with other methods
Is this cost effective?
Further work is required to establish the likely response rates to follow-up surveys by email:
Subject saliency
Incentives
Number of reminders
Mode of reminders (F2F/Telephone/Mail)
Variations in email title?
Discussion 2
Will findings transfer to longitudinal/panel surveys?
Cost effective?
What training should we give interviewers in relation to collecting email addresses?
Accuracy?
Reassurances?
How often should we email respondents?
What other new methods should we be considering?
Mobile devices
Social media
If you want further information or would like to contact the authors:
Joanna d'Ardenne
Senior Researcher
T. 020 7549 7108
E. joanna.d’[email protected]
Visit us online: www.natcen.ac.uk
Thank you