LAK16 privacy and analytics (2016)

http://bit.ly/DELICATE

Transcript of LAK16 privacy and analytics (2016)

Page 1: LAK16 privacy and analytics (2016)

http://bit.ly/DELICATE

Page 2: LAK16 privacy and analytics (2016)
Page 3: LAK16 privacy and analytics (2016)

Learning Analytics

HE Managers see:
• Promise vs. Concerns
• Potential vs. Risks
• Benefits vs. Cost
• Purpose vs. Competitive Pressures
• Intentions vs. Hesitations

• Leading to Confusion

HEIs: How to implement LA?

Page 4: LAK16 privacy and analytics (2016)

Institutionalising LA

Recent setbacks in education:
• (1) inBloom
• (2) Snappet

Acceptance factors: data subjects must be sufficiently aware of the consequences of using the system, the results obtained must be valid and relevant, and the data model must be transparent.

Page 5: LAK16 privacy and analytics (2016)


Privacy as Show Stopper for LA

inBloom:
• $100 million investment
• Aim: Personalized learning in public schools, through data & technology standards
• 9 US states participated; in 2013, data about millions of children had been stored

Page 6: LAK16 privacy and analytics (2016)


Privacy as Show Stopper for LA

Ignoring the fears and public perception of the application of analytics can lead to a lack of acceptance, protests, and even failure of entire LA implementations.

Page 7: LAK16 privacy and analytics (2016)


Related Research Work

• Prinsloo & Slade (2013)
• Slade & Prinsloo (2013)
• Pardo & Siemens (2014)
• Prinsloo & Slade (2015)
• Hoel & Chen (2015)
• Sclater & Bailey (2015)
• Steiner, Kickmeier-Rust & Albert (2015)

Page 9: LAK16 privacy and analytics (2016)


Related Policies

• HEI law on data collection in NL, as in all EU countries since the Nuremberg trials (data collection allowed to improve education; clear purpose, consent, limited access)

• Engelfriet, A., Jeunink, E., & Manderveld, J. (2015). Learning analytics onder de Wet bescherming persoonsgegevens [Learning analytics under the Dutch Personal Data Protection Act]. https://www.surf.nl/kennis-en-innovatie/kennisbank/2015/learning-analytics-onder-de-wet-bescherming-persoonsgegevens.html

• SURF follow-up report with use cases in preparation

Page 10: LAK16 privacy and analytics (2016)

Research Ethics

Origins lie in the post-WWII era. Milestones:

• Nuremberg Code (1949)
• Helsinki Declaration (1964)
• Belmont Report (1978)
• 2000s: Biomedical Science
• RRI (Responsible Research and Innovation)

Page 11: LAK16 privacy and analytics (2016)

Facebook Study

Page 12: LAK16 privacy and analytics (2016)

Privacy

• The right to be left alone (Westin, 1968)
• Informational self-determination (Flaherty, 1989)
• Informational, decisional, and local privacy (Roessler, 2005)

• Privacy is not anonymity or data security!

Page 13: LAK16 privacy and analytics (2016)

Privacy

Contextual Integrity vs. Big Data Research

Context-bound information vs. repurposing of data

Page 14: LAK16 privacy and analytics (2016)

Who are the Bad Guys?

[Diagram: "ME & MY DATA", surrounded by: Government? Commerce? Education? Hackers & Bad Guys?]

Page 15: LAK16 privacy and analytics (2016)

Legal Frameworks

• EU Data Protection Directive 95/46/EC (automated processing of 'personal data'); 2016: General Data Protection Regulation (GDPR)

• Restricting the (re-)use of data stands in tension with, and is contradicted by:
  • Big Data business models
  • the European Data Retention Directive 2006/24/EC (data storage for security purposes)

Page 16: LAK16 privacy and analytics (2016)


Legal Frameworks

Page 17: LAK16 privacy and analytics (2016)


Legal Frameworks

Page 18: LAK16 privacy and analytics (2016)

Fears

• Power-relationship, user exploitation
• Data ownership
• Anonymity and data security
• Privacy and data identity
• Transparency and trust

Page 19: LAK16 privacy and analytics (2016)

Power-relationship

• Tracking and identifying users or citizens by the state or by corporations (e.g. insurance companies, banks, car manufacturers) benefits those parties, not the user!

• The power relationship is asymmetrical

Page 20: LAK16 privacy and analytics (2016)

Exploitation

• Free labour as the business model of for-profit companies

• Crowdsourcing outside the "commons"

Page 21: LAK16 privacy and analytics (2016)

Data Ownership

Page 22: LAK16 privacy and analytics (2016)

Data Ownership

Page 23: LAK16 privacy and analytics (2016)

Data Ownership

Page 24: LAK16 privacy and analytics (2016)

Anonymity and Data Security

• No absolute anonymity or de-identification

• Integrating multiple data sources increases the risk of re-identification and compromised personal identity (see the sketch below)

• Data stores are not 100% secure
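A minimal sketch (not from the slides) of why "no absolute anonymity" holds in practice: merging two data sources on shared quasi-identifiers can re-attach names to records whose direct identifiers were removed. All column names and values are hypothetical, and the example assumes pandas is available.

```python
# Illustrative linkage-attack sketch; all column names and values are hypothetical.
import pandas as pd

# An "anonymised" learning-analytics export: names removed, quasi-identifiers remain.
lms_export = pd.DataFrame({
    "zip_code":   ["1012", "1012", "3511"],
    "birth_year": [1996, 1998, 1997],
    "gender":     ["F", "M", "F"],
    "avg_grade":  [8.2, 6.5, 7.1],
})

# A second source (public registry, social network, leak) that still carries names.
public_source = pd.DataFrame({
    "name":       ["Alice", "Bob"],
    "zip_code":   ["1012", "1012"],
    "birth_year": [1996, 1998],
    "gender":     ["F", "M"],
})

# Joining on the shared quasi-identifiers re-attaches names to the "anonymous" grades.
reidentified = lms_export.merge(public_source, on=["zip_code", "birth_year", "gender"])
print(reidentified[["name", "avg_grade"]])
```

The more sources are combined, the more quasi-identifier combinations become unique, which is why de-identification alone is not a sufficient safeguard.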

Page 25: LAK16 privacy and analytics (2016)

Privacy and Data Identity

• System identity vs. social identity

• People are approximated onto data models by probability (see the sketch below)

• Power: data subjects have no say in the design of the data model

Arora, P. (2016). Bottom of the data pyramid: Big data and the global South. International Journal of Communication.
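A minimal sketch (not part of the slides) of what "approximated onto data models by probability" can look like: a fixed logistic scoring function, with hypothetical proxy features and weights that the data subject never chose, maps a student onto a single risk probability.

```python
# Illustrative only; the features, weights, and interpretation are hypothetical.
import math

def dropout_risk(logins_per_week: float, avg_grade: float, forum_posts: int) -> float:
    """Map a student onto a probability-like dropout-risk score via a fixed logistic model."""
    # The data subject had no say in which proxies are used or how they are weighted.
    z = 2.5 - 0.6 * logins_per_week - 0.15 * avg_grade - 0.1 * forum_posts
    return 1.0 / (1.0 + math.exp(-z))

# A student who works mostly offline: good grades, little platform activity.
score = dropout_risk(logins_per_week=1, avg_grade=8.0, forum_posts=0)
print(f"dropout risk: {score:.2f}")  # scores above 0.5 here, despite doing well
```

The system identity ("at risk") and the social identity (a student who simply studies offline) diverge because the model only sees the proxies it was designed around.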

Page 26: LAK16 privacy and analytics (2016)

Transparency and Trust

• Assumption: more transparency = more trust
• But: the relationship is mostly asymmetrical (individual vs. big corporation)
• Transparency as an instrument of control

Page 27: LAK16 privacy and analytics (2016)

Introducing: DELICATE

Page 28: LAK16 privacy and analytics (2016)

Introducing: DELICATE

Page 29: LAK16 privacy and analytics (2016)

Introducing: DELICATE

Page 30: LAK16 privacy and analytics (2016)

Introducing: DELICATE

Page 31: LAK16 privacy and analytics (2016)

Call for Papers

Special Issue: Journal of HE Development (ZfHE)

Learning Analytics: Implications for Higher Education

http://bit.ly/1qXTaNz
Deadline: 10 June 2016
Publication date: Spring 2017