Big Data: Law, Regulation & Ethics Professor Karen Yeung, Director, Centre for Technology, Ethics Law & Society (TELOS) The Dickson Poon School of Law


Centre for Technology, Ethics, Law & Society

• TELOS established in 2007 within the Law School

• Research concerned with exploring the legal, regulatory and ethical dimensions of emerging technology

• Cross-cutting, interdisciplinary research agenda: thus, our focus is not specific to a particular type of technology, policy sector or field of law

• http://www.kcl.ac.uk/law/research/centres/telos/index.aspx

B. Legal, democratic and ethical dimensions of Big Data

• General

• Domain specific

• Public Policy (‘impact’) activities

C. Big data technologies and their implications for law
Mr Perry Keller ([email protected])

Data privacy in relation to governmental and commercial surveillance in urban environments

Given the diverse technologies involved in the production, collection and utilisation of personal data, this project has developed along several lines, including

• biometric identification
• 'internet of things'
• cloud computing
• data analytics
• artificial intelligence

Project focus: privacy and data protection, in terms of:

• the agency of data subjects, or
• the governance of personal data processing.

First working paper will be posted on SSRN shortly, two more to follow

IP law implications for Big Data
Prof. Tanya Aplin ([email protected])

How do IP rights inhibit the Big Data revolution?

(a) Copyright as an obstacle to data mining: Access to data sets, and the application of data mining techniques to available data sets, may be obstructed by existing copyright law and database rights. The fragmented and technical nature of IP rights often makes it difficult to identify the right holder and/or the scope of rights in particular works/data sets.

• Digital content publishers want to charge those who wish to mine their data
• But there is strong opposition from those wishing to mine, on the basis that they have already paid for access rights
• Recent legal reforms (ie the Text and Data Mining exception in copyright law & the proposed EU Directive on the Digital Single Market) erode the opportunity to monetise, although the exception covers only non-commercial uses (eg academic)

(b) Patents and data mining software: Debate about whether data mining software tools themselves can be patented. Concerns that if they are patentable, this confers a powerful monopoly over data mining techniques that might impede the development of big data

Big Data Governance: Law, ethics & regulation
Prof. Karen Yeung ([email protected])

Algorithmic accountability

General research interest: governance of emerging technologies

In relation to Big Data, my core concern focuses on the need for ‘algorithmic accountability’

Premise: the turn to data-driven decision-making and decision-support, powered by Big Data, entails the rise of ‘algorithmic power’.

For lawyers, wherever power lies, there lies the potential for misuse and abuse.

Hence, several completed and on-going projects

Algorithmic manipulation of choice environments

Two primary ways in which Big Data decision-making technologies operate:
• fully automated decision-making systems, or
• digital 'guidance' offered to human decision-makers (eg general search engines, product recommendation engines)

I argue that digital guidance techniques constitute a 'soft' form of design-based regulation that relies upon the use of 'nudge'

Nudge = a particular form or aspect of choice architecture that alters people's behaviour in a predictable way without forbidding any options or significantly changing their economic incentives (Thaler & Sunstein 2008). Big Data nudges constitute a particularly powerful and potent form of control owing to their networked, continuously updated, dynamic and pervasive nature, which I therefore refer to as 'hypernudging'

I examine the manipulative nature of these techniques, drawing on critiques from liberal theory and from science and technology studies (STS), and suggest that these legitimacy concerns cannot be resolved by reliance on user notice and consent

Conclude: troubling implications for democracy and human flourishing if we fail to establish and maintain effective and legitimate constraints on the use of Big Data analytic techniques driven by commercial self-interest

Yeung, Karen, "'Hypernudge': Big Data as a Mode of Regulation by Design", Information, Communication & Society (2017), Volume 20, Issue 1 (Special Issue on the Social Power of Algorithms)

Big data governance

Aim: to provide a critical evaluation of alternative models of data governance – but I realised that this was premature: we are only just beginning to wrestle with the challenges of data governance in an era of continuous, networked, ubiquitous computing.

The existing legal literature is enormously frustrating, because it typically takes the 'fair information principles' upon which contemporary data protection laws are based as its starting point

But, for a regulatory governance scholar, this is not terribly helpful. It is more fruitful to begin by identifying and understanding the overarching social goals that the governance of personal data seeks to achieve. Better to proceed as follows. First:

(a) identifying the risks/harms associated with algorithmic processing of personal data

(b) clarifying the benefits that such processing offers (these need to be proved in specific contexts and, where possible, quantified – ie not just empty rhetoric)

(c) identifying how those risks-benefits are distributed across society (social justice)

Second, identify the core normative values implicated in the algorithmic processing of personal data, and consider those values in light of the benefits and risks (and directly confront inevitable value trade-offs)

So, my general approach has spawned a set of more specific projects:

(a) Understanding algorithmic harm:

Starting point: growing recognition that algorithms, when used to make or drive decision-making, may have serious adverse impact on individuals (Bill Davidow’s ‘algorithmic prisons’ – The Atlantic 2014)

Aim: to acquire a clearer understanding of the nature of algorithmic power, and the potential harms that such power may generate.

Constructs an analytic framework for understanding the harms associated with algorithmic decision-making systems, as a first step towards identifying what kind of robust, legitimate and effective regulatory governance mechanisms are needed to secure meaningful algorithmic accountability.

Status: writing up stage

(b) Algorithmic regulation: A critical examination

Aim: to understand the phenomenon of 'algorithmic regulation' as a form of social ordering

Offers a taxonomy of different forms of algorithmic regulation (identifying 8 different forms)

Analyses algorithmic regulation primarily as a form of social ordering, adopting several analytical lenses, primarily:

• regulatory governance studies
• surveillance studies and political economy
• legal critiques of algorithmic decision-making

Status: manuscript currently under peer-review

D. Domain specific applications

1. Big Data and the digital consumer: online personalised pricing

Dr Chris Townley ([email protected]), Prof Karen Yeung & Mr Eric Morrison (FCA)

Big data analytic techniques raise the possibility of 'personalised' pricing – where the prices offered for the same goods vary depending upon the user's algorithmically created digital profile, based on detailed tracking of the individual's online behaviour and preferences

In 2000, Amazon.com sold DVDs to different people at different prices. Public outcry prompted Amazon to claim it was merely a test and to refund the price difference to those who paid more

In 2012, a Wall Street Journal investigation found that the Staples Inc. website displayed different prices after estimating users' locations: if the user lived within 20 miles of a rival store (either OfficeMax Inc. or Office Depot Inc.), Staples.com usually showed a discounted price.

The WSJ also reported that travel site Orbitz offered higher prices (and higher-end hotels) to customers who visited its site from a Mac rather than a PC, on the assumption that, because Apple Macs are more expensive than PCs, Mac users are probably wealthier
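The Staples-style rule described above can be sketched in a few lines of code. This is a hypothetical illustration only – the function name, threshold and discount rate are assumptions, not Staples' actual logic; it simply shows how a single estimated attribute (distance to a rival store) can drive a different price for the same good:

```python
# Hypothetical sketch of a location-based personalised pricing rule.
# The idea: discount the listed price only when a rival store is nearby,
# on the theory that nearby competition makes the customer price-sensitive.

def personalised_price(list_price: float,
                       miles_to_nearest_rival: float,
                       rival_radius_miles: float = 20.0,
                       discount: float = 0.10) -> float:
    """Return the price shown to this user.

    If the user's estimated location is within `rival_radius_miles` of a
    rival store, apply `discount`; otherwise show the full list price.
    """
    if miles_to_nearest_rival <= rival_radius_miles:
        return round(list_price * (1 - discount), 2)
    return list_price

# Two users, same product: the one near a rival sees a lower price.
print(personalised_price(100.0, 5.0))   # 90.0
print(personalised_price(100.0, 50.0))  # 100.0
```

The sketch makes the regulatory point concrete: the price differential is invisible to each individual user, since neither knows what the other was shown.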

Aim: critical evaluation of big data driven price discrimination primarily through the lens of EU competition law and policy, from the perspective of:

• Economic efficiency: does online personalised pricing enhance or diminish aggregate welfare, conventionally understood in terms of economic efficiency?

• Equity or fairness: is online personalised pricing fair? Here we draw on insights from social psychology and marketing studies concerned with investigating social perceptions of fairness in pricing, as well as rights-based conceptions of 'unfair dealing' and considerations of social justice, including its potential to discriminate against historically marginalised groups

Status: draft complete, eliciting feedback from colleagues prior to submission to peer-review journal

AI, the law and the legal profession
Prof Karen Yeung and Ms Kristina Caka (King's undergraduate research fellow)

Aim: to explore the implications of the on-going ‘AI boom’ for the future of the legal profession, and for the practice and understanding of the law

1. Seeks to identify the way in which AI tools have been developed to assist in the provision of legal services, and those that are currently under development

2. Constructs a typology of the different tools, and seeks to identify the various drivers that are motivating their use and development

3. Considers the implications of the turn to AI tools in the provision of legal services for the nature, character and understanding of law itself

Status: basic research complete. Constructing analytic framework

AI in healthcare: regulation of clinical decision support systems

Prof Karen Yeung

Aim: to critically examine the promises and risks associated with the turn to AI and predictive analytic techniques that utilise massive volumes of health-related data to aid clinical decision-making.

Argues that:

1. These systems create new sources of risks which may threaten patient safety, but which have largely been overlooked in the hype currently surrounding the big data phenomenon

2. In particular, the highly sophisticated nature of these big data-driven health technologies makes providing the oversight and governance mechanisms needed to safeguard against these risks especially challenging

3. That conventional medical device regulation is not up to the task of adequately regulating those risks

4. We need new, dynamic, flexible and robust regulatory oversight and governance mechanisms that ensure multi-stakeholder involvement and testing at all stages of the development, testing, implementation and post-implementation surveillance of these technologies, in order to provide the necessary assurance that they meet acceptable standards of clinical quality, safety and social interoperability.

Status: first draft complete, under revision in light of colleagues’ feedback (stalled)

Data analytics in crime control & prevention

Prof Karen Yeung, Prof Ben Bowling ([email protected]) & Ms Shruti Iyengar

Aim: to explore how big data analytics are being used for, and in support of, decision-making in the realm of crime monitoring, investigation and enforcement.

Qn: how is data used to produce 'actionable insight' which is then translated into decisions in the criminal justice system that affect the rights, interests and expectations of individuals in ways that may adversely affect them?

We propose to examine the monitoring and enforcement technologies employed in the following contexts:

1. street crime, particularly the use of body-worn cameras by police officers;
2. road traffic offences, and the range of technologies employed to monitor motor vehicles and enforce traffic regulations;
3. tax evasion and social security fraud (the former predominantly understood as crimes of the rich, the latter largely as crimes of the poor)

Status: basic research stage

Public Policy and ‘impact’ activities

Prof Karen Yeung
British Academy - Royal Society Project on Data Governance
https://royalsociety.org/topics-policy/projects/data-governance/

Prof Roger Brownsword ([email protected])
Royal Society – Machine Learning project
https://royalsociety.org/topics-policy/projects/machine-learning/