The mutual shaping of law, technology,
norms, and society
The example of robotics
Prof. Bert-Jaap Koops TILT - Tilburg Institute for Law, Technology, and Society
[email protected] http://www.robolaw.eu
10 Dimensions of Technology Regulation
source: Koops, B.J. (2010), ‘Ten dimensions of technology regulation. Finding your bearings in the research space of an emerging discipline’, in: M.E.A. Goodwin et al. (eds), Dimensions of Technology Regulation, Nijmegen: WLP, p. 309-324, http://ssrn.com/abstract=1633985
5. Regulation Type
Koops-Dimensions of Technology Regulation
• tool-box: law, social norms, market, architecture
• regulatory pitch: command-and-control, soft sister, pragmatic/rational
• regulatory range: negative (stick), positive (carrot), neutral
• appearance: constitutional rule, statutory rule (civil law), precedent (common law), contractual term, General Terms & Conditions, code of practice/conduct
• bindingness: fundamental rights, statutory legislation, case-law, soft law, non-binding rule
• legal tradition: common law, civil law
• actor: legislator, public executive body (quango), NGO, standardisation body, business, consumer organisation, patient organisation, public-private partnership
• legal area: constitutional law, public international law, criminal law, administrative law, environmental law, contract law, tort law, intellectual-property law, private international law, labour law, disability law, health law, etc.
6. Normative Outlook
• ethical school: utilitarianism, Kantianism, communitarianism, ethics of the good life
• human rights: privacy, non-discrimination, freedom of thought, right to health, etc.
• fundamental values: autonomy, liberty, accountability/responsibility, privacy, human dignity, equality, etc.
• fundamental concepts: property, personhood, integrity, etc.
• risk attitude: risk-averse, risk-tolerant, uncertainty tolerance
9. Problem
• regulation purpose: protection of vulnerable people, ordering socio-economic relations, providing legal certainty, incentivising innovation, distributing responsibilities
• problem definition: descriptive, normative, exploratory, hypothesis testing
• research aim: enhancing understanding, solving a practical problem, solving a theoretical problem, feeding policy, guiding policy
• research methods: desk research, normative legal analysis, comparative legal research, case-law survey, interviews, case studies, survey
Co-evolution of technology and society
6 November 2013
[Diagram: Technology ↔ Society]
The role of the law
[Diagram: Law, Society, Technology]
The TILT mutual shaping perspective
[Diagram: triangle of Regulation, Normative outlooks, Technology developments]
The interplay between regulation, technology, and normative outlooks [from: TILT Research Programme 2009-2013, v. 1.01]
The TILTed mutual shaping perspective
[Diagram: triangle of Technology developments, Normative outlooks, Regulation; society]
The interplay between regulation, technology, and normative outlooks [adapted from: TILT Research Programme 2009-2013, v. 1.01]
The TILT(s) mutual shaping perspective
[Diagram: triangle of Technology developments, Normative outlooks, Regulation]
The interplay between regulation, technology, and normative outlooks [adapted from: TILT Research Programme 2009-2013, v. 1.01]
How to apply this TILT triangle?
An example: robotics and human rights
[Diagram: Robotics; Human-rights theory (normative assumptions on humans and rights); Human-rights law]
1. Application level
Figure 1. The application level of human-rights implications of robotics
[Diagram: Robotics; Human-rights theory (normative assumptions on humans and rights); Human-rights law]
2. Assessment level
Figure 2. The assessment level of human-rights implications of robotics
[Diagram: Robotics; Human-rights theory (normative assumptions on humans and rights); Human-rights law; arrows 2a, 2b]
3. Reflection level
Figure 3. The reflection level of human-rights implications of robotics
[Diagram: Robotics; Human-rights theory (normative assumptions on humans and rights); Human-rights law]
Mutual shaping in robolaw
Figure 4. Three levels of human-rights implications of robotics
[Diagram: Robotics; Human-rights theory (normative assumptions on humans and rights); Human-rights law; arrows 2a, 2b (assess)]
(1) Examples of questions at the application level

Privacy
- Surveillance drone: Is use of a surveillance drone compatible with the right to privacy?
- Visual implant: Do visual implants make recordings of signals they receive, and if so, is there a legitimate basis for this?
- Robotics in general: Can robots be designed in a privacy-compliant way?

Non-discrimination
- Surveillance drone: Are surveillance drones used in a discriminatory way, e.g., only monitoring areas in which mostly migrants live?
- Visual implant: Can a visually impaired person be rejected for a job if she refuses to take a visual implant?
- Robotics in general: Is equal access to new robotics applications ensured?

Human rights in general
- Surveillance drone: What are the human-rights implications of surveillance drones under the ECHR?
- Visual implant: What are the human-rights implications of visual implants used for non-medical purposes?
- Robotics in general: How is robotics regulated under the Italian Constitution?
[reminder: 2. Assessment level]
Figure 2. The assessment level of human-rights implications of robotics
[Diagram: Robotics; Human-rights theory (normative assumptions on humans and rights); Human-rights law; arrows 2a, 2b]
(2a) Examples of questions at the level of assessment of robotics in light of human-rights law & theory

Privacy
- Surveillance drone: How should surveillance drones be designed to make them privacy-compliant ("privacy by design")?
- Robotics in general: Do robot developers have a duty to build safeguards into robots to prevent privacy violations?
- New robots: Can physical robots or softbots be developed to stop the gradual erosion of privacy in physical and digital environments?

Non-discrimination
- Surveillance drone: Should the use of surveillance drones by employers to monitor employees on factory premises be prohibited or otherwise regulated?
- Robotics in general: Will the development of carebots lead to discrimination against elderly people who cannot afford to buy human care?
- New robots: Can robots be developed for use at customs control that make 'colour-blind' risk assessments about which passengers should be investigated?

Human rights in general
- Surveillance drone: Under which conditions should governments be allowed to use surveillance drones, in light of human rights?
- Robotics in general: What are the major threats that robotics poses to the protection of human rights?
- New robots: Are there opportunities in robotics to better protect human rights?
(2b) Examples of questions at the level of assessment of human rights

Privacy
- Surveillance drone: Is existing law on making photographs in public places adequate in light of developments in drones?
- Robotics in general: Is the reasonable expectation of privacy affected by developments in robotics, and if so, how should we evaluate that?

Non-discrimination
- Surveillance drone: Should we adapt our interpretation of fair treatment if surveillance drones allow us to discover new distinctions between groups of people?
- Robotics in general: Should we worry about non-enhanced humans being discriminated against in favour of enhanced humans?

Human rights in general
- Surveillance drone: Do criminal-procedure law and administrative-procedure law provide sufficient safeguards for human-rights protection if the government starts using surveillance drones on a large scale?
- Robotics in general: Is the European Charter of Fundamental Rights up to date in light of developments in robotics?

New human rights
- Surveillance drone: Do we need a constitutional 'right to be forgotten' if surveillance drones would allow the ubiquitous tracking and publishing online of individuals' movements?
- Robotics in general: Should we introduce a right to non-enhancement, or a right to imperfection, if robotics puts pressure on people to enhance themselves?
(3a) Examples of questions at the level of reflection on robotics in light of normative outlooks

Normative framework
- Surveillance drone: Does a human-rights analysis of surveillance drones lead to different regulatory outcomes than a utilitarian analysis?
- Robotics in general: Which robotics inventions fit in an ethics of the good life, and which regulatory implications does this have?

Human-rights theory
- Surveillance drone: Which positive obligations does the state have to regulate the use of surveillance drones in horizontal relationships?
- Robotics in general: Can a duty to embrace "value-sensitive design" in robotics be grounded in human-rights arguments?

Fundamental values (autonomy, human dignity, ...)
- Surveillance drone: Do surveillance drones have a panoptic effect, and if so, does this threaten the autonomy of citizens?
- Robotics in general: Which guidelines can be given for the development and use of robots so that they are compatible with, and where possible enhance, the value of human dignity?
(3b) Examples of questions at the level of reflection on human rights

Normative framework
- Surveillance drone: Is the "security versus liberty" frame fruitful for assessing surveillance drones, or should a different normative frame be used?
- Robotics in general: Can we write a white paper on robotics regulation in Europe solely on the basis of the normative framework of human rights, or should we also embrace utilitarian and communitarian perspectives on robotics?

Human-rights theory
- Surveillance drone: Can the horizontal protection of privacy still be adequately regulated through the notion of positive state obligations, if anyone can buy surveillance drones off the shelf to spy on neighbours?
- Robotics in general: Can robots at some point become bearers of human (or fundamental) rights, and which criteria should we apply to make such an assessment?

Fundamental values (autonomy, human dignity, ...)
- Surveillance drone: Should we cling to 20th-century interpretations of autonomy if 21st-century society effectively becomes a Panopticon through, inter alia, surveillance drones?
- Robotics in general: How should we interpret the notion of "integrity of the person" if human-computer interfaces connect human brains to the Internet?
The challenge of mutual shaping (1)
• if law, technology, and normative outlooks co-evolve, how can we discuss regulation of emerging technologies?
– how do we know how technology is going to evolve? (cf. Collingridge dilemma)
– when should we adapt technology to the law, and when adapt the law to technology?
– how can we make normative judgements for future generations, who may have different values and outlooks?
• we can start with current technologies and short-term futures
– assume normative outlooks are stable
– focus on the mutual shaping of law and technology
– balance technology neutrality and legal certainty
– use Technology Assessment and Privacy/Regulatory Impact Assessment tools
– experiment with technology and/or law
cf. Mark Coeckelbergh, Human Being @ Risk: Enhancement, Technology, and the Evaluation of Vulnerability Transformations (Springer, 2013)
The challenge of mutual shaping (2)
how do we deal with the longer-term future?
• consider which normative outlooks we desire to be stable
– which fundamental values? which human rights? which interpretation of these?
• use our imagination
– design socio-technical scenarios
– discuss the normative 'look and feel' of these scenarios
– (science) fiction is an important tool to trigger our imagination
• 'smart regulation'
– flexible and self-learning
– anticipate, monitor, evaluate, adapt