Social and ethics lessons from operating Military Drones and Robots


8/13/2019

The specific insight is that even in a highly digitised world that shields operators from direct physical action, psychological consequences still exist. The very idea of a robot or artificial intelligence fulfilling our duties is unsettling. While the technology is advancing at a fast pace, the moral and ethical burdens it carries have largely gone unexamined. As in many domains, the military is at the vanguard of

    future civilian issues. Scientists, legal experts and philosophers are now joining forces

    to scrutinise the promise of intelligent systems and wrangle over their implications:

after winning the quiz show Jeopardy!, IBM's supercomputer Watson will soon diagnose diseases, and also be used by health insurers to assess customers. But what are the implications for businesses and customers when they want to contest decisions made by "black boxes"? The ethics of this emerging paradigm, experienced in the extreme environment of the military, will ultimately need to be addressed by civilian organisations.

    @zeronomics

The battlefield in the Middle East teaches important lessons on the virtualisation of real-life situations: on the social bonds that humans develop with machines and, at the other end of the spectrum, on the impossibility of totally 'virtualising' an experience, since the link with the physical world always returns to remind us of its reality.

New research from the University of Washington explores the social bonds soldiers develop with their bomb-disposal robots in Iraq and Afghanistan. They often anthropomorphise the machines that help keep them alive, assigning them human attributes and even displaying empathy toward them, to the point of refusing replacement robots unless it is 'their' machine being repaired, or of holding funerals for their fallen brothers in robotic arms.

One such tribute recently took place in Iraq: it involved a 21-gun salute and the awarding of a Purple Heart and a Bronze Star Medal. 'He' was a MARCbot, an R2-D2-like robot designed to disarm explosives.

The other side of this "virtualisation" paradigm is the issue affecting drone operators (technically, drones are unmanned machines guided by operators, as opposed to fully automated robots) who are diagnosed with post-traumatic stress disorder despite never physically facing the battlefield. This runs counter to the initial concept of drone warfare, which assumed that

    combat's devastating psychological effects had been mitigated. Instead,

"moral injury" is now an accepted condition, one that shifts attention from the violence done to soldiers toward their feelings about what they have done to others. Yet 61% of Americans in the latest Pew survey support military drones because they won't risk US lives. Unfortunately, the impacts are serious: a 2011 survey found that 42% of operators reported moderate to high stress, and 20% reported emotional exhaustion or burnout.

http://strategicrisks.org/