Of Souls, Spirits and Ghosts: Transposing the Application of the Rules of Targeting to Lethal Autonomous Robots

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

The article addresses how the rules of targeting regulate lethal autonomous robots. Since the rules of targeting are addressed to human decision-makers, there is a need for clarification of what qualities lethal autonomous robots would need to possess in order to approximate human decision-making and to apply these rules to battlefield scenarios. The article additionally analyses state practice in order to propose how the degree of certainty required by the principle
of distinction may be translated into a numerical value. The reliability rate with which lethal autonomous robots need to function is identified. The article then analyses whether the employment of three categories of robots complies with the rules of targeting. The first category covers robots which work on a fixed algorithm. The second category pertains to robots that have artificial intelligence and that learn from the experience of being exposed to battlefield scenarios. The third category relates to robots that emulate the working of a human brain.
Original language: English
Pages (from-to): 2-58
Number of pages: 56
Journal: Melbourne Journal of International Law
Volume: 16
Issue number: 1
Publication status: Published - 1 Jun 2015
Externally published: Yes


Keywords

  • lethal autonomous robots, rules of targeting, international humanitarian law

Cite this

@article{9e142b8a5b4548c4891eabed6052cdce,
title = "Of Souls, Spirits and Ghosts: Transposing the Application of the Rules of Targeting to Lethal Autonomous Robots",
abstract = "The article addresses how the rules of targeting regulate lethal autonomous robots. Since the rules of targeting are addressed to human decision-makers, there is a need for clarification of what qualities lethal autonomous robots would need to possess in order to approximate human decision-making and to apply these rules to battlefield scenarios. The article additionally analyses state practice in order to propose how the degree of certainty required by the principle of distinction may be translated into a numerical value. The reliability rate with which lethal autonomous robots need to function is identified. The article then analyses whether the employment of three categories of robots complies with the rules of targeting. The first category covers robots which work on a fixed algorithm. The second category pertains to robots that have artificial intelligence and that learn from the experience of being exposed to battlefield scenarios. The third category relates to robots that emulate the working of a human brain.",
keywords = "lethal autonomous robots, rules of targeting, international humanitarian law",
author = "Tetyana Krupiy",
year = "2015",
month = jun,
day = "1",
language = "English",
volume = "16",
pages = "2--58",
journal = "Melbourne Journal of International Law",
number = "1",

}

Of Souls, Spirits and Ghosts: Transposing the Application of the Rules of Targeting to Lethal Autonomous Robots. / Krupiy, Tetyana.

In: Melbourne Journal of International Law, Vol. 16, No. 1, 01.06.2015, p. 2-58.


TY - JOUR

T1 - Of Souls, Spirits and Ghosts: Transposing the Application of the Rules of Targeting to Lethal Autonomous Robots

AU - Krupiy, Tetyana

PY - 2015/6/1

Y1 - 2015/6/1

N2 - The article addresses how the rules of targeting regulate lethal autonomous robots. Since the rules of targeting are addressed to human decision-makers, there is a need for clarification of what qualities lethal autonomous robots would need to possess in order to approximate human decision-making and to apply these rules to battlefield scenarios. The article additionally analyses state practice in order to propose how the degree of certainty required by the principle of distinction may be translated into a numerical value. The reliability rate with which lethal autonomous robots need to function is identified. The article then analyses whether the employment of three categories of robots complies with the rules of targeting. The first category covers robots which work on a fixed algorithm. The second category pertains to robots that have artificial intelligence and that learn from the experience of being exposed to battlefield scenarios. The third category relates to robots that emulate the working of a human brain.

AB - The article addresses how the rules of targeting regulate lethal autonomous robots. Since the rules of targeting are addressed to human decision-makers, there is a need for clarification of what qualities lethal autonomous robots would need to possess in order to approximate human decision-making and to apply these rules to battlefield scenarios. The article additionally analyses state practice in order to propose how the degree of certainty required by the principle of distinction may be translated into a numerical value. The reliability rate with which lethal autonomous robots need to function is identified. The article then analyses whether the employment of three categories of robots complies with the rules of targeting. The first category covers robots which work on a fixed algorithm. The second category pertains to robots that have artificial intelligence and that learn from the experience of being exposed to battlefield scenarios. The third category relates to robots that emulate the working of a human brain.

KW - lethal autonomous robots

KW - rules of targeting

KW - international humanitarian law

M3 - Article

VL - 16

SP - 2

EP - 58

JO - Melbourne Journal of International Law

JF - Melbourne Journal of International Law

IS - 1

ER -