
The Functional Morality of Robots

Author(s): Linda Johansson (Royal Institute of Technology, Sweden)
Copyright: 2012
Pages: 9
Source title: Ethical Impact of Technological Advancements and Applications in Society
Source Author(s)/Editor(s): Rocci Luppicini (University of Ottawa, Canada)
DOI: 10.4018/978-1-4666-1773-5.ch020


Abstract

It is often argued that a robot cannot be held morally responsible for its actions. The author suggests that the same criteria should be used for robots as for humans when ascribing moral responsibility. When deciding whether humans are moral agents, one should look at their behaviour and listen to the reasons they give for their judgments in order to determine whether they have understood the situation properly. The author suggests that the same should be done for robots. Accordingly, if a robot passes a moral version of the Turing Test, a Moral Turing Test (MTT), we should hold it morally responsible for its actions. This is supported by the impossibility of deciding who actually has semantic, rather than merely syntactic, understanding of a moral situation, and by two examples: the transfer of a human mind into a computer, and aliens who are actually robots.
