This paper addresses the question of the moral standing of robots in order to make sense of two phenomena: that humans are able to empathize with robots, and the intuition that there might be something wrong with “abusing” robots. After reviewing relevant work in empirical psychology and discussing the ethics of empathizing with robots, it develops a philosophical argument concerning the moral standing of robots that questions distant and uncritical moral reasoning about entities’ properties. Instead, it recommends first trying to understand the issue by means of philosophical and artistic work that shows how ethics is always relational and historical, and that highlights the importance of language and appearance in moral reasoning and moral psychology. It is concluded that attention to relationality and to verbal and non-verbal languages of suffering is key to understanding the phenomenon under investigation, and that in robot ethics we need less certainty and more caution and patience when it comes to thinking about moral standing.
Bryson, Joanna. 2010. Robots Should Be Slaves. In Close Engagements with Artificial Companions: Key Social, Psychological, Ethical and Design Issues, ed. Y. Wilks, 63–74. Amsterdam: John Benjamins.
Coeckelbergh, Mark. 2010a. Moral Appearances: Emotions, Robots, and Human Morality. Ethics and Information Technology 12(3): 235–241.
Coeckelbergh, Mark. 2010b. Robot Rights? Towards a Social-Relational Justification of Moral Consideration. Ethics and Information Technology 12(3): 209–221.
Coeckelbergh, Mark. 2011a. Humans, Animals, and Robots: A Phenomenological Approach to Human-Robot Relations. Philosophy & Technology 24(3): 269–278.
Coeckelbergh, Mark. 2011b. You, Robot: On the Linguistic Construction of Artificial Others. AI & Society 26(1): 61–69.
Coeckelbergh, Mark. 2012. Growing Moral Relations: Critique of Moral Status Ascription. Basingstoke and New York: Palgrave Macmillan.
Coeckelbergh, Mark. 2014. The Moral Standing of Machines: Towards a Relational and Non-Cartesian Moral Hermeneutics. Philosophy & Technology 27(1): 61–77.
Coeckelbergh, Mark. 2017. Using Words and Things: Language and Philosophy of Technology. New York and Abingdon: Routledge.
Coeckelbergh, Mark and Gunkel, David. 2014. Facing Animals: A Relational, Other-Oriented Approach to Moral Standing. Journal of Agricultural and Environmental Ethics 27(5): 715–733.
Darling, Kate. 2017. ‘Who’s Johnny?’ Anthropomorphic Framing in Human-Robot Interaction, Integration, and Policy. In Robot Ethics 2.0, eds. P. Lin, G. Bekey, K. Abney, and R. Jenkins. Oxford: Oxford University Press.
Floridi, Luciano and Sanders, J.W. 2004. On the Morality of Artificial Agents. Minds and Machines 14(3): 349–379.
Gunkel, David. 2012. The Machine Question: Critical Perspectives on AI, Robots, and Ethics. Cambridge, MA: MIT Press.
Gunkel, David. 2017. The Other Question: Can and Should Robots Have Rights? Ethics and Information Technology (online first). DOI 10.1007/s10676-017-9442-4.
Johnson, Deborah G. 2006. Computer Systems: Moral Entities but Not Moral Agents. Ethics and Information Technology 8(4): 195–204.
Kant, Immanuel. 1997. Lectures on Ethics, eds. P. Heath and J.B. Schneewind. Trans. P. Heath. Cambridge: Cambridge University Press.
Kant, Immanuel. 2012. Lectures on Anthropology, eds. A.W. Wood and R.B. Louden. Cambridge: Cambridge University Press.
Rosenthal-von der Pütten, Astrid M., Krämer, Nicole C., Hoffmann, Laura, Sobieray, Sabrina, and Eimler, Sabrina C. 2013. An Experimental Study on Emotional Reactions Towards a Robot. International Journal of Social Robotics 5: 17–34.
Searle, John R. 1995. The Construction of Social Reality. London: The Penguin Press.
Suzuki et al. 2015. Measuring Empathy for Human and Robot Hand Pain Using Electroencephalography. Scientific Reports 5.
Whitby, Blay. 2008. Sometimes It’s Hard to Be a Robot: A Call for Action on the Ethics of Abusing Artificial Agents. Interacting with Computers 20: 338–341.
Wittgenstein, Ludwig. 1953. Philosophical Investigations, eds. and trans. P.M.S. Hacker and J. Schulte. Oxford: Wiley-Blackwell, 2009.