“Well, according to De Montfort University’s Kathleen Richardson, Professor of Ethics and Culture of Robots and AI, such robots will dehumanize women ‘and will influence the way a new generation of men treat and respond to women….’”*
“Oh! That’s not good. We already have a culture that dehumanizes women—and to a certain extent, men or any version of the human species that finds a category for itself. I guess this isn’t really a new thought, and I probably knew the answer to my question even before I asked it. Here’s that article on Richardson, and I see that she also says ‘human empathy will be eroded….’ That makes sense, but do we know that for sure? Or is it one of those this-is-the-way-we-believe-the-human-psyche-works principles? You know, a principle based as much on cultural standards as it is on science. Wouldn’t we have to wait to see the influence of a number of human-robot relationships on the psyches of men? Or of women, if someone builds male robots for intimate relationships? I see in the article that when Professor Richardson was at University College London, she did postdoctoral work ‘into the therapeutic uses of robots for children with autism spectrum conditions.’ That makes me think of another….”
“Hold that thought. I think we have a good idea that dehumanizing has long been a human trait or practice, and its expression in art and literature is as old as fertility figurines from thousands of years ago. People didn’t have to wait for Lord of the Flies or science fiction works like Blade Runner or other twentieth- and twenty-first-century stories for examples of dehumanizing, objectifying, and commodifying. We’ve been dehumanizing one another since we began to walk upright. Think past and present slavery. Think massacres. Think dictators like Stalin, Pol Pot, Hitler, and too many others to mention. Robots? Just a difference in degree, certainly not in kind. People have always treated other people as though they were as expendable and interchangeable as the parts of cars, parts for which one has no permanent feelings. But all this talk raises another question: Although we seem to have some sense of what it means to objectify and dehumanize a human, do we know what it means to ‘humanize’? Now, what was it you were going to say?”
“I find it interesting that her study of the ‘therapeutic uses of robots for children with autism spectrum conditions’ isn’t itself another example of ‘dehumanizing,’ especially a ‘dehumanizing’ of the autistic children in her study. So, if I understand her point, which she made for a Channel 4 documentary and which she is apparently going to make in a report she is writing for the British Government and for the European Parliament, as well as for other political bodies, robots will destroy human empathy, yet robots are good for instilling empathy in autistic children. Am I missing something here?”
“No, you have a good point. She seems to condemn the use of robots for one kind of intimacy but suggests it for another kind. Gosh, this being an empathetic ‘humanizing’ human is so confusing. I wonder whether Professor Richardson might consider doing an experiment with some dehumanized adults to see whether she could increase empathy through the use of the very robots she claims destroy empathy. Even if she got negative results, she would have some science to back up her assumptions about the effects of robots. Aren’t we running similar, real-time experiments on the dehumanizing effects of smartphones and tablets that enable individuals to text rather than talk face-to-face? I think I would like more information on empathy itself. I see that there are two kinds according to the experts: situational empathy and dispositional empathy. Would Richardson or any other person truly know when empathy has been turned off or on? Don’t we just rely on what we think about the empathy of children, for example? Aren’t we just interpreting their level of empathy the way we look at a pet dog and assume that it has anthropic feelings? Do we just ask people how they feel about substitute humans (robots)? And if we do ask them, can we trust that they are truly reporting on themselves?”
“Are you empathetic? Willing to take an empathy test like Hogan’s, Mehrabian and Epstein’s, or Davis’s? The tests aren’t hard. They’re physically noninvasive, just a series of questions. And if you accept the definition of empathy given by Mehrabian and Epstein as ‘a vicarious response to the perceived emotional experiences of others,’ would you be able to tell whether you or someone else has ‘real’ empathy for a robot-person even after a great date? And if someone did show empathy for a robot, what would that indicate about the person’s relationship—that it was humanizing and not dehumanizing?”
“Wow! You’ve given me something to think about.”
*“Channel 4 documentary to feature DMU professor’s call for ban on sex robots.” De Montfort University. Online at http://www.dmu.ac.uk/research/research-news/2017/november/channel-4-documentary-to-feature-dmu-professors-call-for-ban-on-sex-robots.aspx