As followers of the Parsifal legend will know, at a critical point in the story Parsifal fails to ask the one question that matters: "Whom does the Grail serve?"
And anyone who wishes to hype chatbots as some kind of "holy grail" must also ask the same question: "Whom does the Chatbot serve?" IBM puts this at the top of its list of ethical questions for chatbots, as does @ashevat (formerly with Slack).
To the extent that a chatbot is providing information and advice, it is subject to many of the same ethical considerations as any other information source - is the information complete, truthful and unbiased, or does it serve the information provider's commercial interest? Perhaps the chatbot (or rather its owner) is getting a commission if you eat at the recommended restaurant, just as hotel concierges have always done. A restaurant review in an online or traditional newspaper may appear to be independent, but restaurants have many ways of rewarding favourable reviews even without cash changing hands. At the very least, you might think ethics requires such arrangements to be transparent.
But an important difference between a chatbot and a newspaper article is that the chatbot has a greater ability to respond to the particular concerns and vulnerabilities of the user. Shiva Bhaskar discusses how this power can be used for manipulation and even intimidation. And making sure the user knows that they are talking to a bot rather than a human does not guard against an emotional reaction: Joseph Weizenbaum was one of the first in the modern era to recognize this.
One area where particularly careful ethical scrutiny is required is the use of chatbots for mental health support. Obviously there are concerns about efficacy and safety as well as privacy, and such systems need to undergo clinical trials for efficacy and potential adverse outcomes, just like any other medical intervention. Kira Kretzschmar et al argue that it is also essential that these platforms are specifically programmed to discourage over-reliance, and that users are encouraged to seek human support in the case of an emergency.
Another ethical problem with chatbots is related to the Weasley doctrine (named after Arthur Weasley in Harry Potter and the Chamber of Secrets):
"Never trust anything that can think for itself if you can't see where it keeps its brain."Many people have installed these curious cylindrical devices in their homes, but is that where the intelligence is actually located? When a private conversation was accidentally transmitted from Portland to Seattle, engineers at Amazon were able to inspect the logs, coming up with a somewhat implausible explanation as to how this might have occurred. Obviously this implies a lack of boundaries between the device and the manufacturer. And as @geoffreyfowler reports, chatbots don't only send recordings of your voice back to Master Control, they also send status reports from all your other connected devices.
Smart home, huh? Smart for whom? Transparency for whom? Or to put it another way, whom does the chatbot serve?
Shiva Bhaskar, The Chatbots That Will Manipulate Us (30 June 2017)
Geoffrey A. Fowler, Alexa has been eavesdropping on you this whole time (Washington Post, 6 May 2019) HT @hypervisible
Sidney Fussell, Behind Every Robot Is a Human (The Atlantic, 15 April 2019)
Tim Harford, Can a computer fool you into thinking it is human? (BBC 25 September 2019)
Gary Horcher, Woman says her Amazon device recorded private conversation, sent it out to random contact (25 May 2018)
Kira Kretzschmar et al, Can Your Phone Be Your Therapist? Young People’s Ethical Perspectives on the Use of Fully Automated Conversational Agents (Chatbots) in Mental Health Support (Biomed Inform Insights, 11, 5 March 2019)
Trips Reddy, The code of ethics for AI and chatbots that every brand should follow (IBM, 15 October 2017)
Amir Shevat, Hard questions about bot ethics (Slack Platform Blog, 12 October 2016)
Tom Warren, Amazon explains how Alexa recorded a private conversation and sent it to another user (The Verge, 24 May 2018)
Joseph Weizenbaum, Computer Power and Human Reason (WH Freeman, 1976)
Related posts: Whom does the technology serve? (May 2019), The Road Less Travelled (June 2019)
updated 4 October 2019