When regular hyperbole isn't sufficient, writers often refer to new technologies as The Holy Grail of something or other. As I pointed out in my post on Chatbot Ethics, this has some important ethical implications.
Because in the mediaeval Parsifal legend, at a key moment in the story, our hero fails to ask the critical question: Whom Does the Grail Serve? And when technologists enthuse about the latest inventions, they typically overlook the same question: Whom Does the Technology Serve?
In a new article on driverless cars, Dr Ashley Nunes of MIT argues
that academics have allowed themselves to be distracted by versions of
the Trolley Problem (Whom Shall the Vehicle Kill?) and have neglected some much more important ethical questions.
For one thing, Nunes argues that so-called autonomous vehicles are never going to be
fully autonomous. There will always be ways of controlling cars
remotely, so the idea of a lone robot wrestling with an ethical
dilemma is just philosophical science fiction. Last year, he told Jesse Dunietz that he had yet to find a safety-critical transport system without real-time human oversight.
And in any case, road safety is never
about one car at a time: it is about deconfliction, which means cars avoiding each other as well as pedestrians. With
human driving, there are multiple deconfliction mechanisms that allow many vehicles to occupy the same space without hitting each other. These include traffic
signals, road markings and other conventions indicating right
of way, as well as signals (including honking and flashing lights) by which drivers negotiate with one another, or show that they
are willing to wait for a pedestrian to cross the road in front of them.
Equivalent mechanisms will be required to enable so-called autonomous
vehicles to provide a degree of transparency of intention, and therefore
trust. (See Matthews et al; see also Applin and Fischer.) See also my post on the Ethics of Interoperability.
But according to Nunes, "the most important
question that we should be asking about this technology" is "Who stands to
gain from its life-saving potential?" Because "if those who most need it don’t have access, whose lives would we actually be saving?"
In other words, Whom Does The Grail Serve?
Sally Applin and Michael Fischer, Applied Agency: Resolving Multiplexed Communication in Automobiles (Adjunct Proceedings of the 4th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '12), October 17–19, 2012, Portsmouth, NH, USA) HT @AnthroPunk
Rachel Coldicutt, Tech ethics, who are they good for? (8 June 2018)
Jesse Dunietz, Despite Advances in Self-Driving Technology, Full Automation Remains Elusive (Undark, 22 November 2018) HT @SafeSelfDrive
Ashley Nunes, Driverless cars: researchers have made a wrong turn (Nature Briefing, 8 May 2019) HT @vdignum @HumanDriving
Milecia Matthews, Girish Chowdhary and Emily Kieson, Intent Communication between Autonomous Vehicles and Pedestrians (2017)

Wikipedia: Trolley Problem

Related posts: For Whom (November 2006), Towards Chatbot Ethics - Whom Does the Chatbot Serve? (May 2019), Ethics of Interoperability (May 2019), The Road Less Travelled - Whom Does the Algorithm Serve? (June 2019), Jaywalking (November 2019)