What if doctors could objectively measure your mental state?

Dr Rana el-Kaliouby is one of the pioneers of affective computing, and the founder of a company called Affectiva. Some of her early work was building apps that helped autistic people to read expressions. She now argues that artificial emotional intelligence is key to building reciprocal trust between humans and AI.
Affectiva competes with some of the big tech companies (including Amazon, IBM and Microsoft), which now offer "emotional analysis" or "sentiment analysis" alongside facial recognition.
One proposed use of this technology is in the classroom. The idea is to install a webcam in the classroom: the system watches the students, monitors their emotional state, and gives feedback to the teacher in order to maximize student engagement. (For example, Mark Lieberman reports a university trial in Minnesota, based on the Microsoft system. Lieberman includes some sceptical voices in his report, and the trial is discussed further in the 2018 AI Now report.)
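To make the classroom idea concrete, here is a minimal sketch of what such a feedback loop might compute. Everything in it is hypothetical: the real products expose their own APIs and proprietary engagement metrics, and the notion that happiness or surprise equals engagement is a deliberately crude stand-in.

```python
# A minimal sketch of the classroom feedback loop described above.
# Hypothetical throughout: real systems (Microsoft, Affectiva, etc.)
# define their own labels and engagement measures.

from collections import Counter

# Crude stand-in: treat these emotion labels as signs of engagement.
ENGAGED = {"happiness", "surprise"}

def engagement_score(frame_labels):
    """Fraction of detected faces in one webcam frame showing an
    'engaged' emotion label."""
    if not frame_labels:
        return 0.0
    counts = Counter(frame_labels)
    return sum(counts[label] for label in ENGAGED) / len(frame_labels)

# One label per detected face, as an emotion-recognition API might
# return them for a single frame of classroom video:
frame = ["happiness", "sadness", "surprise", "anger"]
print(f"Engagement: {engagement_score(frame):.0%}")  # Engagement: 50%
```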
So how do such systems work? The computer is trained to recognize a "happy" face by being shown large numbers of images of happy faces. This depends on a team of human coders labelling the images.
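As a rough illustration of that training step, here is a supervised-learning sketch using scikit-learn. The random arrays stand in for flattened face images and the hand-applied labels; a real system would use deep neural networks trained on far larger labelled datasets.

```python
# A toy version of the supervised pipeline described above: human-coded
# labels in, a "happy"-face classifier out. Illustrative only.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Stand-in data: 200 fake 48x48 grayscale faces flattened to vectors,
# each hand-labelled by human coders as "happy" (1) or "not happy" (0).
X_train = rng.random((200, 48 * 48))
y_train = rng.integers(0, 2, 200)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# At run time, the system scores each new face it sees:
new_face = rng.random((1, 48 * 48))
print("happy" if clf.predict(new_face)[0] else "not happy")
```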
And this coding generally relies on a "classical" theory of emotions. Much of this work is credited to a research psychologist called Paul Ekman, who developed the Facial Action Coding System (FACS). Most of these programs use a version called EMFACS, which detects six or seven supposedly universal emotions: anger, contempt, disgust, fear, happiness, sadness and surprise. The idea is that because these emotions are "hardwired", they can be detected by observing facial muscle movements.
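The EMFACS idea can be caricatured in a few lines of code: detect which facial action units (AUs) are active, then match them against a prototype combination for each "universal" emotion. The AU combinations below follow commonly cited prototypes (happiness as AU6 cheek raiser plus AU12 lip corner puller, and so on), but treat the whole thing as illustrative rather than a faithful reproduction of any vendor's system.

```python
# A toy rendering of the EMFACS mapping from facial action units (AUs)
# to "universal" emotion labels. The prototypes below are the commonly
# cited combinations; real implementations are far more nuanced.

EMFACS_PROTOTYPES = {
    "happiness": {6, 12},
    "sadness":   {1, 4, 15},
    "surprise":  {1, 2, 5, 26},
    "fear":      {1, 2, 4, 5, 7, 20, 26},
    "anger":     {4, 5, 7, 23},
    "disgust":   {9, 15, 17},
    "contempt":  {12, 14},   # unilateral in the full scheme
}

def label_emotion(detected_aus: set[int]) -> str:
    """Return the first prototype whose AUs are all present, if any."""
    for emotion, aus in EMFACS_PROTOTYPES.items():
        if aus <= detected_aus:
            return emotion
    return "neutral"

print(label_emotion({6, 12, 25}))   # happiness
```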
Lisa Feldman Barrett, one of the leading critics of the classical theory, argues that emotions are more complicated, and are a product of one's upbringing and environment.
"Emotions are real, but not in the objective sense that molecules or neurons are real. They are real in the same sense that money is real – that is, hardly an illusion, but a product of human agreement."
It has also been observed that people from different parts of the world, or from different ethnic groups, express emotions differently. (Who knew?) Algorithms that fail to deal with ethnic diversity may be grossly inaccurate and set people up for racial discrimination. For example, in a recent study of two facial recognition software products, one product consistently interpreted black sportsmen as angrier than white sportsmen, while the other labelled the black subjects as contemptuous.
But Affectiva prides itself on dealing with ethnic diversity. When Rana el-Kaliouby spoke to Oscar Schwartz recently, while acknowledging that the technology is not foolproof, she insisted on the importance of collecting "diverse data sets" in order to compile "ethnically based benchmarks ... codified assumptions about how an emotion is expressed within different ethnic cultures". In her most recent video, she also insisted on the importance of diversity of the team building these systems.
Shoshana Zuboff describes sentiment analysis as yet another example of the behavioural surplus that helps Big Tech accumulate what she calls surveillance capital.
Zuboff relies heavily on a long interview with el-Kaliouby in the New Yorker in 2015, where she expressed optimism about the potential of this technology, not only to read emotions but to affect them.

"Your unconscious - where feelings form before there are words to express them - must be recast as simply one more source of raw-material supply for machine rendition and analysis, all of it for the sake of more-perfect prediction. ... This complex of machine intelligence is trained to isolate, capture, and render the most subtle and intimate behaviors, from an inadvertent blink to a jaw that slackens in surprise for a fraction of a second." (Zuboff 2019, pp 282-283)
In her talk last month, without explicitly mentioning Zuboff's book, el-Kaliouby put a strong emphasis on the ethical values of Affectiva, explaining that they have turned down offers of funding from security, surveillance and lie detection, to concentrate on such areas as safety and mental health. I wonder if IBM ("Principles for the Cognitive Era") and Microsoft ("The Future Computed: Artificial Intelligence and its Role in Society") will take the same position.

"I do believe that if we have information about your emotional experiences we can help you be in a more positive mood and influence your wellness."
HT @scarschwartz @raffiwriter
AI Now Report 2018 (AI Now Institute, December 2018)
Bernd Bösel and Serjoscha Wiemer (eds), Affective Transformations: Politics—Algorithms—Media (Meson Press, 2020)
Hannah Devlin, AI systems claiming to 'read' emotions pose discrimination risks (Guardian, 16 February 2020)
Rana el-Kaliouby, Teaching Machines to Feel (Bloomberg via YouTube, 20 September 2017), Emotional Intelligence (New York Times via YouTube, 6 March 2019)
Lisa Feldman Barrett, Psychological Construction: The Darwinian Approach to the Science of Emotion (Emotion Review Vol. 5, No. 4, October 2013) pp 379-389
Douglas Heaven, Why faces don't always tell the truth about feelings (Nature, 26 February 2020)
Raffi Khatchadourian, We Know How You Feel (New Yorker, 19 January 2015)
Mark Lieberman, Sentiment Analysis Allows Instructors to Shape Course Content around Students’ Emotions (Inside Higher Education, 20 February 2018)
Lauren Rhue, Racial Influence on Automated Perceptions of Emotions (9 November 2018) http://dx.doi.org/10.2139/ssrn.3281765
Oscar Schwartz, Don’t look now: why you should be worried about machines reading your emotions (Guardian, 6 March 2019)
Shoshana Zuboff, The Age of Surveillance Capitalism (UK Edition: Profile Books, 2019)
Wikipedia: Facial Action Coding System
Related posts: Linking Facial Expressions (September 2009), Data and Intelligence Principles from Major Players (June 2018), Shoshana Zuboff on Surveillance Capitalism (February 2019), Listening for Trouble (June 2019)
Links added February 2020