Friday, March 07, 2025

From ChatGPT to Anxiety

There has been a lot of attention paid to the ways in which algorithms may either promote or alleviate anxiety. A recent article in the journal Subjectivity, with the anxiety-laden title Left to their own devices, looked at mental health apps, and the journal editors were kind enough to publish my commentary on this article, exploring the subjectivity of devices.

In addition to understanding how humans may experience their interactions with algorithms, we can also ask how the algorithm experiences these interactions. This experience can be analysed not only at the cognitive level but also at the affective level. It turns out that loading a lot of stressful material into ChatGPT causes the algorithm to produce what looks like an anxious response.

The training of human therapists typically includes developing the ability to contain this kind of anxiety and to unload it safely later. Whether and how mental health apps can develop a similar ability is currently an open question, with important ethical implications. Meanwhile, there are some promising indications that an anxious chatbot response may be calmed by giving mindfulness exercises to the chatbot.

This certainly puts a new twist on the topic of the subjectivity of devices.

Ziv Ben-Zion et al, Assessing and alleviating state anxiety in large language models (npj Digital Medicine 8/132, 2025)

Kyle Chayka, The Age of Algorithmic Anxiety (New Yorker, July 25 2022)

Jesse Ruse, Ernst Schraube and Paul Rhodes, Left to their own devices: The significance of mental health apps on the construction of therapy and care (Subjectivity 2024)

Richard Veryard, On the Subjectivity of Devices (Subjectivity 2024), available at https://rdcu.be/d8PSt

Brandon Vigliarolo, Maybe cancel that ChatGPT therapy session – doesn't respond well to tales of trauma (The Register, 5 Mar 2025)
