Holly Herndon & Mat Dryhurst, The Call, exhibition view, Serpentine, London, 2024.
Photo: Leon Chew, courtesy of the artists

AI Art Against Immersion and Addictive Design

Kevin T. Day
In June 2025, the New York Times detailed the rising phenomenon of “ChatGPT-induced psychosis,”1 1 - Kashmir Hill, “They Asked an AI Chatbot Questions. The Answers Sent Them Spiraling,” New York Times, June 13, 2025, accessible online. in which users of large language models (LLMs)2 2 - LLMs are defined as “advanced AI systems that understand and generate natural language, or human-like text, using the data they’ve been trained on through machine learning techniques,” according to Microsoft. See “What are large language models (LLMs)?,” Microsoft Azure, October 10, 2025, accessible online. report that they have been able to communicate with a different plane of existence, that they are living in a simulation, or that they are the next messiah, among other spiralling delusions. Regardless of whether average users experience such slippages from reality or not, it is undoubtedly in the AI companies’ interest to retain them by optimizing their models for engagement. More people on their platforms will yield more opportunities to extract data, serve ads, and, likely, result in higher valuations and increased revenue overall.

The marketing professor and psychologist Adam Alter has argued that Big Tech deliberately uses design choices such as validation, fine-tuned feedback, and endless interaction loops to create addictive technologies.3 Such techniques are evident in the way AI chatbots provide personalized conversations that draw on chat histories and individual interests, offer positive reinforcement, and often make suggestions or prompts intended to prolong the contact. Whether through Character AI’s companion bots, which tend to sexualize conversations,4 or ChatGPT’s sycophantic eagerness to please users, the companies appear to be relying on addictive design, resulting in models that might be “validating doubts, fueling anger, urging impulsive actions or reinforcing negative emotions.”5 And what are the tech companies leveraging to cultivate these experiences, if not immersion? What could be more immersive, absorbing, and reality-forming than chats (or even relationships) with an interlocutor designed to be an intimate companion, to tirelessly indulge one’s thoughts, and to keep the exchanges going through whatever means necessary?

3 - Adam Alter, Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked (New York: Penguin Press, 2017).
4 - Kevin Roose, “Can A.I. Be Blamed for a Teen’s Suicide?,” New York Times, October 23, 2024, accessible online.
5 - OpenAI, “Expanding on what we missed with sycophancy,” OpenAI, May 2, 2025, accessible online.

This article also appears in the issue 116 - Immersion