As artificial intelligence becomes more intertwined with everyday life, researchers are exploring its potential to support mental health. A study published in Applied Psychology: Health and Well-Being found that venting to an AI chatbot reduces high-intensity negative emotions like anger and frustration. However, it does not foster a sense of social support or reduce loneliness, highlighting both the promise and limits of this technology.
AI chatbots are advanced software programs designed to engage in natural, human-like conversations. Powered by sophisticated language models, these systems analyze user inputs, understand context, and generate meaningful responses. Over the years, technological improvements have enhanced chatbots’ ability to mimic human interaction.
The rationale behind the new study was to explore how effectively AI chatbots could replicate the psychological benefits of traditional venting methods like journaling or talking to a confidant. While venting has been shown to help individuals process emotions, its effectiveness often hinges on receiving validation or constructive feedback—elements human interactions typically provide.
“I have always found talking about your frustrations with someone who listens and validates your feelings to be very comforting. However, with the increasing prevalence of loneliness across age groups, many individuals may lack access to a trusted, non-judgemental person to talk to,” said study author Meilan Hu, a psychology PhD candidate at Singapore Management University.
“Simultaneously, AI chatbots have become more advanced such that they are capable of providing human-like responses. This made me wonder if they could serve as an alternative option for individuals to help process their emotions. After all, having an additional ‘support system’ that is available anytime and anywhere could be useful. Hence, this study served as an opportunity to explore whether AI chatbots could be a practical and effective way to reduce negative affect and improve emotional well-being.”
To compare the effectiveness of AI-assisted venting and traditional journaling, researchers recruited 150 university students in Singapore. The study employed a within-subject design, meaning all participants experienced both conditions. Participants were randomly assigned to either the AI-assisted venting or traditional journaling condition during the first session, with the conditions reversed in the second session after a one-week interval to avoid carryover effects.
In the traditional venting condition, participants were asked to write about a recent significant irritation or inconvenience in a Word document for 10 minutes. In the AI-assisted venting condition, participants described their negative experiences to an AI chatbot, designed to simulate a dynamic and empathetic conversation. Participants were instructed to interact with the chatbot as if texting a friend, engaging in back-and-forth dialogue.
After completing the venting activity in each session, participants filled out surveys assessing negative emotions, perceived stress, feelings of loneliness, and perceived social support. Attention checks were embedded in the surveys to ensure the reliability of responses.
The study found that venting to the AI chatbot reduced high- and medium-arousal negative emotions compared to traditional journaling. Participants reported feeling less anger and frustration after interacting with the chatbot. The researchers attributed this to the chatbot’s ability to provide real-time, personalized responses, which may have helped participants feel validated and encouraged open expression of emotions.
“Should you ever find yourself in need of a listening ear, AI chatbots may serve as a viable option,” Hu told PsyPost. “While they may not be able to replace the depth of connection you receive from human interactions, our findings still show that venting to an AI chatbot may effectively alleviate feelings like anger or fear. This makes AI chatbots a valuable tool for providing temporary emotional relief, especially in moments when you just need someone (or something) to talk to.”
However, the study revealed no significant difference between AI-assisted venting and traditional journaling in reducing low-arousal negative emotions, such as sadness. This could be because participants tended to recall high-intensity emotional experiences during the venting process, leaving low-arousal emotions less impacted.
While the AI-assisted venting was effective in managing negative emotions, it did not lead to improvements in perceived social support or reductions in loneliness. Participants likely recognized that the AI chatbot, despite its conversational abilities, was not a real person. This awareness may have diminished the sense of connection typically associated with social support from human interaction.
“I was rather surprised to find that AI chatbots did not significantly increase users’ perceived social support or decrease their feelings of loneliness,” Hu said. “This might be because users were ultimately aware that they were interacting with an inanimate entity, which may have limited their sense of emotional connection. This highlights a potential area for future research, to seek ways to make these interactions more genuine and meaningful.”
The study highlights the promising role of AI-assisted venting in reducing certain types of negative emotions. However, Hu noted that “the long-term benefits of AI-assisted venting remain unexplored due to the study’s short time frame. Future research should explore whether the benefits of AI-assisted venting will endure over time. That said, our findings still offer a meaningful first step in understanding how AI chatbots may serve as a valuable tool for individuals in improving their overall emotional well-being.”
Future research could address these limitations by incorporating more diverse samples, longer interaction periods, and comparisons between AI and human support. Exploring how AI chatbots can engage in other emotional processes, such as fostering gratitude or promoting self-compassion, could also provide a more comprehensive understanding of their potential benefits.
“I would like to explore how AI chatbots can support people in other aspects of their lives, such as fostering gratitude or encouraging self-compassion,” Hu said. “Furthermore, with AI chatbots being used for more personal interactions – e.g. a virtual companion or even a romantic partner – I would want to understand the broader implications of such relationships.”
“As AI becomes increasingly integrated into our daily lives, I believe it is crucial to understand how these tools can be responsibly used and designed in order to maximise their benefits while minimising their potential negative impacts.”
“As AI becomes more advanced and widespread, I believe it is crucial to prioritise research that explores its implications,” Hu explained. “This may help us better understand how to use AI responsibly and design it in a way that benefits users. By doing so, we can be better prepared for a future where AI may continue to play an even bigger role in our lives.”
The study, “AI as your ally: The effects of AI-assisted venting on negative affect and perceived social support,” was authored by Meilan Hu, Xavier Cheng Wee Chua, Shu Fen Diong, K. T. A. Sandeeshwara Kasturiratna, Nadyanna M. Majeed, and Andree Hartanto.