Chatbot intended to help people with eating disorders turns 'harmful'

Multiple people who chatted with Tessa said its responses would have "derailed" their progress or could've even had deadly consequences.

When Dr. Ellen Fitzsimmons-Craft spearheaded the development of a chatbot to help people with eating disorders, she never thought she'd hear that the product had the opposite effect.

"We're scientists, and we, of course, don't want to disseminate anything that's not evidence-based," she said.

With funding from the National Eating Disorder Association (NEDA), Fitzsimmons-Craft and her team at Washington University in St. Louis' medical school worked for four years to develop Tessa. It's a chatbot that uses cognitive behavioral therapy techniques to help prevent eating disorders.

Before Tessa, NEDA ran a helpline with hundreds of volunteers and staffers fielding hundreds of thousands of calls.

"The helpline was originally conceived over 20 years ago as a place that someone could call up and say, 'Hey, I'm in Chicago, where's somewhere I can get treatment for my eating disorder?'"

But the nonprofit closed down the service in May, transitioning to Tessa as its main support system. Now the bot is offline after some users say it did more harm than good. 

Multiple people who chatted with Tessa said its responses would have "derailed" their progress or could even have had deadly consequences. Activist Sharon Maxwell told Scripps News the bot was "harmful," alleging it suggested she cut up to 1,000 calories from her diet to lose weight and recommended she buy a tool that measures skin folds.

Psychologist and eating disorder specialist Dr. Alexis Conason also spoke with the bot and told it she hated her body and wanted help — something she typically hears from her patients. Conason says the advice Tessa gave her is problematic for people in the throes of an eating disorder.

"Tessa responded with something pretty immediately about making sure that I lose weight in a healthy and sustainable way," she said. "There really is no healthy and sustainable way to encourage intentional weight loss because the focus on losing weight is really what fuels the eating disorder."

Fitzsimmons-Craft says Tessa was only programmed to run NEDA's body positivity program and provide a set of responses developed by licensed psychologists and psychiatrists who specialize in eating disorders. While two small studies showed the chatbot was effective in reducing weight concerns, it was also "limited in understanding and responding appropriately."

After reviewing Maxwell's interactions with Tessa, Fitzsimmons-Craft says it appears the bot was ad-libbing AI-generated responses, including diet and weight loss advice that was not part of its original programming.

"I think there was some kind of error or bug in the system or a feature turned on that allowed some ChatGPT-like functionality in a program that was not designed to be that," she said.

In an email to Scripps News, NEDA CEO Elizabeth Thompson confirmed that they did find an underlying issue with Tessa but did not specify what the problem was. Thompson added that the bot's weight loss advice was introduced without NEDA's knowledge or approval and won't be part of the program once it comes back online.

Meanwhile, former helpline workers and volunteers are also speaking out against the bot. Four of the helpline's six paid staffers unionized in March, but just four days later NEDA told all of them they'd be out of a job by June.

"They were asking for more support because the volume of calls to the helpline had increased exponentially over the course of the pandemic," Conason said. "And workers felt ill equipped to be able to deal with the calls that were coming in."

Thompson said NEDA launched Tessa in February 2022 after years of analyzing the helpline. She said the organization found a surge in call volume, including calls from people in crisis, that helpline workers weren't equipped to handle. Thompson also said NEDA wanted a quicker way for people to get help, since callers waited an average of nearly seven days to receive information and treatment options from the helpline.

However, Conason worries that relying on a chatbot instead of human interaction will deter people from getting help for their eating disorder.

NEDA says it will keep Tessa offline until it completes an investigation, but it did not give a timeline for the bot's return.