Psychology group raises concerns about youth social media use

Lawmakers have introduced multiple bipartisan bills directed at stopping online child exploitation, harmful eating disorder content and more.

A leading psychology group, the American Psychological Association, is renewing its warnings about the dangers of social media for kids and teens in a new report released Tuesday, pushing legislators and tech companies to fix design features it calls “inherently unsafe.”

The report says tech companies and legislators have made “few meaningful changes” since the APA issued a health advisory last year, warning about social media’s impact on adolescents’ social, educational, psychological, and neurological development.

The U.S. surgeon general says kids and teens who use social media for more than 3 hours a day face double the risk of mental health issues like anxiety or depression, and that teens spend an average of 3.5 hours a day on social media.

“The region that makes us very excited about getting attention from our peers develops first around 10, 11, 12 — unfortunately, the time when a lot of kids get a device,” Mitch Prinstein, the APA’s chief science officer, told Scripps News. “And the part that helps us resist following every impulse, kind of the brain’s brakes, that doesn't fully develop until our mid-20s.”

The report says features like infinite scrolling, likes, follower counts, and push notifications are “inherently unsafe” for kids and their developing brains. Because kids are so hypersensitive to praise and attention at that age, the APA says they’re more influenced by social feedback such as likes and follower counts, which studies have shown can make teens anxious and depressed and cause more emotional distress.

The APA also wrote that push notifications “capitalize” on kids' “sensitivity to distraction,” and can drastically impact the amount of sleep kids get during puberty.

“I can say that as long as the platforms are designed to increase kids' time spent on social media as a primary objective, that means that we're not spending enough time on making sure that what they see is appropriate [or] that how long they spend is still preserving the recommended number of hours of sleep for adaptive brain development,” Prinstein said.

While the report didn’t mention specific social media sites, Prinstein also urged companies to build time limits into their apps, and called for hateful and discriminatory content to be taken down from apps entirely.

“I am very concerned about folks who are part of the design of social media who now admit they won't let their own kids anywhere near it. I think that's a really important message for all parents to know and to prompt policymakers to work fast,” he said.

Lawmakers have introduced multiple bipartisan bills directed at stopping child exploitation online, from blocking harmful content about eating disorders and suicide, to allowing some victims of exploitation to sue companies in civil court.

In October, antitrust and tech scholar Tim Wu estimated Congress has held around 40 hearings on children and social media since 2017. Meanwhile, Politico reports Congress has passed only one online child safety law in the past decade, and that one was specific to sex trafficking.

Prinstein says it’s important to remember kids are learning how to interpret what they see all around them, including on social media. The report notes that just because a child is a certain age, that doesn’t mean they’re ready to be online. Prinstein recommends only letting children download apps once they "understand that not everything they see is real" and know "how to resist comparing themselves with others.”

The APA website has a list of ways to start the conversation about safe social media use with children. Prinstein says kids may even thank you for it. “We actually had a class of kids in their early 20s who said that they really wish their parents had not given them a device at the age of 12, even though they had asked for one,” he said. “[What] we're hearing from kids is that when parents do set these limits that actually provides some relief.”

Scripps News reached out to X, formerly known as Twitter; Meta, the parent company of Facebook, Instagram, and WhatsApp; and Snap, which runs Snapchat. Only Snap responded to our requests for comment, saying through a spokesperson that “the app doesn’t offer public comparison metrics when you talk with your friends and our content platform is moderated, which means we don’t allow unvetted content to reach lots of people.”
