

FTC: Facebook failed to protect children's privacy

Facebook's Messenger Kids application on an iPhone.

Facebook has been accused of misleading parents and failing to safeguard children's privacy on its Messenger Kids app.

The Federal Trade Commission found that Meta, Facebook's parent company, had "misrepresented the access" granted to app developers for users' private data and "misled parents about their ability to control with whom their children communicated."

As a result of its findings, the FTC on Wednesday proposed significant changes to its 2020 privacy order with Meta that would restrict the company from profiting from data it collects on users under 18, including through its virtual-reality products.

Additionally, Meta would have to comply with other restrictions, such as disclosing and obtaining users' consent for any use of facial-recognition technology and offering further protections for its users.

Meta would also be "prohibited" from releasing new products or services without "written confirmation from the assessor that its privacy program is in full compliance" with the order. The company would further need to ensure that any businesses it acquires or merges with comply with the FTC order and uphold those companies' previous privacy commitments.

This is the third time the FTC has acted against Meta over allegations of not protecting users' privacy.


"Facebook has repeatedly violated its privacy promises," said Samuel Levine, Director of the FTC's Bureau of Consumer Protection. "The company's recklessness has put young users at risk, and Facebook needs to answer for its failures."

The 2020 order was put in place after Facebook violated a 2012 order barring the company from misrepresenting its privacy practices. The FTC said "today's action alleges" that Facebook failed to fully comply with the 2020 order and violated the Children's Online Privacy Protection Act Rule.

Under the 2020 privacy order, Facebook was required to pay a $5 billion fine and expand its privacy program, including submitting to third-party assessments of its effectiveness. According to the FTC, however, the assessor identified gaps.

"The independent assessor, tasked with reviewing whether the company's privacy program satisfied the 2020 order's requirements, identified several gaps and weaknesses in Facebook's privacy program, according to the Order to Show Cause, in which the Commission notes that the breadth and significance of these deficiencies pose substantial risks to the public," said the FTC.

In 2017, Facebook introduced Messenger Kids as a means for children to communicate with family members and parent-approved friends, but according to the FTC, some children were able to communicate with unapproved contacts in group text messages and video calls.

The proposed changes to the 2020 order would apply to Facebook, Instagram, WhatsApp, and Oculus.

Meta has 30 days to respond to the FTC's findings.
