X to establish content moderation HQ to combat child exploitation

An X executive said the new 100-person team will be based in Austin, Texas, and will focus primarily on content moderation and safety on the platform.

Elon Musk's social media platform X — formerly Twitter — is planning to create a new content moderation headquarters to police potentially harmful content shared on the site.

Joe Benarroch, the head of business operations at X, told Bloomberg that the company is aiming to hire 100 full-time employees by the end of the year for a "Trust and Safety center of excellence" that will be based in Austin, Texas. He said the group will focus primarily on preventing child sexual exploitation on the platform, but will also help enforce the company's rules and restrictions, including those related to hate speech and violence.

The timing of the announcement is notable, as it comes just days before X CEO Linda Yaccarino is due to appear before the Senate Judiciary Committee to discuss "Big Tech and the Online Child Sexual Exploitation Crisis." The heads of TikTok, Snapchat, Meta, and Discord are also expected to testify.

In an updated blog post, X said it has a zero-tolerance policy for child sexual exploitation on its platform and is continually working to improve how harmful content is detected and reported. The company added that it suspended more than 12 million accounts last year for violating its child sexual exploitation policies, up from 2.3 million accounts in 2022.

"While X is not the platform of choice for children and minors — users between 13-17 accounts for less than 1% of our U.S. daily users — we have made it more difficult for bad actors to share or engage with CSE (child sexual exploitation) material on X, while simultaneously making it simpler for our users to report CSE content," the company stated. "In 2024, we will continue to share detailed updates about our investment in this area." 

Since Musk acquired the company in 2022 for $44 billion, X has faced unsteady finances and seen its estimated valuation slashed multiple times. Yaccarino was hired last spring to oversee business operations, but she immediately faced the daunting task of attracting users and advertisers to a site that had shed much of its workforce and rolled back many of its content moderation policies, raising advertisers' concerns that hateful speech on the platform could harm their brands.

While it remains unclear when X will open its new content moderation center in Texas, a job posting for a moderator position based in Austin says the person will be tasked with investigating issues that "may cause harm" to users or the company and "combating spam and fraud" on the platform.