Facebook Papers: VP Warns Employees 'Brace For Bad Headlines'

Facebook had internal research that showed it only took two days for a new user to start seeing QAnon content if they looked at conservative news.

Facebook Vice President Nick Clegg warned employees to brace for more bad headlines in an internal memo obtained by Axios. Documents from whistleblower Frances Haugen are revealing more information about how the company handled problematic content on its platforms before and after last year's elections. 

Our technology correspondent Tyler Adkisson joins us from Chicago.

DAVE BRIGGS: Tyler, good morning. The leak contains tens of thousands of documents. What are the new takeaways?

TYLER ADKISSON: One of the biggest takeaways is that Facebook had internal research showing it took only two days for a new user who engaged with conservative news to start seeing QAnon content. In 2019, a researcher at Facebook created a fake account to test how the company's recommendation systems fed misinformation and polarizing content to users, and found it took just two days for the account to be steered toward extremist ideology.

The documents also show that Facebook's policies and procedures failed to stop the growth of the "stop the steal" groups, which contributed to the large crowds we saw at the Jan. 6 Capitol insurrection. According to the leaked documents, when the company rolled back its election safeguards, in part because of user backlash, more and more people were funneled into those groups.

And outside the U.S., there were a lot of issues with how Facebook handled content internationally. In India, for instance, where the company at one point had 340 million users, Facebook has really struggled to deal with misinformation and hate speech. The documents show that the safeguards put in place to keep hate speech from spreading there did not work effectively, and that only 13% of Facebook's budget for classifying misinformation went to users outside the U.S., leaving them exposed to far more of it.

BRIGGS: OK, Tyler. Recent reports indicate Facebook may be considering a name change. Any word on what they might change it to? And how is that perceived in the tech community? 

ADKISSON: I think the name change, not so much in the tech community but among other businesses, shows that Facebook is in a position similar to where Philip Morris was when it ended up changing its name. I don't think a new name will stop people from knowing what the company has been involved in, but effectively, what they're trying to do is distance themselves from all this bad news coming out. It's not clear exactly what the new name will be, but if you look at some of the domains Facebook owns right now, one possibility is Meta. If you go to Meta.com, it actually redirects you to a Facebook-owned site, Meta.org. And given that Facebook is trying to move into the metaverse, it's likely that could be the name. It should be revealed at some point this week, but it's not exactly clear when that's going to happen or what the name is going to be.

BRIGGS: You can actually bet on what they'll change their name to. There are real betting odds out, and "meta" and "verse" figure into them.