Facebook Ramps Up Efforts To Stop Spread Of COVID-19 Misinformation

The social media company plans to alert users if they liked, reacted to, or commented on debunked content.

Facebook announced Thursday it's rolling out a new policy to stop the spread of misinformation about the coronavirus pandemic.

The social media giant says it will place a message in users' news feeds if they liked, reacted to, or commented on debunked content. The message will direct them to the World Health Organization's "myth busters" website. Until this update, only users who shared misinformation were notified.

A blog post from the company's Vice President for Integrity said directing Facebook and Instagram users to information from credible health authorities is only "half the challenge." He said the company works with more than 60 fact-checking organizations to review content and remove false information. It has also given grants to 13 fact-checking organizations.

Additionally, Facebook recently banned all ads promising treatments or cures for COVID-19, since none have yet been approved by the FDA.

Facebook said users who have interacted with false content will start to see warnings "in the coming weeks."