Google and Facebook have responded to mounting criticism that they let fake news affect the presidential election.
On Monday, Google said it would ban sites that spread fake content from using its AdSense advertising network. Facebook responded by adding fake news to the list of things banned under its advertising policies.
Essentially, the goal is to stop fake news sites from making money through advertising clicks.
Fake stories may have spread more quickly than ever during this election. For instance, the false claim that Pope Francis endorsed Donald Trump was shared on Facebook nearly a million times.
The internet also spread false reports that Hillary Clinton spent $200 million on an estate in the Maldives and that Trump called Republicans the "dumbest group of voters" back in 1998.
Clinton's chief digital strategist put particular blame on Facebook for the election's outcome, arguing that Google's ranked search results do a better job than Facebook's model of letting users share stories.
But Google's search engine still got its share of criticism when a top result falsely claimed Trump won the popular vote.
Wiping out ad revenue could remove the incentive to create fake news, but detecting false stories may still be a problem for both companies.
Some experts have said algorithms alone aren't very good at finding those stories, and human review isn't very efficient.