SAN FRANCISCO — Facebook is planning to enact new measures to make it more difficult for election misinformation to spread virally across its platform, two people with knowledge of the matter said Thursday, as the outcome of the presidential race remains uncertain.
Facebook plans to add more “friction” — such as an additional click or two — before people can share posts and other content, said the people, who requested anonymity because they were not authorized to speak publicly. The company will also demote content on the News Feed if it contains election-related misinformation, making it less visible, the people said.
The measures, which could be rolled out as soon as Thursday, are a response to heightened strife and social discord on Facebook, these people said. They said there had been more activity by users and Facebook groups to coordinate potentially violent actions over election issues such as voter fraud. President Trump has falsely claimed on social media over the past few days that the election is being “stolen” from him, even while a final result remains unclear.
Facebook has been more proactive about clamping down on misinformation in recent months, even as its chief executive, Mark Zuckerberg, has said he does not want to be the arbiter of truth. The company has suspended political advertising for an indefinite period, for example, and has introduced notifications at the top of the News Feed that say no winner has been called in the election.
Facebook also prepared for months for the election. It ran through dozens of possibilities of what might happen on Nov. 3 and afterward in case political candidates or others tried to use the platform to delegitimize the results. The new measures were part of this planning, the people said.
The newest measures would be some of the most significant steps taken by the company, which has in the past tried to make sharing information as easy as possible so that it can increase engagement on its site. The moves would most likely be temporary, said the people with knowledge of them, and were designed to cool down angry Americans who are clashing on the network.
“As vote counting continues, we are seeing more reports of inaccurate claims about the election,” Facebook said in a statement. As a result, it said, it is “taking additional temporary steps.”
Other social media companies have also spent the past few weeks slowing down the way information flows and highlighting accurate information on their sites. Twitter, which Mr. Trump uses as a megaphone, had labeled 38 percent of his 29 tweets and retweets since early Tuesday with warnings that said he made misleading claims about the electoral process, according to a tally by The New York Times.
Twitter instituted a similar feature to add “friction” to sharing in October, making it slightly more arduous for people to retweet posts or reshare links to stories that they had not yet read. Both Facebook and Twitter have also had to add labels to posts containing false or misleading information about the election, including posts made by Mr. Trump since the polls closed on Tuesday evening.
Facebook — and Mr. Zuckerberg in particular — has long been criticized by members of both major American political parties for the company’s stances on misinformation across the platform. Republicans feel that Facebook is unduly critical of conservative speech, frequently crying censorship. Liberals often believe the tech companies are not doing enough to clean up the misinformation on their platforms.
But as recently as Thursday, Facebook began taking more aggressive action.
Some of the measures being enacted by Facebook are not without precedent. In June, the company instituted similar stopgaps, adding friction to the sharing of misinformation related to Covid-19.
This is a developing story. Check back for updates.