Facebook said Tuesday that it took down 7 million posts pushing COVID-19 misinformation from its main social media site and Instagram between April and June as the company tried to combat the rapid spread of dangerous information about the virus.

The company also put warning notes on 98 million COVID-19 misinformation posts on Facebook during that time period — labeling posts that were still misleading but not deemed harmful enough to remove.

Facebook and fellow social media giants Twitter and YouTube have been scrambling to keep up with the flood of posts promoting fake cures or harmful speculation about the spread of the novel coronavirus since early spring. Facebook put policies in place to regulate COVID-19 posts, but its moderation teams have also been disrupted as offices remain closed.

Facebook sent its content moderators home in March, a move that led to fewer posts being removed for certain rule violations between April and June. But enforcement of other policies benefited from improved artificial intelligence technology, and Facebook reported an increase in removals under some rules.

Tuesday’s report was Facebook’s sixth on how well its rules are being enforced.

The company took down 22.5 million posts on Facebook for violating its hate speech rules during the period, up from 9.6 million in the first quarter of the year. Facebook said much of that increase was due to better detection technology and the addition of three languages to its automated system that searches for violating posts.

Facebook is also once again expanding its definition of hate speech, it said Tuesday, to include more content depicting blackface and some harmful stereotypes about Jewish people.

But operational disruptions caused by COVID-19 also forced Facebook to prioritize some rules over others, and enforcement in other areas slipped. The company took down 35.7 million posts for breaking its rules on adult nudity and sexual activity, down from 39.5 million in the first three months of the year. Facebook attributed the decline to "temporary workforce changes due to COVID-19" in its Community Standards Enforcement Report.

The company also called for an independent audit of its reports — which will be released quarterly from now on — a move that would give external organizations a peek under Facebook’s secretive hood.

“No company should grade its own homework, and the credibility of our systems should be earned, not assumed,” Facebook technical program manager Vishwanath Sarang wrote in a blog post announcing a request for proposals for the audit.

Despite employing thousands of content moderators, social media companies have still let coronavirus misinformation spread online. In May, Facebook and YouTube removed the so-called "Plandemic" video, which promoted a conspiracy theory about how COVID-19 spread, but not before it had been viewed millions of times.

Facebook took down one of President Donald Trump’s posts for spreading coronavirus misinformation earlier in August, after the president posted a video of an interview he gave on Fox News. It was the first time the company had taken down one of the president’s posts for violating its COVID-19 misinformation policy. In the interview, he falsely claimed that children are “almost immune” from COVID-19.

Twitter also penalized the video on its platform, requiring the Trump 2020 campaign account to delete a tweet containing the same clip.