Facebook says it removed over 7 million pieces of misleading or “harmful” COVID-19-related posts from its social network and the company-owned Instagram in the second quarter.
The company cited examples of posts that pushed “fake preventative measures or exaggerated cures that the CDC and other health experts tell us are dangerous.”
It also applied warning labels on about 98 million pieces of COVID-19 misinformation on Facebook, the company said.
Facebook releases updates to its Community Standards Enforcement Report every quarter.
Facebook admitted that COVID-19 stymied the company’s moderation efforts from April to June.
Sending its content moderators home in March amid the pandemic meant the company removed less harmful material from Facebook and Instagram in categories such as suicide, self-injury, child nudity and sexual exploitation.
The pandemic hampered Facebook’s ability to remove harmful and forbidden material from its platforms, the company said, in part because moderators working from home were uncomfortable reviewing disturbing images with their families around.