Facebook may end Covid censorship…

Meta is asking the Oversight Board for advice on whether measures to address dangerous COVID-19 misinformation, introduced in extraordinary circumstances at the onset of the pandemic, should remain in place as many, though not all, countries around the world seek to return to more normal life.

Misinformation related to COVID-19 has presented unique risks to public health and safety over the last two years and more. To keep our users safe while still allowing them to discuss and express themselves on this important topic, we broadened our harmful misinformation policy in the early days of the outbreak in January 2020. Before this, Meta only removed misinformation when local partners with relevant expertise told us a particular piece of content (like a specific post on Facebook) could contribute to a risk of imminent physical harm. The change meant that, for the first time, the policy would provide for removal of entire categories of false claims on a worldwide scale.


As a result, Meta has removed COVID-19 misinformation on an unprecedented scale. Globally, more than 25 million pieces of content have been removed since the start of the pandemic. Under this policy, Meta began removing false claims about masking, social distancing and the transmissibility of the virus. In late 2020, when the first vaccine became available, we also began removing further false claims, such as claims that the vaccines are harmful or ineffective. Meta’s policy currently provides for removal of 80 distinct false claims about COVID-19 and vaccines.
