Community Standards Enforcement Report, May 2020 Edition

Wednesday, 13 May 2020 09:07

Today Facebook is publishing the fifth edition of its Community Standards Enforcement Report, providing metrics on how well it enforced its policies from October 2019 through March 2020.

Facebook spent the last few years building tools, teams and technologies to help protect elections from interference, prevent misinformation from spreading on its apps and keep people safe from harmful content.


So, when the COVID-19 crisis emerged, Facebook had the tools and processes in place to move quickly and was able to continue finding and removing content that violates its policies.

When content reviewers were temporarily sent home due to the COVID-19 pandemic, Facebook increased its reliance on these automated systems and prioritized high-severity content for its teams to review, in order to keep its apps safe during this time.


The report includes data only through March 2020, so it does not reflect the full impact of the changes Facebook made during the pandemic.


Facebook anticipates that the impact of those changes will be visible in the next report, and possibly beyond, and it will be transparent about them.


For example, for the past seven weeks Facebook could not always offer the option to appeal content decisions and account removals, so it expects the number of appeals to be much lower in the next report.


Facebook also prioritized removing harmful content over measuring its efforts, and as a result it may not be able to calculate the prevalence of violating content during this time.


Today’s report shows the impact of advancements in the technology Facebook uses to proactively find and remove violating content.


What’s New in This Report?


The report now includes metrics across twelve policies on Facebook and across ten policies on Instagram.


The report introduces Instagram data in four issue areas: Hate Speech, Adult Nudity and Sexual Activity, Violent and Graphic Content, and Bullying and Harassment.


For the first time, Facebook is sharing data on the number of appeals people make against content that has been actioned on Instagram, and the number of decisions it overturns, either based on those appeals or when it identifies the issue itself.


Facebook has also added data on its efforts to combat organized hate on Facebook and Instagram.


You can learn more about these efforts and the progress that has been made here. [link to hate orgs NRP].


Progress in Finding and Removing Violating Content


Facebook has improved its technology that proactively finds violating content, which helped it remove more violating content so that fewer people saw it.


• On Facebook, the company continued to expand its proactive detection technology for hate speech to more languages, and improved its existing detection systems.


The proactive detection rate for hate speech increased by more than 8 points over the past two quarters, for a total increase of almost 20 points in just one year (a short sketch of how this rate is computed follows this list below).


As a result, Facebook is able to find more content, and can now detect almost 90% of the content it removes before anyone reports it.


In addition, thanks to other improvements to the detection technology, the amount of drug content removed in Q4 2019 doubled to 8.8 million pieces of content.

• On Instagram, Facebook improved its text and image matching technology to help find more suicide and self-injury content.


As a result, the amount of content actioned increased by 40% and the proactive detection rate increased by more than 12 points since the last report.


Facebook also made progress in combating online bullying by introducing several new features to help people manage their experience and limit unwanted interactions, and it announced new Instagram controls today.


Facebook is sharing enforcement data for bullying on Instagram for the first time in this report: it took action on 1.5 million pieces of content in both Q4 2019 and Q1 2020.

• Lastly, improvements to Facebook’s technology for finding and removing content similar to existing violations in its databases helped take down more child nudity and sexually exploitative content on Facebook and Instagram (a toy sketch of this kind of similarity matching follows directly below).
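
The last bullet refers to matching new uploads against databases of known violations. Here is a minimal sketch of that general pattern, assuming a perceptual-hash approach; the toy average hash and the distance threshold are invented for illustration and are not Facebook's actual system.

    # Toy sketch of matching uploads against hashes of known-violating
    # content. Real systems use far more robust perceptual hashes
    # (e.g. PhotoDNA-style); this "average hash" only illustrates the
    # hash-then-compare-by-distance pattern.
    from PIL import Image  # requires Pillow

    def average_hash(path: str, size: int = 8) -> int:
        """64-bit toy hash: downscale, grayscale, threshold on the mean."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming(a: int, b: int) -> int:
        """Number of differing bits between two hashes."""
        return bin(a ^ b).count("1")

    def matches_known(upload: int, known: set[int], threshold: int = 5) -> bool:
        """True if the upload is within `threshold` bits of any known hash."""
        return any(hamming(upload, h) <= threshold for h in known)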


[insert graphic highlighting some of the stats]
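
For readers unfamiliar with the proactive detection rate cited above, here is a minimal sketch of how such a rate is computed; the figures below are invented for the example and are not numbers from the report.

    # Proactive detection rate: the share of actioned content that
    # automated systems found before any user reported it.
    # All numbers below are invented for illustration.

    def proactive_rate(found_proactively: int, total_actioned: int) -> float:
        """Share of actioned content found before a user report."""
        return found_proactively / total_actioned

    # e.g. 7.9 million of 8.8 million actioned pieces flagged first by
    # automated systems gives a rate of roughly 89.8% ("almost 90%").
    print(f"{proactive_rate(7_900_000, 8_800_000):.1%}")  # -> 89.8%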


Over the last six months, Facebook has started to use technology more to prioritize content for its teams to review, based on factors such as virality and severity, among others.


Going forward, Facebook plans to leverage technology to also take action on content, including removing more posts automatically.


This will enable Facebook’s content reviewers to focus their time on other types of content where more nuance and context are needed to make a decision.
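
As a rough illustration of the kind of prioritization described above, here is a minimal sketch of a severity-and-virality-weighted review queue; the weights, fields and scores are invented and do not reflect Facebook's actual scoring.

    # Toy review queue ordered by severity and virality. The weighting
    # is invented for illustration; heapq is a min-heap, so scores are
    # negated to pop the highest-priority item first.
    import heapq
    from dataclasses import dataclass, field

    @dataclass(order=True)
    class ReviewItem:
        priority: float                        # lower = reviewed sooner
        content_id: str = field(compare=False)

    def priority(severity: float, virality: float) -> float:
        return -(0.7 * severity + 0.3 * virality)  # invented weights

    queue: list[ReviewItem] = []
    heapq.heappush(queue, ReviewItem(priority(0.9, 0.8), "post-123"))
    heapq.heappush(queue, ReviewItem(priority(0.2, 0.1), "post-456"))
    print(heapq.heappop(queue).content_id)  # -> post-123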


The Community Standards Enforcement Report is published in conjunction with Facebook’s biannual Transparency Report [link to Transparency Report], which shares numbers on government requests for user data, content restrictions based on local law, intellectual property takedowns and internet disruptions.


In the future, Community Standards Enforcement Reports will be shared quarterly, so the next report will be released in August.
