Independent Audit of Community Standards Enforcement Report Metrics

By Vishwanath Sarang, Technical Program Manager, Integrity

Update on May 17, 2022 at 10:00AM PT:

We have published EY’s assessment, where you can find more information about the results.

Update on October 20, 2021 at 11:00AM PT:

Over the past several months, we’ve walked EY through our processes and systems and provided them with the data and evidence they requested to conduct their assessment. EY’s work will cover our Community Standards Enforcement Report metrics from the fourth quarter of 2021. We anticipate their assessment will be completed in the spring of next year.

Update on May 19, 2021 at 3:45PM PT:

We have selected EY to conduct this assessment, and we look forward to working with them.

Originally published on August 11, 2020 at 9:00AM PT:

Today we published the sixth edition of our Community Standards Enforcement Report, outlining our progress in enforcing our Community Standards. We began sharing metrics on how well we enforce our content policies in May 2018, tracking our work across six types of content on Facebook that violate our Community Standards. Today we report across 12 areas on Facebook and 10 on Instagram.

Measuring and reporting these numbers regularly, with the same frequency as we report our earnings and business results, is important. It lets others see how we’re doing and hold us accountable, and it helps us measure progress and spot places where we’re not doing enough.

But transparency is only helpful if the information we share is useful and accurate. In the context of the Community Standards Enforcement Report, that means the metrics we report must be based on sound methodology and accurately reflect what’s happening on our platform. To this end, over a year ago we worked with international experts in measurement, statistics, law, economics and governance to provide an independent, public assessment of whether the metrics we share in the enforcement report offer accurate and useful measures of Facebook’s content moderation challenges and our work to address them. They broadly agreed that we are looking at the right metrics, and they provided some recommendations for improvement, including launching appeals reporting. You can read the full report here.

Now we are taking this one step further by validating that the metrics are measured and reported correctly. Over the past year, we’ve worked with auditors internally to assess how the metrics we report can be audited most effectively. This week, we are issuing a Request for Proposal (RFP) to external auditors to conduct an independent audit of these metrics. We hope to begin the audit in 2021 and to have the auditors publish their assessments once they are complete.

No company should grade its own homework, and the credibility of our systems should be earned, not assumed. We believe independent audits and assessments are crucial to holding us accountable and helping us do better.


