Facebook Releases Its Second-Quarter Community Standards Enforcement Report

Facebook has recently released its Community Standards Enforcement Report covering the second quarter of 2021, detailing the enforcement actions taken during that period. With more than 2 billion users, Facebook has a strong incentive to take the precautions necessary to keep the platform trustworthy, and with this report the company aims to show that it is doing a lot to keep it safe.

The most striking part of the report concerns the actions taken against posts related to COVID-19. According to the report, more than 2 million pieces of content containing COVID-19 misinformation have been removed. In addition to removing content, Facebook shut down 3,000 accounts for spreading misinformation about COVID-19. In its report, Facebook says the following about COVID-19 misinformation:

“We displayed warnings on more than 190 million pieces of COVID-related content on Facebook that our third-party fact-checking partners rated as false, partly false, altered or missing context, collaborating with 80 fact-checking organizations in more than 60 languages around the world. When they rate a piece of content with one of these ratings, we add a prominent label warning people before they share it and show it lower in people’s feed.”

Apart from COVID-19, the report also covers topics such as hate speech, child safety, and recent trends on Facebook. It states that hate speech declined further compared to the previous quarter, and that more than 35 million pieces of hate-speech content were removed from the platform. On child safety, which Facebook calls one of its most important issues, the company says it has taken extensive precautions. To be more transparent, this report contains more detailed information on child safety than previous ones. Facebook says the following about its child-safety actions:

“In Q2 2021, we improved our proactive detection technology on videos and expanded our media-matching technology on Facebook, allowing us to remove more old, violating content. Both enabled us to take action on more violating content.”

While working to keep the platform trustworthy, Facebook also keeps a close eye on features from other social media apps and continues to adopt them. After short videos became popular thanks to TikTok, the Reels feature came to Instagram, and now Reels is on Facebook as well. Facebook, which previously adopted the "Stories" feature that Snapchat brought into our lives, clearly intends to keep evolving. Reels is only available in the US for now, but we expect it to roll out worldwide soon.


What do you think about the report Facebook published? Have the necessary precautions been taken, or could Facebook have done more if it hadn’t been busy with the “Reels” feature? Leave a comment down below or hit us up on our socials! Stay tuned for more social media news!