A recent TV documentary investigation has revealed how Facebook decides what its users can and cannot see on its platform. The investigation covered topics including how the company handles violent content, reported posts, and hate speech.
Violent Content and Hate Speech Remain on the Site
According to the investigation, violent content such as videos of assaults on children and other graphic imagery remained on the site even after users flagged it as inappropriate and requested its removal.
During filming, thousands of reported posts remained on the site unmoderated. Reported posts relating to self-harm and suicide threats also stayed up beyond the 24-hour turnaround that Facebook had stated as its goal for removal. In addition, pages belonging to far-right groups with large followings were treated differently from those of news organizations and governments, and were allowed to exceed the deletion time limits.
Facebook has a policy of not allowing children under the age of 13 to have accounts, but a company trainer told an undercover reporter not to take any action against underage users unless they admit to being underage. For example, if an image shows someone who looks underage and contains self-harm content, then instead of the account being reported as underage, the account owner is treated like an adult and is sent information about organizations that help with self-harm issues.
Facebook’s Take on Hate Speech
The undercover reporter was told that content that racially abuses protected religious or ethnic groups violates Facebook guidelines, but if posts racially abuse immigrants from these groups, then it is permitted. Training for moderators included an example of a cartoon comment that described “drowning a girl if her first boyfriend is a negro”; the cartoon was deemed “permitted.”
In a statement to the broadcaster, Facebook said that the cartoon does violate its guidelines and hate speech standards, and that it is reviewing its processes to prevent this from happening again.