Facebook removed more than 1.5 billion fake accounts and over 2.1 billion pieces of spam between April and September 2018. The tech company disclosed these striking figures in its recent Community Standards Enforcement Report.
Facebook gets better at identifying violating content
In May, Facebook first shared figures on the amount of violating content detected on its services, so that people could judge for themselves how the tech company is actually doing.
In a recent blog post titled ‘How Are We Doing at Enforcing Our Community Standards?’, Facebook revealed that it has invested in both technology and people to remove unlawful and fake content. This is also the first time the company has published the internal guidelines that its review teams use to enforce the Community Standards.
Facebook said it did this to help its community better understand what kind of content is allowed on the site and why. The report details how it dealt with fake accounts and with problems such as hate speech, pornography, violence, terrorism and spam over the six months between April and September 2018.
Proactive detection rate for violent and graphic content up 25 percentage points: Report
In November, Facebook published another Community Standards Enforcement Report, which included two new categories of data – child pornography and sexual exploitation of children, and bullying and online harassment.
According to the blog, the share of hate speech that Facebook detects proactively, before any user reports it, has risen from 24% to 52% since its last report. Its proactive detection rate for graphic and violent content increased from 72% to 97%, a rise of 25 percentage points.
The report states that in the third quarter of this year the company took action on 15.4 million pieces of violent and graphic content. This “action” included removing content, disabling offending content, placing a warning screen over disturbing content, and turning content over to law enforcement.
In addition, Facebook disabled more fake accounts in Q2 and Q3 than in previous quarters – over 800 million in the second quarter and 754 million in the third. Guy Rosen, Facebook's vice president of product management, said the staff at Facebook understand there is a lot more to do when it comes to removing abuse from the social network.