60 minutes: that’s all the time social media platforms now have to remove extremist or terrorism-related content from their sites after being notified about it, according to a proposal presented to the European Parliament.
Remove extremist content or face fines: European Union
European Commission President Jean-Claude Juncker said in his last State of the Union address that social networking sites which do not comply would be subject to strict sanctions. In 2016, the EU executive first presented its code of conduct on countering hate speech. At that time, the four major internet platforms participating in the code of conduct were Microsoft, Twitter, YouTube and Facebook.
In 2017, the EU released a press release explaining what constitutes illegal hate speech and what content would be subject to removal. It noted that the goal of the Code is to ensure that requests to remove content are dealt with speedily: “The companies have committed to reviewing the majority of these requests in less than 24 hours and to removing the content if necessary.”
This March, the European Union gave the social media platforms three months to show that they were taking action to remove radical posts. That deadline, plus a further two-month extension, has now passed, and regulators have concluded that the tech companies have not done enough to take down extremist posts, even though they removed 70% of content reported as illegal hate speech within a day.
The latest proposal demands immediate action on extremist content
The latest proposal has made it quite clear that prompt action is a must if companies want to avoid unfavorable consequences. According to Juncker, EU Member States will be required to “follow up non-compliance with an effective, proportionate and dissuasive round of sanctions aimed at eliminating any content that is associated with terrorist organisations.”
As for the fine, a company that fails to follow the rules would be subject to a penalty of around 4% of its total annual turnover for the previous financial year, which is no small sum even for tech giants like Facebook and Twitter.
In its statement, the Commission said, “By setting a minimum set of duties of care on hosting service providers which includes some specific rules and obligations, as well as obligations for the Member States, the proposal intends to increase the effectiveness of current measures to detect, identify and remove terrorist content online without encroaching on fundamental rights, such as freedom of expression and information.”
Juncker emphasized the importance of ensuring that terrorists are prosecuted across Europe and beyond, because terrorists know no borders.