60 minutes – that’s all the time social media platforms now have to remove extremist or terrorism-related content from their sites after being notified about it, according to a proposal presented to the European Parliament.
Remove extremist content or face fines: European Union
European Commission President Jean-Claude Juncker said in his last State of the Union address that social networking sites which do not comply would be subject to strict sanctions. In 2016, the EU executive first presented its code of conduct on countering hate speech. At that time, the four major internet platforms participating in the code of conduct were Microsoft, Twitter, YouTube and Facebook.
In 2017, the EU released a press release explaining what constitutes illegal hate speech and what content would be subject to removal. It noted that the goal of the Code is to ensure that requests to remove content are dealt with speedily: “The companies have committed to reviewing the majority of these requests in less than 24 hours and to removing the content if necessary.”
In March of this year, the European Union gave social media platforms three months to show that they were taking action to remove radical posts. Those three months, plus an additional two, have now passed, and regulators have concluded that the tech companies, despite taking down 70% of content reported as illegal hate speech within a day, have not done enough to remove extremist posts.
The latest proposal demands immediate action on extremist content
The latest proposal has made it quite clear that prompt action is a must if companies want to avoid unfavorable consequences. According to Juncker, EU Member States will be required to “follow up non-compliance with effective, proportionate and dissuasive sanctions aimed at eliminating any content that is associated with terrorist organisations.”
As for the fine, a company that fails to follow the rules would be subject to a penalty of around 4% of its total annual turnover for the previous financial year, no small sum even for tech giants like Facebook and Twitter.
In its statement, the Commission said, “By setting a minimum set of duties of care on hosting service providers which includes some specific rules and obligations, as well as obligations for the Member States, the proposal intends to increase the effectiveness of current measures to detect, identify and remove terrorist content online without encroaching on fundamental rights, such as freedom of expression and information.”
Juncker emphasized the importance of ensuring that terrorists are prosecuted across Europe and beyond, because terrorists know no borders.
- EU Gets Strict Over Extremist Content on Social Media Platforms - October 23, 2018