60 minutes: that’s all the time social media platforms now have to remove extremist or terrorism-related content from their sites after being notified about it, according to a proposal presented to the European Parliament.
Remove extremist content or face fines: European Union
European Commission President Jean-Claude Juncker said in his last State of the Union address that social networking sites which do not comply would be subject to strict sanctions. In 2016, the EU executive first presented its code of conduct on countering hate speech. At that time, the four major internet platforms participating in the code of conduct were Microsoft, Twitter, YouTube and Facebook.
In 2017, the EU released a press release explaining what illegal hate speech is and what content would lead to removal. It noted that the goal of the Code is to ensure that requests to remove content are dealt with speedily: “The companies have committed to reviewing the majority of these requests in less than 24 hours and to removing the content if necessary.”
This year in March, the European Union gave the social media platforms three months to show that they were taking action to remove radical posts. Those three months, plus an additional two, have now passed, and regulators have decided that the tech companies, despite taking down 70% of content reported as illegal hate speech within a day, have not done enough to take down extremist posts.
The latest proposal demands immediate action on extremist content
The latest proposal has made it quite clear that prompt action is a must if companies want to avoid unfavorable consequences. According to Juncker, EU Member States will be required to “follow up non-compliance with effective, proportionate and dissuasive sanctions aimed at eliminating any content that is associated with terrorist organisations.”
As for the fine, a company that fails to follow the rules would face a penalty of around 4% of its total annual turnover for the previous financial year, no small sum even for tech giants like Facebook and Twitter.
In its statement, the Commission said, “By setting a minimum set of duties of care on hosting service providers which includes some specific rules and obligations, as well as obligations for the Member States, the proposal intends to increase the effectiveness of current measures to detect, identify and remove terrorist content online without encroaching on fundamental rights, such as freedom of expression and information.”
Juncker emphasized the importance of ensuring that terrorists are prosecuted across Europe and beyond, because terrorists know no borders.