Facebook Account Blocking before Elections

This post looks back at last year’s U.S. elections and Facebook’s blocking of accounts to protect the electoral process.

In a blog post on blocking suspicious accounts, Facebook describes how its partnerships with law enforcement and its own investigations help it find and eliminate fake accounts. Its efforts to discourage election meddling appear to be working.

Facebook wants people to know it’s working

In its election update, Facebook’s Head of Cybersecurity Policy, Nathaniel Gleicher, wrote that the company is in regular contact with outside experts, law enforcement and other companies worldwide. The post also revealed that Facebook’s own investigations, together with these partnerships, helped it find and eliminate suspicious accounts on several occasions last year.

On October 26, Facebook removed around 82 Facebook groups, pages and accounts linked to Iran. Deleting bad actors from the platform helps, but the company needs to recognize that account blocking alone is not sufficient: cybercriminals can simply create new accounts to replace the ones removed.

The tech company said that on U.S. Election Day it removed an unspecified number of additional accounts because of suspected connections to foreign actors. The suspicious accounts were trying to interfere with voting by spreading fake news and disinformation on social media. Earlier this week, Facebook closed down around 115 accounts, and in August it had shut down another 652 accounts, groups and pages. A site claiming to be associated with the Internet Research Agency (IRA) recently posted a list of fake Instagram accounts opened before the U.S. midterm elections, some of which Facebook has since removed. The IRA, a Russia-based agency, had earlier been linked to efforts to influence the 2016 presidential election.

US-based tech companies have increased their efforts to fight fake news and disinformation campaigns by Russian groups.

Timely reminder that bad actors won’t give up

Facebook’s Head of Cybersecurity Policy further wrote: “This is a timely reminder that these bad actors won’t give up — and why it’s so important we work with the US government and other technology companies to stay ahead.”

Some months ago, Sam Gill of the nonprofit John S. and James L. Knight Foundation commissioned a study on misinformation on social media. Gill said that company leaders no longer claim misinformation isn’t a problem; instead, they talk about how important it is to get it right and acknowledge that they cannot yet declare victory.

The study also revealed that fake news spreads not just on Facebook but also on Twitter and other social media platforms. Gill said, “We need a lot more basic research studying the relationship between social media and democracy. We need to see more and understand more from the companies. We need access to more information.”
