Facebook adopts a new troll-battling technique — but it’s got bigger problems from its past

Facebook says it now uses the same techniques it developed for bot removal to combat human trolls and spammers. The company told Reuters that the new approach mirrors what its security team uses to shut down networks of Russian troll farms.

The social network says it is going after notorious groups that coordinate false mass reporting of posts or accounts to get them removed by moderators. It is also targeting networks trying to cause ‘social harm,’ both on and off the platform.

The company removed one such group based in Germany this week that was trying to spread misinformation and conspiracy theories about the country’s COVID-related restrictions.

The Reuters report noted that the company is looking to target the core of these campaigns and their network effects, rather than individual posts.

After the tumultuous 2016 US presidential election, Facebook was accused of playing a massive role in facilitating misinformation campaigns that contributed to Donald Trump’s victory. In the aftermath of those revelations, the company changed its tactics for handling groups that spread misinformation and push controversial theories about politicians and activists.

Facebook's "indefinite" ban on Donald Trump could be lifted by the company's Oversight Board.

Facebook's "indefinite" ban on Donald Trump could be lifted by the company's Oversight Board.

In a series of investigations published under the name The Facebook Files, the Wall Street Journal has revealed striking details of how the company is faltering in its efforts to make its platform healthier.

One of the reports cites an internal document describing how Facebook ignored employees who flagged pages belonging to or run by drug cartels, human traffickers, and arms dealers. It notes that while some of these pages operating in developing countries are removed, many more operate freely while the social network pays them little attention.

Another report notes that in 2018, Facebook changed its algorithm to boost “meaningful interactions” between friends and family, but the tweak ended up pushing users to argue with one another.

Over the years, Facebook has grown into a behemoth of a social network with the power to shape social and political currents in different parts of the world. While weeding out harmful content and groups among millions of posts isn’t easy, the company needs to put its billions of dollars in revenue to work maintaining a healthy network.

Time and again, internal documents and investigations have shown that the company sat back and waited for someone else to point out where it had faltered. For now, it seems better at announcing remedial actions than at taking them.

Did you know we have a newsletter all about consumer tech? It’s called Plugged In – and you can subscribe to it right here.
