Hate speech and misinformation on social media are out of control — here’s what we should do about it


“There comes a time when free speech can cause real-life harm and that’s when social media platforms need to react,” said former Danish PM and current chair of Meta’s Oversight Board, Helle Thorning-Schmidt, during a session at TNW Conference 2024.

There’s no better example of this than what’s happening in the UK right now.

As the country grapples with an eruption of violence from extreme right groups at asylum shelters across the country, PM Keir Starmer made a point to address not only those inciting violence but also the platforms that enable them to spread misinformation and organise.

“Let me also say to large social media companies, and those who run them, violent disorder clearly whipped up online: that is also a crime. It’s happening on your premises, and the law must be upheld everywhere,” he said in a press conference last week.

The violence erupted after a knife attack by a 17-year-old from Cardiff resulted in the murder of three young girls on July 29th. According to an analysis by Reuters, before information about the suspect was released by authorities, claims that he was an asylum seeker or immigrant had already been viewed at least 15.7 million times across X, Facebook, Instagram, and other platforms.

Far-right activist groups then organised online, sharing a list of 36 targets that included immigration centres, law offices specialising in helping asylum seekers, and asylum shelters.

Adding to Starmer’s call, yesterday North East mayor Kim McGuinness, whose area was racked by violence, urged ministers to clamp down on “distant and unaccountable social media companies” that have allowed hate speech and disorder to spread online.

Ironically, Elon Musk himself, owner of X, jumped into the fray earlier this week, posting "civil war is inevitable" in response to a post blaming the violent demonstrations on the effects of "mass migration and open borders." The post has so far received 9.6M views.

Elon Musk's Tweet saying "Civil war is inevitable" in response to far right violence in the UK.

Meanwhile, Reform UK leader Nigel Farage is now backtracking on a video he released on X the day after the stabbing, citing "some reports" that suggested the suspect was a migrant known to security services. A week later he has recanted his statement, admitting his "source" was extremist influencer Andrew Tate.

The balance between free speech and hate speech

This is just one of many examples across the globe of the shitstorm that can spiral out of control in the age of social media. The world now finds itself at the mercy of these platforms and their decisions on how best to address the balance between free speech and hate speech.

As a former PM and current member of Meta's Oversight Board, Helle Thorning-Schmidt was in a unique position to share her insights in a talk titled Democracy in Jeopardy: 2 Million votes in 2024.

“I think we’ll never see an election again without manipulated media,” Thorning-Schmidt said as voters across the globe are headed to the polls. This year will be the ultimate testbed for social media platforms on how to deal with misinformation (including the rise of deepfakes) being spread by both ordinary users and public figures. Should they be treated differently?

And with such large global user bases, how can these platforms stay on top of content moderation? While data shows X has the lowest number of content reviewers per user in Europe, platforms like Meta — with 15,000 content reviewers working across both Instagram and Facebook (each with 260 million EU users) — don't exactly have an easy task of it either. How can we leverage AI without completely leaving this extremely important task in the hands of bots?

Find out what Thorning-Schmidt and Murad Ahmed, Technology News Editor for the Financial Times, had to say on these topics and more on TNW All Access.
