The U.K. Riots Were Fomented Online. Will Social Media Companies Act?

A group of police officers in riot gear, holding shields, on a residential street. A car is on fire behind them.

Standing in front of a lectern on Thursday, his voice at times taut with anger, Britain’s prime minister announced a crackdown on what he called the “gangs of thugs” who instigated violent unrest in several towns this week.

But the question of how to confront one of the key accelerants — a flood of online misinformation about a deadly stabbing attack — remained largely unanswered.

Prime Minister Keir Starmer called out online companies directly, after false information about the identity of the 17-year-old suspected in the attack spread rapidly on their platforms, no matter how many times police and government officials pushed back against the claims.

Three girls died after the attacker rampaged through a dance class in Southport, northwest England, on Monday. Of the eight children injured, five remain in the hospital, along with their teacher, who had tried to protect them.

Immediately after the attack, false claims began circulating about the perpetrator, including that he was an asylum seeker from Syria. In fact, he was born in Cardiff, Wales, and had lived in Britain all his life. According to the BBC and The Times of London, his parents are from Rwanda.

The misinformation was amplified by far-right agitators with large online followings, many of whom used platforms like Telegram and X to call for people to protest. Clashes followed in several U.K. towns, leading to more than 50 police officers being injured in Southport and more than 100 arrests in London.

On Friday evening, a riot broke out in the working-class city of Sunderland in England's northeast, during which police officers were injured and at least eight people were arrested, according to the local police. Footage of the unrest showed protesters hurling rocks, cars set ablaze and a police station engulfed in flames.

Officials fear more violence in the days ahead. The viral falsehoods were so prevalent that a judge took the unusual step of lifting restrictions on naming underage suspects, identifying the alleged attacker as Axel Rudakubana.

“Let me also say to large social media companies and those who run them: Violent disorder, clearly whipped up online, that is also a crime, it’s happening on your premises, and the law must be upheld everywhere,” Mr. Starmer said in his televised speech, though he did not name any company or executive specifically.

“We will take all necessary action to keep our streets safe,” he added.


Prime Minister Keir Starmer of Britain at a news conference in London on Thursday following clashes after the Southport stabbings. Credit: Henry Nicholls/Agence France-Presse — Getty Images

The attack in Southport, England, has been a case study in how online misinformation can lead to actual violence. But governments, including Britain's, have long struggled to find an effective way to respond. Policing the internet is legally murky terrain for most democracies, where individual rights and free speech protections are balanced against a desire to block harmful material.

Last year, Britain adopted a law called the Online Safety Act that requires social media companies to introduce new protections for child safety, while also forcing the firms to prevent and rapidly remove illegal content like terrorism propaganda and revenge pornography.

But the law is less clear about how companies must treat misinformation and incendiary, xenophobic language. Instead, the law gives the British agency Ofcom, which oversees television and other traditional media formats, more authority to regulate online platforms. Thus far, the agency has not taken much action to tackle the issue.

Jacob Davey, a director of policy and research at the Institute for Strategic Dialogue, a group that has tracked online far-right extremism, said that while many social media platforms have internal policies prohibiting hate speech and other illicit content, enforcement is spotty. Other companies, like X, now owned by Elon Musk, and Telegram, have even less moderation.
