The murder of three girls at a Taylor Swift-themed dance class in Southport, a coastal town in northern England, sparked outrage last week after misleading social media posts falsely identified the suspected attacker as an Islamist migrant.
Protests by anti-Islam and anti-immigrant groups spread to other towns and cities across Britain, with mosques and hotels housing migrants targeted, leading to violent clashes with police.
Jacob Davey, Director of Policy and Research at the Institute for Strategic Dialogue (ISD), said that the flow of online disinformation, and the role of the social media companies themselves, was critical.
“I don’t think we can underestimate how important the transmission of this information was to the horrible events of the weekend,” he said.
In response, the government, which has long accused countries such as Russia of attempting to sow discord, said it was investigating the extent of foreign involvement in spreading the false claims.
“We’ve seen bot activity online, much of which may be amplified or involve state actors, exacerbating some of the disinformation and misinformation that we’ve seen,” a spokesperson for Prime Minister Keir Starmer told reporters.
“It is clearly something that is being looked at.”
Elon Musk, X’s owner, has also weighed in. In response to a post on X that blamed mass migration and open borders for the chaos in Britain, he wrote, “Civil war is inevitable.”
Davey said that disinformation was spread not only by people seeking to cause trouble, but also by the social media platforms themselves, whose business models rely on algorithms designed to amplify attention-grabbing content.
“You saw that in the trending topics in the UK, you saw that disinformation cropping up when you searched for Southport… The business model aspect is extremely important.”
Disinformation was also spread by prominent anti-immigrant campaigners. Stephen Yaxley-Lennon, known as Tommy Robinson and former leader of the defunct anti-Islam English Defence League, has been accused by media outlets of spreading falsehoods on X.
He was banned from the platform in 2018 for posting hateful content, according to media reports at the time, but was reinstated after Musk bought it.
Platform for Hate
Last year, Britain enacted a new Online Safety Act to address concerns such as child sexual abuse and suicide promotion, but Professor Matthew Feldman, a right-wing extremism expert at the University of York, believes it may not be effective in this circumstance.
It did not appear to address “online incitement to offline criminality or disorder,” he stated.
According to Feldman, far-right groups are less organized than they were more than a decade ago, when the British National Party had thousands of members; now, no organization has more than a few hundred.
Even so, they remain highly visible: extremists and influencers are using modern technology to attract attention, he said.
According to the ISD’s Davey, the unrest did not come out of nowhere: there had been disturbances outside immigration centers, disorder at last year’s Remembrance Day commemorations, and hundreds of people in central London a few weeks ago to support Yaxley-Lennon.
“I think that this is really the accumulation of a much longer process whereby we’ve seen extremist groups become more confident,” he told us.