Amnesty slams Elon Musk’s X for ‘central role’ in fueling 2024 UK riots – Source: go.theregister.com

Source: go.theregister.com – Author: Connor Jones

Amnesty International claims Elon Musk’s X platform “played a central role” in pushing the misinformation that stoked racially charged violence following last year’s Southport murders.

Southport, Merseyside, UK, on July 30th, 2024: Riot police form a barricade as protesters throw bricks and flaming objects (Editorial credit: Ian Hamlett / Shutterstock.com)

Axel Rudakubana, 17 at the time, murdered three young girls and injured a further ten during an attack at a children’s dance class on July 29, 2024. The resulting social media posts led to violence across the UK, with some social media users imprisoned for their posts.

Amnesty said that within hours, falsehoods about the killer’s identity, religion, race, and immigration status were spread across social media, especially X, whose algorithmically determined For You page favors “contentious engagement over safety.”

The human rights organization said in a report published this week that X’s recommendation system, its content-ranking algorithm, “systematically prioritizes content that sparks outrage, provokes heated exchanges, reactions, and engagement, without adequate safeguards to prevent or mitigate harm.”

Amnesty analyzed the source code behind the algorithm, which X open sourced in 2023. It found that engagement was prioritized, and the algorithm itself had no mechanism to assess the potential for harm carried by the posts it ranked.
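The pattern Amnesty describes can be illustrated with a short sketch: a ranking score computed purely from engagement signals, with nothing in the scoring function that assesses whether a post is harmful. The field names and weights below are hypothetical, chosen only to show the shape of the problem, and are not X's actual code or values.

```python
# Illustrative sketch of engagement-weighted ranking. All names and
# weights are hypothetical, not taken from X's open-sourced code.
from dataclasses import dataclass


@dataclass
class Post:
    likes: int
    replies: int
    reposts: int


def engagement_score(post: Post) -> float:
    # Replies (often heated exchanges) are weighted most heavily here.
    # Note what is absent: no term penalizes or even measures the
    # potential for harm in the content being ranked.
    return 1.0 * post.likes + 13.5 * post.replies + 2.0 * post.reposts


def rank_feed(posts: list[Post]) -> list[Post]:
    # Pure engagement sort: a contentious post that draws many replies
    # outranks a widely liked but uncontroversial one.
    return sorted(posts, key=engagement_score, reverse=True)
```

In a scorer like this, a post that provokes ten angry replies beats one with a hundred quiet likes, which is the dynamic Amnesty says rewards "contentious engagement over safety."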

Examples of the posts that X, formerly Twitter, pushed to its users included those from accounts like Europe Invasions, a group known for its far-right and Islamophobic themes, which were amplified when referenced by accounts with significantly larger followings.

According to written evidence provided by Marc Owen Jones, associate professor of media analytics at Northwestern University in Qatar, to a UK parliamentary committee, Elon Musk was one of those larger accounts to have amplified the far-right messaging to his 194 million followers at the time.

Owen Jones said Musk posted 46 times during the UK riots, generating a total 808 million impressions.

“Musk’s interventions demonstrably shaped online discourse,” he wrote, referring to one instance in which Musk shared a post originating from Tommy Robinson – whose real name is Stephen Yaxley-Lennon – a figure widely associated with Islamophobic and extremist content.

Amnesty cited Yaxley-Lennon’s posts, which are estimated to have generated more than 580 million impressions, as another key driver of the racist discourse that contributed to the riots across the UK. Yaxley-Lennon was previously banned from X for hate speech, but was reinstated after Musk took control of the platform in 2022.

Prison time and legislative gaps

According to data from the National Police Chiefs’ Council, reported by ITV, 1,876 people in the UK were arrested in relation to the countrywide riots that followed Rudakubana’s killings, many over the content they posted to social media.

Of these, 1,110 have been charged to date. Some social media offenders have already been handed prison sentences of more than a year, while others, who both incited violence on social media and participated in the riots, received considerably longer sentences.

The UK government ordered various reviews and investigations into the events leading up to the riots, and those that prolonged them. Bringing participants to justice was just one part of its approach.

One of these focused on the Prevent program, which aims to identify potential terrorists and intervene before harmful attacks can be carried out.

Rudakubana was referred to Prevent three times during his teens. The review found he was prematurely discharged from the program, despite demonstrating a clear interest in previous events like the Manchester Arena bombings, and talking about stabbing people.

Politicians highlighted various shortcomings in the program and issued 14 recommendations for improvement in February.

Separately, Dame Chi Onwurah, chair of the Science, Innovation and Technology Committee, said last month that the controversial Online Safety Act (OSA), which in 2023 introduced rules requiring platforms like X to protect users from illegal content, “just isn’t up to scratch.”

The committee heard from various subject matter experts who made it aware that the OSA fails to legislate against the algorithmic amplification of harmful content.

Platforms are duty-bound to proactively address the sources of illegal or harmful content. This is why posts are often removed before reaching users’ feeds. However, the law does not cover content that is harmful but not illegal.

Members of Parliament agreed that social media platforms must be held to greater account for the content posted to them, after previously hearing from UK lawyers that the current legislation may or may not cover misinformation.

Accountability calls

Amnesty’s head of Big Tech accountability, Pat de Brún, echoed these calls in the human rights org’s report today, saying X still risks user safety a year on from the Southport attack and resulting riots.

“Our analysis shows that X’s algorithmic design and policy choices contributed to heightened risks amid a wave of anti-Muslim and anti-migrant violence observed in several locations across the UK last year, and which continues to present a serious human rights risk today,” he said.

“Without effective safeguards, the likelihood increases that inflammatory or hostile posts will gain traction in periods of heightened social tension.”

A spokesperson at X told The Register:

“We are committed to keeping X safe for all our users. Our safety teams use a combination of machine learning and human review to proactively take swift action against content and accounts that violate our rules, including our Violent Speech, Hateful Conduct and Synthetic and Manipulated Media policies, before they are able to impact the safety of our platform.

“Additionally our crowd-sourced fact checking feature Community Notes plays an important role in supporting the work of our safety teams to address potentially misleading posts across the X platform.” ®

Original Post URL: https://go.theregister.com/feed/www.theregister.com/2025/08/07/amnesty_x_uk_riots/
