
The true story of the website accused of fuelling the Southport riots

Getty Images: Police officers stand in front of a burning car in Sunderland, one of many cities hit by riots.

What connects a father from Lahore, Pakistan, an amateur hockey player from Nova Scotia and a man named Kevin from Houston, Texas?

They all link to Channel3Now – a website that published a false name for the 17-year-old charged over the Southport attack, a story widely cited in viral posts on X. Channel3Now also falsely claimed the attacker was an asylum seeker who arrived in the UK by boat last year.

This, along with false claims from other sources that the attacker was a Muslim, is widely seen as the trigger for riots across Britain, some of which targeted mosques and Muslim communities.

The BBC has tracked down several people with links to Channel3Now, spoken to their friends and colleagues who confirmed they are real people, and interviewed a person claiming to be the site’s “management”.

What I found appears to be a commercial company trying to aggregate crime news while making money on social media. I have found no evidence to support the claim that Channel3Now’s disinformation could be linked to the Russian state.

The person, who claimed to be from Channel3Now management, told me that publishing the wrong name “shouldn’t have happened, but it was a mistake and not intentional.”

The author of the fake article was not named and it is unclear who exactly wrote it.

———

An amateur hockey player from Nova Scotia named James is the first person I link to Channel3Now. His name appears as a rare byline on another article on the site, and a picture of him appears on a related LinkedIn page.

A Facebook account linked to James has only four friends, one of whom is named Farhan. His Facebook profile says he is a journalist for the site.

I write to dozens of James's and Farhan's followers. A social media account for the school where James played hockey, and one of his friends, confirm to me that he is a real person who graduated four years ago. When I reach out, his friend says James wants to know "what his involvement in the article might mean". After I respond, the friend does not deny that James is affiliated with the site – and then stops responding.

Former colleagues of Farhan, some of whom live in Pakistan, confirm his identity. He posts about his Islamic faith and his children on his social media profiles. His name does not appear in the false article.

Soon after I send a message, Farhan blocks me on Instagram, but I eventually receive a response via Channel3Now's official email.

An archived screenshot shows a Channel3Now story that gave a false name for the Southport attacker and falsely claimed he was an asylum seeker on "an MI6 watch list".

Channel3Now later apologised for misnaming the Southport attacker

The person who comes forward says his name is Kevin and he lives in Houston, Texas. He declines to give his last name and it’s unclear if Kevin is actually who he says he is, but he agrees to answer questions via email.

Kevin says he’s speaking to me from the site’s “headquarters” in the US – which matches both the timing of posts on some of the site’s social media profiles and the times Kevin responds to my emails.

He first calls himself “editor-in-chief” before telling me that he is actually the “verification producer”. He does not want to reveal the name of the website’s owner because, he says, the owner “cares not only about himself, but also about everyone who works for him”.

Kevin claims there are “more than 30” people working for the site in the US, UK, Pakistan and India, usually recruited from freelance sites – including Farhan and James. He says Farhan in particular was not involved in the false Southport story, for which the site has publicly apologised, and puts the blame on “our UK-based team”.


Following Channel3Now’s false claims, the site was accused of having ties to the Russian state, based on old videos on its Russian-language YouTube channel.

Kevin says the site bought a former Russian-language YouTube channel focused on car rallies “many years ago” and later changed its name.

The account did not post any videos for about six years before it started uploading content related to Pakistan – where Farhan lives and where the site says it also employs writers.

“Just because we bought a YouTube channel from a Russian seller doesn’t mean we have any connections,” says Kevin.

“We are an independent digital news website covering news from around the world.”

It is possible to buy and repurpose a channel that has already been monetised by YouTube. This can be a quick way to inherit an existing audience so the account can start making money right away.

“As many stories as possible”

While I have found no evidence to support these claims of Russian ties to Channel3Now, pro-Kremlin Telegram channels have been reposting and amplifying the site’s false posts, a tactic they frequently use.

Kevin says the site is a commercial venture and “covering as many stories as possible” helps it generate revenue. Most of its stories are true – they appear to be based on reliable reports of shootings and car crashes in the US. However, the site has also spread false speculation about the Southport attacker and about the person who tried to assassinate Donald Trump.

Following the false Southport story and the media coverage of Channel3Now, Kevin says the YouTube channel and almost all of the site’s “multiple Facebook pages” have been suspended – but not its X accounts. A Facebook page called Daily Felon, which exclusively shares content from the site, also remains active.

Kevin says the blame for the social media storm surrounding the Southport suspect and the riots that followed cannot be placed solely on a “small Twitter account” that made “one mistake”.

To an extent, he is right. Channel3Now’s false story was cited as a source by many other social media accounts, helping the false allegations go viral.

Some of these accounts were based in the UK and the US and had already published disinformation on topics such as the pandemic, vaccines and climate change. They were able to build significant followings and push their content to more people thanks to changes Elon Musk made after buying Twitter, now called X.

Reuters: More than 400 people were arrested during the riots.

One profile – belonging to a woman named Bernadette Spofforth – was accused of being the first to post the fake name of the Southport attacker. She denied being the source, saying she had seen the name online in another post that has since been deleted.

In a phone conversation with the BBC, she said she was “horrified” by the attack and that she deleted her post as soon as she realised it was false. She said her aim was not to make money from her account.

“Why on earth would I make something like this up? I have nothing to gain and everything to lose,” she said, condemning the recent violence.

Ms Spofforth had previously posted questioning lockdowns and net zero climate policies, and her profile was temporarily suspended by Twitter in 2021 after she was accused of spreading misinformation about the Covid-19 vaccine and the pandemic. She denied the claims and said she believed Covid was real.

Since Musk took over, her posts have regularly been viewed more than a million times.

The false claim Ms Spofforth made about the Southport attacker was quickly shared and picked up by a loose group of influential conspiracy theorists and profiles who have a history of promoting anti-immigration and far-right views.

Many of them have purchased blue ticks, giving their posts greater visibility since Musk took over Twitter.

Another change to X made by Mr Musk means that promoting these ideas can be profitable, both for conspiracy theory accounts and for commercially focused accounts like Channel3Now.

Millions of views

Some of these profiles have received millions of views in the past week by posting about the Southport attack and the riots that followed. X’s “ad revenue share” means that users with a blue tick can earn a share of the revenue from the ads shown in the replies to their posts.

Estimates from users with fewer than half a million followers who have generated revenue this way suggest that accounts can earn $10 to $20 per million views or impressions on X. Some of these accounts spreading disinformation get more than a million impressions on almost every post and share multiple posts a day.

Other social media companies besides X also allow their users to monetise views. But YouTube, TikTok, Instagram and Facebook have previously taken down or blocked some profiles posting content that violates their misinformation policies. Aside from rules against faked AI content, X has no misinformation policies.

While politicians are calling on social media companies to do more in the wake of the unrest, the UK’s recently passed Online Safety Act does not currently include legal provisions against disinformation, owing to concerns that they could restrict freedom of expression.

Additionally, while researching Channel3Now’s authors, I discovered that the people posting misinformation are often based overseas, making it much more difficult to take action against them.

Instead, the power to deal with this type of content currently lies with the social media companies themselves. X did not respond to the BBC’s request for comment.
