SOCIAL media platforms are facing pressure from First Minister John Swinney and Ofcom over the spread of hatred on their sites amid a wave of far-right violence in England and Northern Ireland.

In a letter published on its website, Ofcom called on platforms like Twitter/X and Facebook to “act now” to clamp down on hateful disinformation that is feeding race riots.

First Minister Swinney also called on social media companies “to get their act together” as Scottish communities face fears of being targeted by the far-right.

The SNP leader went on: “We've been discussing with the United Kingdom government the importance of pressure being placed on social media companies.

“The way in which social media has been used to create community unease is terrible, and it’s causing fear and alarm among communities, some very vulnerable people in our society.”

Sabir Zazai, the chief executive of the Scottish Refugee Council, was among the community leaders to meet with Swinney in Bute House on Wednesday.

Speaking afterwards, Zazai said some people in the communities he worked with were scared to leave their homes.

“We've had other people worried about their wellbeing, people not being able to travel to their appointments with their lawyers or getting advice from us and others," he told the BBC.

“This should all be avoided and that shouldn't happen, people should not feel that insecurity and fear in our streets."


Gill Whitehead, Ofcom’s group director for online safety, wrote: “In the context of recent acts of violence in the UK, Ofcom has been engaging with various online services to discuss their actions to prevent their platforms from being used to stir up hatred, provoke violence and commit other offences under UK law.

“Under Ofcom’s regulations that pre-date the Online Safety Act, UK-based video-sharing platforms must protect their users from videos likely to incite violence or hatred. We therefore expect video-sharing platforms to ensure their systems and processes are effective in anticipating and responding to the potential spread of harmful video material stemming from the recent events.”

Whitehead went on: “Additionally, as you will be aware, the Online Safety Act sets out new responsibilities for online services around how they assess and mitigate the risks of illegal activity, which can include content involving hatred, disorder, provoking violence or certain instances of disinformation.


“When we publish our final codes of practice and guidance, later this year, regulated services will have three months to assess the risk of illegal content on their platforms, and will then be required to take appropriate steps to stop it appearing, and act quickly to remove it when they become aware of it.

“Some of the most widely-used online sites and apps will in due course need to go even further – by consistently applying their terms of service, which often include banning things like hate speech, inciting violence, and harmful disinformation.”

Whitehead said that although the new safety duties under the Online Safety Act will not be in place for a few months, social media firms “can act now – there is no need to wait to make your sites and apps safer for users”.

How will the Online Safety Act change things?

The new laws will, for the first time, make firms legally responsible for keeping users, and in particular children, safe when they use their services.

Overseen by Ofcom, the new laws will not specifically focus on the regulator removing pieces of content itself, but it will require platforms to put in place clear and proportionate safety measures to prevent illegal and other harmful content from appearing and spreading on their sites.

Crucially, clear penalties will be in place for those who do not comply with the rules.

Ofcom will have the power to fine companies up to £18 million or 10% of their global revenue, whichever is greater – meaning potentially billions of pounds for the largest platforms.

In more severe cases, Ofcom will be able to seek a court order imposing business disruption measures, which could include forcing internet service providers to limit access to the platform in question.

And most strikingly, senior managers can be held criminally liable for failing to comply with Ofcom in some instances.

The intervention from the media watchdog comes as parts of England and Northern Ireland brace for as many as 100 events connected with far-right disorder on Wednesday.


Twenty people were charged overnight, the Crown Prosecution Service said, bringing the total to more than 140, as police chiefs continued to warn rioters they could “expect a knock at the door”.

The National Police Chiefs’ Council (NPCC) said it expected that number to rise “significantly” in the coming days as officers looked to make further arrests.

Police are understood to be preparing to respond to more than 100 planned protests and potentially around 30 more counter protests on Wednesday, with gatherings anticipated in 41 of the 43 police force areas in England and Wales.

A police source said: “Today is probably going to be the busiest day of the week, into the evening.

“Tonight, we think it’s looking like a credible picture. We are preparing for activity across 41 forces.”