The Online Safety Bill is to be amended: if the amendments are accepted, measures that would have forced social media sites to take down material designated “legal but harmful” will be removed. Content that is harmful to adults is defined as content with “a material risk of having, or indirectly having, a significant adverse physical or psychological impact”. Examples of “legal but harmful” content include misogyny and the glorification of eating disorders.

The “legal but harmful” clause drew criticism from free speech campaigners, who claimed that governments or tech platforms could use the bill to censor certain content. As a result, the key requirements of the bill are being redefined.

Instead of the “legal but harmful” duties, it is proposed that firms will face a greater requirement to provide adults with tools to hide content they do not wish to see, including content that does not meet the criminal threshold but could nevertheless be harmful.

The Government is calling this approach a “triple shield” of online protection that also allows for freedom of speech. Under the bill, social media companies could also be fined by Ofcom (the new regulator for the tech sector) up to 10% of annual turnover if they fail to enforce their policies to tackle racist or homophobic content on their platforms.

Updates to strengthen accountability and transparency, intended to boost child online safety, will also be introduced. These will require tech firms to:

  • publish summaries of risk assessments regarding potential harm to children on their sites
  • show how they enforce user age limits
  • publish details of enforcement action taken against them by Ofcom

In addition, the Victims' Commissioner, Domestic Abuse Commissioner and Children’s Commissioner will be added as statutory consultees to the bill, meaning that Ofcom must consult them when drafting new codes of conduct which tech firms must follow in order to comply with the bill.

Children’s Commissioner for England, Dame Rachel de Souza, said this would ensure “children’s views and experiences are fully understood”.

Julie Bentley, chief executive of Samaritans, described dropping the requirement to remove “legal but harmful” content as “a hugely backward step”. She said, “Of course, children should have the strongest protection but the damaging impact that this type of content has doesn’t end on your 18th birthday. Increasing the controls that people have is no replacement for holding sites to account through the law and this feels very much like the Government snatching defeat from the jaws of victory.”

Shadow culture secretary Lucy Powell said, “Replacing the prevention of harm with an emphasis on free speech undermines the very purpose of this Bill, and will embolden abusers, Covid deniers, hoaxers, who will feel encouraged to thrive online.”

The latest changes come in the wake of other updates to the bill, including criminalising the encouragement of self-harm, “downblousing”, and the sharing of pornographic deepfakes (see previous blog).

The Online Safety Bill is due to return to Parliament in the week commencing 5 December 2022.