With the rise of social media comes the need for legislation that keeps pace with emerging risks and protects individuals who use the services provided by social media companies.

The Online Safety Act 2023 brought in a range of measures with the aim of improving online safety in the UK. Criminal offences introduced by the Act include encouraging or assisting serious self-harm, cyberflashing, and intimate image abuse.

The Act’s provisions were intended to hold tech companies accountable for the content on their sites; however, they largely applied to individuals sending threatening messages.

The Online Safety Act 2023 (Priority Offences) (Amendment) Regulations 2024 represent a serious crackdown on intimate image abuse: sharing intimate images without consent will become a ‘priority offence’, and social media companies will have to take proactive steps to remove such material and prevent it from appearing on their platforms.

In speaking about intimate image abuse online, Technology Secretary Peter Kyle outlined that: “As well as being devastating for victims these crimes have also contributed to the creation of a misogynistic culture on social media that can spread into potentially dangerous relationships offline. We must tackle these crimes from every angle, including their origins online, ensuring tech companies step up and play their part.”

This is why social media firms will face extra legal obligations which are, as Peter Kyle outlines further, “backed up by big fines”. To put an approximate figure on this, companies that fail to comply with their duties may face a fine of up to 10% of their qualifying worldwide revenue.

The pressure on social media companies will grow quickly from spring next year, when the protections come into force, as intimate image offences will be treated as priority offences under the Act, putting them on the same footing as the sale of weapons and drugs online.

Victims Minister Alex Davies-Jones said that the amendments send “a clear message to those companies who turn a blind eye to such heinous content on their platforms – remove it without delay or face the full force of the law”. This signals the force with which the amendments will be applied and serves as a reminder that negligently overlooking such content on a social media platform will not be tolerated.

This amendment should protect many thousands of social media users, particularly women and girls, from the harm caused by this type of offence. It is foreseeable, though, that large social media companies will need to devote much of their focus to assessing their algorithms and systems in order to protect potential victims and avoid fines that would mean a significant loss of revenue for their business.

As ever, the line between personal responsibility and possible vicarious liability resting with an employer or controlling organisation may, in due course, come to be considered where intimate images have been shared, in the same way that compensation for sexual abuse has become the responsibility of organisations which would never have condoned such behaviour. Organisations should ensure that they have appropriate safeguarding policies in place, including proper consideration of social media use, relationships and behaviours.