Taming the Wild West? The Online Safety Bill
Which billionaire recently tweeted the following?

“This is a battle for the future of civilization. If free speech is lost even in America, tyranny is all that lies ahead.”

No prizes for guessing the answer – Elon Musk. For those who are unaware, Musk, following his acquisition of Twitter, promised to end Twitter’s ‘free speech suppression’ and stated that only tweets which constitute ‘sticks and stones’, i.e. threats of violence, which are generally illegal, should be removed from the platform. Opponents of Musk’s proposals argue that he is leaving the door open to hate, harassment and ‘dangerous language’, exposing vulnerable communities to harm as a result.

We may think that this debate is limited to keyboard warriors and the Twitterati but the truth is that the debate is rather closer to home. The Online Safety Bill (OSB), a ‘world-leading’ innovative bill to its backers and the censor’s charter to its critics, is currently working its way through the UK Parliament with promises from the last Culture Secretary that the OSB will ‘force social media companies to take responsibility for the toxic abuse that floods their platforms.’

What is the Online Safety Bill?

With the latest draft coming in at 228 pages, 12 parts and 197 sections, the OSB is a heavyweight piece of legislation, targeting social media, messaging and search platforms. It aims to combat illegal and harmful online content and implement further measures to protect children. How so?

At present, there is a legal distinction between publishers, such as newspapers, that create content and can therefore be held accountable for it, and online platforms, such as Twitter and Instagram, that merely host user-generated content. The latter are currently under no obligation to remove anything other than illegal content, such as terrorist material, or content that infringes someone else’s intellectual property rights.

The OSB will introduce a new statutory duty of care on service providers to “take or use proportionate measures to effectively mitigate and manage the risks of harm to individuals”. This will take the form of a ‘triple shield’ whereby platforms must: 1) remove illegal material; 2) remove material that violates their terms and conditions; and 3) allow users greater choice to limit the type of material they are exposed to. Ofcom will be tasked with overseeing a platform’s compliance with the OSB.

‘Legal but Harmful’

Previous drafts of the OSB also included requirements to moderate ‘legal but harmful’ content. These provisions were heavily criticised. ‘Legal but harmful’ is vague and subjective, and the term ‘harm’ is in fact defined circularly in the OSB as “physical or psychological harm”. Such a loosely defined obligation could lead to widespread censorship by platforms seeking to avoid liability, thereby significantly chilling free speech.

Accordingly, on 28 November 2022 the government confirmed that the ‘legal but harmful’ provisions had been removed (although similar measures remain in place to protect children), and replaced by the triple shield described above.

What Steps will Tech Platforms Have to Take?

Whilst the removal of ‘legal but harmful’ offers some consolation to platforms, the OSB will nevertheless impose significant requirements.

First, the third aspect of the triple shield will require platforms to introduce new digital tools giving users much wider choice as to the type of content they see. This will also require greater use of algorithms to categorise uploaded content.

Next, the obligation to increase moderation means platforms will need more moderators, an area in which many platforms are already understaffed.

In all likelihood, however, the immense quantity of content uploaded (e.g. 500,000 comments are made on Facebook every minute) means that tens of millions of moderators would be needed. Accordingly, platforms will have to invest more in algorithms to evaluate whether content is illegal or potentially harmful to children. Of course, the vagueness and subjectivity of the term ‘harmful’ and the inherent inaccuracy of algorithms mean that platforms will likely err on the side of caution and either ‘age-gate’ much of their content or program algorithms simply to cut out huge swathes of content for all. In fact, this ‘shoot first, ask questions later’ approach is already employed by YouTube in relation to IP infringement; YouTube uses software that removes any video remotely associated with copyrighted content, irrespective of whether there is an actual IP infringement.

The OSB will also create a headache for end-to-end encrypted messaging services such as WhatsApp. These services will be obliged to scan private messages for illegal and harmful content, requiring them to weaken their encryption and access users’ private messages.

It is also important to note that non-compliance is an unwise option, as the OSB cannot be accused of lacking teeth. Platforms that fail to comply could face criminal sanctions, regulatory fines of up to 10% of revenue, and even the blocking of their sites in the UK.

What Next?

The OSB faces some serious criticisms. Some say it will curb free speech, stifle innovation, undermine encryption and data privacy, and impose huge moderation costs which only the tech giants can bear, thereby acting as a market barrier to smaller firms, reducing competition and consumer choice. Furthermore, many question why the government is taking on the job of protecting children from harmful content, rather than leaving it to their parents.

On the other hand, the Samaritans, a charity that provides support for those vulnerable to suicide, has criticised the government’s removal of the ‘legal but harmful’ provisions as failing vulnerable adults and children in the midst of a mental health crisis.

Is social media a Wild West in need of taming by the OSB or a free market of speech and expression, as envisioned by the internet’s early founders? Will we eventually see a heavyweight clash between the OSB and platforms like Twitter which prioritise free speech?

Putting the moral debate aside, the OSB is a significant piece of regulation and platforms large and small must begin to prepare or face the consequences.

At Asserson, our data privacy and technology team can help your business prepare for the OSB and advise on the measures required to comply and avoid the sanctions. For more information, please be in touch at
