The Online Safety Act – A Practical Introduction

Background

For many years, the Government has made clear its aim for the UK to be the safest place in the world to be online. The Online Safety Act (“OSA”) is how the Government intends to achieve that aim – tackling illegal content, ensuring content is age-appropriate for children, providing verification and blocking tools to give people more control over what they see, and so on. All of this is with the aim of increasing online safety – especially for children.

All of this has to be balanced against protecting users’ fundamental rights – freedom of expression and privacy, for example.

Ofcom has been appointed the regulator for this legislation.

Absolute obligations?

Before getting into the detail, it is worth noting that the OSA doesn’t require businesses to prevent all illegal content; those behind the new law recognised that this wouldn’t be realistic. The focus is instead on businesses taking a proportionate, risk-based approach, appropriate to the business and platform involved.

The OSA is also just a headline – there will be follow-up legislation, under the OSA banner, which will set out specific requirements and allow the regime to keep pace with inevitable technological change. That all points towards safety by design – pre-emptively preventing harmful content from reaching a platform, rather than relying solely on retroactively moderating content and taking it down.

Who is in scope?

The OSA targets three types of services:

  • User-to-user services, i.e. platforms that allow users to generate, upload or share content for other users to see – think Facebook or Twitter, online marketplaces, or dating apps.
  • Search services, i.e. search engines or services that allow users to search multiple websites and databases – Google is the prime example.
  • Services publishing or displaying certain kinds of pornographic content.

When guidance is released, these broad banners will be broken down into a further three high-risk categories, dealing with the services that carry the highest risk and reach. The categories are likely to depend on the size and reach of the user base, what each platform can do, and the potential for harm from the service. A rough Government estimate is that 30 to 40 businesses/platforms will fall within these high-risk categories, whilst everything else will fall outside them.

The OSA only applies to platforms with a UK interest – UK users, the UK being a target market, or the platform posing a risk to UK users. That means the OSA can also apply to businesses based outside the UK.

There are some limited exemptions (such as news publishers), with more detail to be released in the guidance.

Immediate considerations and action

Businesses will be expected to assess risk by undertaking illegal harms risk assessments. Ofcom will release guidance on this later in 2024, and businesses will have three months from the guidance being issued to complete their first assessments. We do know, however, that assessments will follow a four-step process:

  • Understand the harms.
  • Assess the risk of harm.
  • Decide on measures to prevent the harm, implement them and record them.
  • Review, report and update the protections in place.

In performing the assessment, businesses will need to take into account factors such as the demographics of the user base, the functionality of the platform, any algorithmic systems, how users interact with the platform, and the results of any testing.

Bottom line – what does your platform need to be able to do when the time comes?

More detail will follow in the next few months, but in-scope businesses will need to ensure the following:

  • Illegal content is not made available to users.
  • If illegal content does go live, it is not live for long – it is taken down quickly.
  • There are easy mechanisms for users to report illegal and harmful content.
  • There are easy-to-use and responsive complaints procedures.
  • An access assessment is performed, with a particular focus on children.
  • Children are prevented from accessing pornographic content.
  • Appropriate systems are in place to prevent fraudulent advertising.
  • Terms of service are accessible and clear, and are enforced where appropriate (for high-risk categories).
  • Adult identity verification processes are in place (for high-risk categories).

There may also be fees payable to Ofcom to fund the OSA regime (in a similar way to the small annual fee businesses pay to the ICO under the UK GDPR regime).

What is the risk of not complying?

Ofcom, as the regulator, has investigatory powers, and can impose sanctions on responsible managers. The bigger business concern, however, is that Ofcom can issue fines of up to £18 million or 10% of global annual revenue, whichever is greater – so a business with global annual revenue of £500 million could, in principle, face a fine of up to £50 million.

What to do now?

To start with, know your platform. Do an initial assessment to establish whether the OSA applies to your business at all and, if it does, consider whether the business might be seen as falling within a high-risk category.

Based on that assessment, review your functionality, processes, systems and policies against the available OSA guidance, and check the following in respect of the platform in question:

  • Is there a risk of illegal harms?
  • Will children have access?
  • Are any safeguards already in place?
  • Are there reporting and complaints processes?
  • Are the terms of service up-to-date, and do they accurately reflect functionality?

As we said above, once the initial guidance comes out, timelines will be tight – so it is important to get a head start now.

For more information, contact Simon Weinberg or Deborah Tastiel from Asserson’s commercial team.
