New Rules for Online Platforms: What You Need to Know

12 Feb

The UK’s Online Safety Act 2023 is finally in motion. On 16 December 2024, Ofcom, the UK’s online safety regulator, released its first Illegal Harms Code, officially putting online platforms on notice.

This is a significant shift. Until now, platforms had only to comply with general criminal law and e-commerce regulations. Under the new Code, social media platforms, search engines, messaging apps, gaming platforms, dating sites, and even file-sharing services must actively prevent and remove illegal content.

What does this mean for online platforms?

For the first time, online service providers will be legally required to protect their users from illegal harm on their platforms. This includes:

🔹 Identifying and removing illegal content (hate speech, terrorism, cyberflashing, online grooming, intimate image abuse, fraud, and more).
🔹 Conducting risk assessments to understand potential harms on their platforms.
🔹 Improving content moderation by ensuring they have trained teams in place.
🔹 Providing clear reporting and complaint systems for users.
🔹 Making it harder for illegal content to spread by restricting visibility (see the sketch after this list).
🔹 Protecting children by limiting profile visibility and access to harmful content.
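
To make the visibility-restriction duty concrete, here is a minimal Python sketch of an upload check that hides likely-illegal content pending human review. It is purely illustrative: neither the Act nor the Code prescribes any particular implementation, and every name in it (Post, score_illegal_content, moderate_on_upload, the 0.8 threshold) is a hypothetical stand-in for a platform’s real moderation stack.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    text: str
    visible: bool = True
    flagged_for_review: bool = False

def score_illegal_content(text: str) -> float:
    """Placeholder classifier returning a risk score in [0, 1].

    A real platform might combine hash-matching against known illegal
    material, keyword heuristics and ML classifiers; this stub only
    checks a trivial keyword list so the example runs on its own.
    """
    suspicious = ("sell fake passports", "attack plan")
    return 1.0 if any(k in text.lower() for k in suspicious) else 0.0

def moderate_on_upload(post: Post, threshold: float = 0.8) -> Post:
    """Hide likely-illegal content pending review by a trained moderator."""
    if score_illegal_content(post.text) >= threshold:
        post.visible = False            # make it harder for the content to spread
        post.flagged_for_review = True  # route to the trained moderation team
    return post

post = moderate_on_upload(Post("p1", "How to sell fake passports cheaply"))
print(post.visible, post.flagged_for_review)  # False True
```

In practice the placeholder classifier would be replaced by real detection tooling, but the overall shape (score, restrict visibility, escalate to a trained reviewer) stays the same.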

Key deadlines – save the dates:

📅 16 March 2025: platforms must complete a risk assessment to identify potential harms.
📅 17 March 2025: platforms must start implementing safety measures to address those risks (subject to parliamentary approval).

Who needs to comply?

If your platform has links to the UK (a significant number of UK users, or the UK as a target market) and falls into one of these categories, the Act applies to you:

🔹 Social media platforms
🔹 Search engines
🔹 User-to-user services (where people share content)
🔹 Pornographic content providers

What does compliance look like?

Platforms must follow a structured risk assessment process (illustrated in the code sketch after these steps):

1️⃣ Identify the types of illegal content that could appear.
2️⃣ Assess how likely users are to encounter this content.
3️⃣ Implement safety measures to prevent harm.
4️⃣ Review and update policies regularly.
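
To make these four steps concrete, the sketch below models them as a simple risk register in Python. This is our own illustration under stated assumptions: Ofcom prescribes the outcome of a risk assessment, not its data structures, and names such as RiskEntry and Likelihood are invented for the example.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class Likelihood(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3

@dataclass
class RiskEntry:
    harm: str               # step 1: type of illegal content that could appear
    likelihood: Likelihood  # step 2: how likely users are to encounter it
    mitigations: list[str]  # step 3: safety measures in place to prevent harm
    next_review: date       # step 4: when this entry is due for re-assessment

register = [
    RiskEntry("fraud (fake listings)", Likelihood.HIGH,
              ["seller verification", "user reporting flow"],
              date(2025, 9, 16)),
    RiskEntry("online grooming", Likelihood.MEDIUM,
              ["default-private child profiles", "stranger-DM restrictions"],
              date(2025, 6, 16)),
]

# Review the highest-likelihood harms first so mitigation work is prioritised.
for entry in sorted(register, key=lambda e: e.likelihood.value, reverse=True):
    print(f"{entry.harm}: {entry.likelihood.name}, next review {entry.next_review}")
```

Keeping a register like this current, with scheduled review dates, is one practical way to evidence step 4.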

Recommended safety measures

Platforms will need stronger safeguards to comply with the new rules. These include:

🔸 Accountability: appointing a senior executive responsible for compliance.
🔸 Content Moderation: implementing proactive monitoring and swift removal of illegal content.
🔸 Reporting & Complaints: setting up a clear and accessible complaint system for users (see the sketch after this list).
🔸 Transparency: clearly outlining in terms of service how illegal content is handled.
🔸 User Protections: removing accounts linked to terrorist organisations.
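
As one illustration of the Reporting & Complaints measure, the Python sketch below models a minimal complaint flow: a user files a report, it is tracked, and the outcome is recorded so the complainant can be told what happened. Every name here (Report, submit_report, resolve_report) is hypothetical; a real system would add authentication, audit logging, and appeal routes.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
import itertools

_ids = itertools.count(1)
REPORTS = {}  # report_id -> Report

@dataclass
class Report:
    report_id: int
    content_id: str
    reason: str           # e.g. "fraud", "terrorism", "intimate image abuse"
    received: datetime
    status: str = "open"  # open -> actioned / dismissed

def submit_report(content_id: str, reason: str) -> int:
    """The clear, accessible entry point users need to flag illegal content."""
    report = Report(next(_ids), content_id, reason, datetime.now(timezone.utc))
    REPORTS[report.report_id] = report
    return report.report_id

def resolve_report(report_id: int, outcome: str) -> None:
    """Record the moderation outcome so the complainant can be told the result."""
    REPORTS[report_id].status = outcome

rid = submit_report("post-42", "fraud")
resolve_report(rid, "actioned: content removed")
print(REPORTS[rid].status)
```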

The specific measures required will depend on the platform’s size, risk level, and services provided.

Why this matters

This is just the beginning. Ofcom’s Illegal Harms Code is the first in a series of regulations rolling out in 2025, shaping the future of online safety in the UK.

For online service providers, the message is clear: act now. Conduct risk assessments, review your safety policies, and prepare for compliance, or risk serious penalties: fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.

The digital landscape is changing. Is your platform ready?

Get in touch with our commercial team for more information.