
How the Online Safety Act 2023 shapes the future of online services

Naomi Redman, associate in the corporate and commercial team at Russell-Cooke Solicitors.

In the wake of the Online Safety Act 2023 (OSA), associate Naomi Redman examines the origin and implications of the legislation, emphasising the duty of care placed on online service providers. She delves into the regulatory framework led by the Office of Communications (Ofcom) and highlights the Act's impact on various online businesses, shedding light on the criteria for categorising user-to-user and search services.

Where does the Online Safety Act (OSA) come from?


The Online Safety Act 2023 (OSA) became law on 26 October 2023. The OSA evolved from the Internet Safety Strategy Green Paper consultation (2017), which identified key categories of harm that can occur online, including abuse, cyberbullying and trolling, as well as key areas that would support the internet becoming a safer place, particularly for children. 

What is the OSA? 

The OSA establishes a clear duty of care on regulated providers of online services, requiring them to take active steps to prevent the proliferation of harmful (and illegal) behaviour and content online. Whilst the OSA is comparable to the EU Digital Services Act (EU DSA), it is important to note that the UK's OSA imposes higher threshold requirements in certain areas; for example, it requires service providers to carry out detailed risk assessments.

How is the OSA regulated? 

The Office of Communications (Ofcom) has been appointed as the online safety regulator, giving it oversight of online service providers and the power to ensure adherence to the OSA. Ofcom will be able to investigate and request information from service providers, apply to court to have a particular service withdrawn or otherwise restricted, and issue fines of up to £18m or 10% of a company's global revenue, whichever is greater (fines must, however, be proportionate). Company directors who fail to meet the required standards could also face time in prison.

Ofcom’s powers under the Act are not limited to service providers based in the UK. The extra-territorial scope means that providers may be caught if, amongst other criteria, they provide services to a significant number of UK users or their services are capable of being used by UK users. Ofcom is in the process of creating key codes of practice and guidance to support online service providers in ensuring their compliance with the OSA. In particular, Ofcom highlights the following core outcomes intended to make people in the UK safer online: 

  • stronger safety governance in online firms
  • online services designed and operated with safety in mind
  • choice for users so they can have meaningful control over their online experiences
  • transparency regarding the safety measures services use, and the action Ofcom is taking to improve them, in order to build trust
  • 0.3-0.4% — UK businesses potentially affected by the new Online Safety Act
  • £18m — up to £18m or 10% of global revenue in fines for businesses that fail to comply

Which businesses are targeted by the OSA?

According to the Government's Impact Assessment, it is expected that approximately 0.3-0.4% of all UK businesses may be affected by the new regulation (equivalent to 25,050 platforms). It is thought that Ofcom will initially be targeting a number of Big Tech companies. Whilst further clarification is awaited (as set out below), it is expected that many small and mid-size companies will also be caught within the regulatory net. The OSA groups online service providers into ‘user-to-user services’ and ‘search services’. 

User-to-user services 

User-to-user services are defined as any online service where people generate and share content for others to view (such as TikTok, X (formerly Twitter) and Instagram). 

These services include social media services, private messaging, online marketplaces, blogging platforms, file and audio-sharing services, gaming sites, amongst others. 

Regulated user-to-user service providers must (1) identify key risks and (2) take steps to protect users from illegal content.

Search services

A search service is any online service with a search engine function, allowing users to search multiple websites or databases for information. Search services do not include SMS or email services, as these are not (currently) within the scope of the OSA. 

Whilst all online service providers will have to meet general requirements under the OSA, there are additional requirements for service providers falling within certain categories. The current categories will be further defined and established in secondary legislation, but at present consist of the following:

Category 1: This includes user-to-user services with the highest risk (for example, global social media platforms).

Category 2A: This includes the largest search engines with a wide reach.

Category 2B: This includes any services not directly within Category 1 or Category 2A that, for other reasons, may be considered high risk. 

As part of its timeline for implementing the OSA, Ofcom has outlined three key phases with overlapping timelines, as set out below. 

Ofcom’s timeline for implementing the OSA

9 November 2023: Key guidance published 

  • support for services in performing risk assessments
  • draft codes of practice highlighting key steps of mitigation
  • draft enforcement guidelines

December 2023: Part One 

Pornography services and other related services provide responses relating to draft guidance on age assurance. This will be relevant to all services in scope of Part 5 of the Online Safety Act.

Spring 2024: Part Two 

Other regulated services and stakeholders able to respond to draft codes of practice relating to child protection.

Spring 2024: (Category 1, 2A or 2B services only)  

Call for evidence on guidance 

Summer 2024: (Category 1, 2A or 2B services only) 

Consultation on draft transparency guidance 

Autumn 2024: 

  • final decisions expected following consultation

Spring 2025: 

Draft guidance on protecting women and girls published, after the codes of practice for the protection of children are finalised. Draft proposals published regarding additional duties on categorised services.

Summer 2025: 

Issue transparency notices

Demonstrating compliance with the OSA

The OSA itself sets out many of the obligations in general terms, and the detailed implementation of the OSA will be subject to further codes of practice and guidance issued by Ofcom. However, it is expected that the majority of online service providers will need to demonstrate their compliance with the rules by: 

  • conducting risk assessments to protect against illegal content
  • assessing the likelihood and risks of exposure of harmful content to children
  • explaining and demonstrating what protection measures are in place to safeguard users
  • providing a clear and user-friendly method of reporting any content that is illegal and/or harmful to children
  • balancing and showing consideration for freedom of expression and the right to privacy when complying with the OSA

Some businesses will also be required to give users greater control over what content they view online. 

Why is the OSA important to you?

There are ongoing challenges with the OSA, for example navigating the line between upholding online safety and preserving user privacy. Encrypted messaging applications could face compliance challenges with, for example, section 121 of the OSA, under which a regulated service provider could receive a notice requiring it to use ‘accredited technology’ to identify certain categories of content. Understandably, there has been push-back against this, and it remains to be seen exactly how Ofcom will implement it. 

How can we help?

The corporate and commercial team has expertise to support you in understanding what the implications of the OSA may be for your business (for example, producing risk assessments, content moderation, due diligence processes on advertisers and producing privacy impact assessments).

Naomi Redman is an associate in the corporate and commercial team, advising on a range of commercial matters and assisting small and mid-size companies to ensure that their commercial agreements and corporate governance processes are in order.

Get in touch

If you would like to speak with a member of the team you can contact our corporate and commercial solicitors by email, by telephone on +44 (0)20 3826 7511 or complete our enquiry form below.
