The Online Safety Act 2023: A New Era for Digital Regulation or a Threat to Freedom of Expression in the UK?

Author: Tanmeet Sachdeva, University of Surrey

To the Point

The Online Safety Act 2023 (OSA) represents the United Kingdom’s most significant piece of digital regulation in recent years. It introduces legally binding obligations on online platforms to manage and mitigate harmful and illegal content. While the Act is framed as a response to real dangers such as child exploitation, terrorist content, and cyberbullying, its sweeping powers raise substantial concerns around freedom of expression, government censorship, and the disproportionate burden on digital service providers.

The Act creates a statutory duty of care for online platforms, including search engines and messaging apps, to identify, remove, and prevent the appearance of harmful material. It empowers Ofcom with unprecedented enforcement capabilities, including the ability to impose massive fines and initiate criminal proceedings against company executives. However, many critics, including human rights organisations and legal scholars, argue that this legislation could lead to over-compliance, suppression of lawful speech, and a chilling effect across the digital sphere.

Abstract

The Online Safety Act 2023 is a landmark digital regulation law enacted in response to growing concerns about harmful online content. It imposes new duties of care on tech companies, obliging them to take action against illegal and age-inappropriate content. Although the Act was drafted with good intentions, it has provoked significant debate among legal scholars, many of whom argue that it restricts the right to freedom of expression protected by Article 10 of the ECHR.

This article explores the key provisions, regulatory mechanisms, and potential consequences of the OSA. It analyses whether the Act respects fundamental human rights and legal principles such as proportionality, necessity, and legality. In doing so, it refers to pertinent case law, both domestic and international, that highlights the delicate balance between online safety and civil liberties.

Ultimately, this article argues that while the Act addresses genuine societal harms, its broad scope, ambiguous terminology, and heavy-handed enforcement mechanisms could set a dangerous precedent for government regulation of speech in the digital age.

Use of Legal Jargon

This article uses several legal terms central to understanding the legal challenges raised by the OSA:

Freedom of expression (Article 10 ECHR): The right to hold and express opinions without interference by public authority.

Proportionality test: A legal principle used by courts to ensure that restrictions on rights are justified, appropriate, and the least restrictive means available.

Duties of care: Legal obligations requiring companies to take reasonable steps to prevent foreseeable harm.

Prior restraint: Government action that prohibits speech or expression before it can take place.

Overbreadth doctrine: A legal doctrine where a law is invalidated for punishing not only unprotected conduct but also constitutionally protected expression.

Ministerial override: Clauses in the Act allowing the Secretary of State to direct Ofcom’s implementation of certain codes, raising separation of powers concerns.

The Proof

The Online Safety Act 2023 received Royal Assent on 26 October 2023, becoming one of the most complex and far-reaching pieces of legislation in the UK’s digital regulation history. It is primarily designed to implement a risk-based approach to online harm, focusing on prevention, detection, and removal. Ofcom can now create codes of practice, conduct audits, issue fines of up to £18 million or 10% of global annual turnover (whichever is greater), and, in extreme cases, initiate criminal proceedings against senior management for non-compliance.

Case Laws

1. Handyside v. United Kingdom (1976)

The ECtHR held that freedom of expression extends not only to information and ideas that are favourably received, but also to those that offend, shock, or disturb the state or any sector of the population.

Relevance: The OSA may compel platforms to remove speech that, while distasteful or controversial, is still protected under Article 10 ECHR.

2. R (Miller) v. College of Policing [2020] EWHC 225 (Admin)

This case ruled that police intervention over a lawful but offensive tweet constituted a disproportionate interference with freedom of expression.

Relevance: Reinforces the need to distinguish between genuinely harmful and merely offensive speech—something the OSA does not always do clearly.

3. R (Bridges) v. Chief Constable of South Wales Police [2020] EWCA Civ 1058

The Court of Appeal ruled that the use of facial recognition technology breached Article 8 ECHR due to lack of safeguards and clarity.

Relevance: This case supports the view that technology regulation must include strong legal standards, including proportionality and necessity—standards the OSA must meet.

4. Von Hannover v. Germany (2004) (the “Caroline von Hannover” case)

The ECtHR stressed that even public figures have a right to private life and protection from unwarranted media exposure.

Relevance: The OSA intersects with privacy rights, particularly when companies are compelled to undermine end-to-end encryption or reveal user information.

Conclusion

The Online Safety Act 2023 seeks to respond to pressing digital-age threats—particularly the prevalence of harmful and illegal content online. It attempts to create a safer online space, especially for children and vulnerable individuals, by holding large platforms to account. However, the law’s effectiveness will ultimately depend on how Ofcom implements its regulatory functions and how the courts interpret vague or broad provisions.

There are legitimate concerns that the Act may:

·  Encourage over-moderation of lawful speech

·  Place disproportionate compliance burdens on smaller platforms

·  Undermine encryption and privacy

·  Enable government overreach through ministerial directions

In balancing online safety and civil liberties, the UK must remain vigilant to prevent mission creep, where regulations exceed their original intent. The judiciary will likely play a central role in this legal balancing act. Transparency, judicial oversight, and careful monitoring of Ofcom’s powers will be essential to maintaining the rule of law and democratic accountability in the digital domain.

FAQ

Q1: What is the Online Safety Act 2023?

The Online Safety Act is UK legislation aimed at regulating internet platforms to ensure the removal of illegal content and the protection of children from harmful content. It imposes legal obligations on tech companies and grants enforcement powers to Ofcom.

Q2: Does the Act ban freedom of speech?

No, the Act does not explicitly ban free speech. However, it may result in platforms over-censoring content to avoid penalties, which could indirectly restrict lawful expression. Critics argue this poses a chilling effect, especially for marginalised or dissenting voices.

Q3: Are private messaging apps included?

Yes. The Act includes messaging services like WhatsApp and Signal under certain conditions. This raises controversy over end-to-end encryption, as obligations to scan for harmful content may undermine user privacy and security.

Q4: How does the Act affect businesses?

All in-scope services must:

·   Carry out risk assessments

·   Implement user reporting systems

·   Moderate content in line with their published terms

·   Submit to Ofcom’s audits and investigations

Non-compliance could lead to severe financial penalties or, in extreme cases, criminal prosecution of executives.

Q5: Is judicial review possible?

Yes. Actions taken by Ofcom under the Act are subject to judicial review, ensuring regulatory decisions adhere to the principles of proportionality, legality, and fairness. Courts may strike down measures that unduly infringe on fundamental rights.
