Rethinking Online Safety: Children, Pornography, and the Online Safety Act 2023

Abstract
This article explores how the UK’s Online Safety Act 2023 came into being, with a particular focus on its measures designed to prevent children from viewing online pornography. It reviews the role of Ofcom, the UK communications regulator, in enforcing the Act, and raises concerns about how the legislation prioritizes shielding children from pornography over protecting them from potentially more harmful content—such as materials that promote suicide, eating disorders, cyberbullying, and disinformation. Ultimately, the article argues that this narrow focus may neglect the broader digital rights of children as outlined in the UN Convention on the Rights of the Child, potentially undermining the benefits that the online world can offer.


Introduction

On October 26, 2023, the Online Safety Act 2023 received Royal Assent—a piece of legislation the UK government described as “world-leading.” Spearheaded by the Department for Science, Innovation and Technology, the Home Office, and the Ministry of Justice, the Act was presented as a way to make the internet safer for both children and adults. A major area of concern was online pornography and its alleged harms to young users.

This article critically examines how the Act addresses the issue of online pornography and its effects on children. It questions whether the focus on this type of content is truly in line with children’s best interests, especially when other types of dangerous material receive comparatively less attention. These include content promoting suicide, self-harm, bullying, and misinformation. The article also considers how the Act aligns—or fails to align—with international child rights frameworks, especially the UN Convention on the Rights of the Child (UNCRC), to which the UK is a signatory.


The Background: From Online Harms to Legislation

The roots of the Online Safety Act can be traced back to the Online Harms White Paper published in April 2019. This document laid out a vision for tackling the growing concerns over harmful content on the internet. The White Paper emphasized that the UK should take the lead in developing a regulatory model that would hold tech companies accountable for keeping users—especially children—safe.

The proposed framework included a new statutory “duty of care,” requiring internet platforms to protect users from harmful material. Significantly, the legislation would not only address illegal content but also legal material considered harmful. The wide range of online harms listed in the White Paper was divided into three categories:

  1. Clearly Defined Harms (e.g. child sexual exploitation, terrorism, revenge porn, hate crimes, cyberstalking, extreme pornography, and sexting by minors).
  2. Less Clearly Defined Harms (e.g. cyberbullying, extremism, coercion, disinformation, and promotion of self-harm).
  3. Legal but Age-Inappropriate Content (e.g. children accessing pornography, using social media below age thresholds, or spending excessive time online).

To enforce this framework, an independent regulator—eventually identified as Ofcom—would be given the power to set out codes of practice, monitor compliance, issue fines, and even hold senior executives accountable for violations.


The Emphasis on Pornography: A Critical View

While the Act addresses a broad range of harms, this article argues that official discourse and implementation plans place disproportionate emphasis on protecting children from pornography. The Act’s framing suggests that viewing pornographic material poses one of the most significant online risks to children, potentially overshadowing other, arguably more damaging content.

Other harms—cyberbullying, the promotion of suicide or disordered eating, and misinformation—pose serious risks to children’s mental health, self-esteem, and safety. However, these do not appear to carry the same regulatory weight or public attention as pornography under the new law.

This narrow focus raises questions about the rationale behind the Act’s priorities. Is the concern truly about children’s well-being, or is it shaped by cultural anxieties and moral panic about sexuality and youth?


Ofcom’s Role in Enforcement

As the regulator, Ofcom is tasked with enforcing the Online Safety Act across in-scope online services. It will develop and publish codes of practice to guide online platforms in complying with the law. Ofcom has also been granted the authority to conduct investigations, impose hefty fines, and even pursue criminal penalties against senior managers of non-compliant companies.

The Act requires platforms to implement robust age verification systems to prevent children from accessing pornographic content. However, critics have pointed out that such systems could undermine user privacy and freedom of expression, especially when improperly managed. For example, requiring users to submit personal identification data may deter adults from accessing legal content or expressing themselves freely online.


Children’s Rights and the UNCRC

The Online Safety Act positions itself as a child protection tool, but does it live up to the standards of the UNCRC? This international treaty emphasizes not only children’s right to be protected from harm, but also their rights to freedom of expression, privacy, access to information, and participation in cultural and social life.

By focusing heavily on shielding children from the dangers of pornography, the Act may unintentionally limit their access to positive online experiences. For instance, online platforms can offer educational content, peer support, and creative expression—all crucial aspects of modern childhood.

Moreover, by emphasizing safety in isolation, the Act fails to take a holistic view of children’s online lives. Effective online protection should not come at the expense of digital literacy, empowerment, or meaningful participation in the digital world.


A Selective Approach to Child Safety?

Another criticism of the Act is its seemingly inconsistent concern for children’s well-being across different government policies. While the Online Safety Act zeroes in on digital threats like pornography, other areas of public policy—such as education funding, child mental health services, or poverty—may reflect less urgency or investment.

This selectiveness suggests a politically convenient narrative, where controlling what children see online becomes a proxy for broader social and moral anxieties, rather than a consistent commitment to children’s welfare.


Conclusion

The Online Safety Act 2023 marks a significant step in the UK’s attempt to regulate the digital space. Its intention to protect children from harm is important and commendable. However, the Act’s disproportionate emphasis on shielding young people from pornography risks overlooking more urgent online threats and marginalizing children’s broader rights.

Rather than focusing solely on what children should be protected from, future digital policies should also consider what children deserve access to—supportive communities, quality information, opportunities for creativity, and meaningful participation in online life. Only then can legislation truly reflect the spirit of the UN Convention on the Rights of the Child.