Ofcom publishes industry guidance on effective age checks

January 16, 2025

Children will be prevented from encountering adult content and protected from other types of harmful content under Ofcom’s new industry guidance, which sets out how it expects sites and apps to introduce effective age assurance.

The guidance marks the next step in Ofcom’s implementation of the Online Safety Act and its work to create a safer life online for people in the UK, particularly children. It follows tough industry standards, announced in December, to tackle illegal content online, and comes ahead of broader protection-of-children measures due to launch in the spring.

Robust age checks are a cornerstone of the Online Safety Act. It requires services which allow pornography or certain other types of harmful content to introduce ‘age assurance’ to ensure that children are not normally able to encounter it. Age assurance methods – which include age verification, age estimation or a combination of both – must be ‘highly effective’ at correctly determining whether a particular user is a child.

Ofcom has now published industry guidance on how it expects age assurance to be implemented in practice for it to be considered highly effective. Ofcom said: “Our approach is designed to be flexible, tech-neutral and future-proof. It also allows space for innovation in age assurance, which represents an important part of a wider safety tech sector where the UK is a global leader. We expect the approach to be applied consistently across all parts of the online safety regime over time.”

The approach also takes care to ensure that privacy rights are protected and that adults can still access legal pornography. As platforms take action to introduce age assurance over the next six months, adults will start to notice changes in how they access certain online services.

What are online services required to do, and by when?

The Online Safety Act divides online services into different categories with distinct routes to implement age checks. However, the action Ofcom expects all of them to take starts immediately:

  • Requirement to carry out a children’s access assessment. All user-to-user and search services in scope of the Act must carry out a children’s access assessment to establish whether their service – or part of it – is likely to be accessed by children. These services have three months to complete their children’s access assessments, in line with Ofcom’s guidance, with a final deadline of April 16. Unless they are already using highly effective age assurance and can evidence this, Ofcom anticipates that most of these services will need to conclude that they are likely to be accessed by children within the meaning of the Act. Services that fall into this category must comply with the children’s risk assessment duties and the children’s safety duties.
  • Measures to protect children on social media and other user-to-user services. Ofcom will publish its Protection of Children Codes and children’s risk assessment guidance in April. This means that services likely to be accessed by children will need to conduct a children’s risk assessment by July – that is, within three months. Following this, they will need to implement measures, in line with the Protection of Children Codes, to protect children on their services and address the risks of harm identified. These measures may include introducing age checks to determine which of their users are under 18 and to protect them from harmful content.
  • Services that allow pornography must introduce processes to check the age of users. All services which allow pornography must have highly effective age assurance processes in place by July at the latest to protect children from encountering it. The Act imposes different deadlines on different types of providers. Services that publish their own pornographic content, including certain Generative AI tools, must begin taking steps immediately to introduce robust age checks, in line with published guidance. Services that allow user-generated pornographic content must have fully implemented age checks by July.

What does highly effective age assurance mean?

Ofcom’s approach to highly effective age assurance, and how it expects it to be implemented in practice, applies consistently across three pieces of industry guidance. In summary, the guidance:

  • confirms that any age-checking methods deployed by services must be technically accurate, robust, reliable and fair in order to be considered highly effective;
  • sets out a non-exhaustive list of methods. They include: open banking, photo ID matching, facial age estimation, mobile network operator age checks, credit card checks, digital identity services and email-based age estimation;
  • confirms that methods including self-declaration of age, and online payments which do not require a person to be 18, are not highly effective;
  • stipulates that pornographic content must not be visible to users before, or during, the process of completing an age check. Nor should services host or permit content that directs or encourages users to attempt to circumvent an age assurance process; and
  • sets expectations that sites and apps consider the interests of all users when implementing age assurance – affording strong protection to children, while taking care that privacy rights are respected and adults can still access legal pornography.

Melanie Dawes, Ofcom’s Chief Executive, commented: “For too long, many online services which allow porn and other harmful material have ignored the fact that children are accessing their services. Either they don’t ask or, when they do, the checks are minimal and easy to avoid. That means companies have effectively been treating all users as if they’re adults, leaving children potentially exposed to porn and other types of harmful content. Today, this starts to change.”

“As age checks start to roll out in the coming months, adults will start to notice a difference in how they access certain online services. Services which host their own pornography must start to introduce age checks immediately, while other user-to-user services – including social media – which allow pornography and certain other types of content harmful to children will have to follow suit by July at the latest. We’ll be monitoring the response from industry closely. Those companies that fail to meet these new requirements can expect to face enforcement action from Ofcom,” she concluded.
