Ofcom: Adult sites failing to protect children
October 20, 2022
By Colin Mann
Smaller adult video-sharing sites based in the UK do not have sufficiently robust access control measures in place to stop children accessing pornography, regulator Ofcom has found in a new report.
Ahead of its future duties in the Online Safety Bill, Ofcom already has some powers to regulate video-sharing platforms (VSPs) established in the UK, which are required by law to take measures to protect people using their sites and apps from harmful videos.
Nineteen companies have notified Ofcom that they fall within its jurisdiction. They include TikTok, Snapchat, Twitch, Vimeo, OnlyFans and BitChute, as well as several smaller platforms, including adult sites.
Ofcom has used its powers to gather information from platforms on what they are doing to protect users from harm online. Ofcom is one of the first regulators in Europe to do this – the report is the first of its kind under these laws and reveals information previously unpublished by these companies.
Ofcom is concerned that smaller UK-based adult sites do not have robust measures in place to prevent children accessing pornography. They all have age verification measures in place when users sign up to post content. However, users can generally access adult content just by self-declaring that they are over 18.
One smaller adult platform told Ofcom that it had considered implementing age verification, but had decided not to as it would reduce the profitability of the business.
However, the largest UK-based site with adult content, OnlyFans, has responded to regulation by adopting age verification for all new UK subscribers, using third-party tools provided by Yoti and Ondato.
According to new research published by Ofcom, most people (81 per cent) do not mind proving their age online in general, with a majority (78 per cent) expecting to have to do so for certain online activities. A similar proportion (80 per cent) feel Internet users should be required to verify their age when accessing pornography online, especially on dedicated adult sites.
Over the next year, adult sites that Ofcom already regulates must have in place a clear roadmap to implementing robust age verification measures. If they don’t, they could face enforcement action. Under future online safety laws, Ofcom will have broader powers to ensure that many more services are protecting children from adult content.
Ofcom has seen some companies make positive changes more broadly to protect users from harmful content online, including as a direct result of being regulated under the existing laws. For example:
- TikTok now categorises content that may be unsuitable for younger users, to prevent them from viewing it. It has also established an Online Safety Oversight Committee, which provides executive oversight of content and safety compliance specifically within the UK and EU.
- Snapchat recently launched a parental control feature, Family Center, which allows parents and guardians to view a list of their child’s conversations without seeing the content of the message.
- Vimeo now allows only material rated ‘all audiences’ to be visible to users without an account. Content rated ‘mature’ or ‘unrated’ is now automatically put behind the login screen.
- BitChute has updated its terms and conditions and increased the number of people overseeing and – if necessary – removing content.
However, according to Ofcom, it is clear that many platforms are not sufficiently equipped, prepared and resourced for regulation. It has recently opened a formal investigation into one firm, Tapnet Ltd – which operates adult site RevealMe – in relation to its response to Ofcom’s information request.
Ofcom found that companies are not prioritising risk assessments of their platforms, which it considers fundamental to identifying and mitigating risks to users proactively. This will be a requirement on all regulated services under future online safety laws.
“Today’s report is a world first,” declared Dame Melanie Dawes, Ofcom’s Chief Executive. “We’ve used our powers to lift the lid on what UK video sites are doing to look after the people who use them. It shows that regulation can make a difference, as some companies have responded by introducing new safety measures, including age verification and parental controls.”
“But we’ve also exposed the gaps across the industry, and we now know just how much they need to do. It’s deeply concerning to see yet more examples of platforms putting profits before child safety. We have put UK adult sites on notice to set out what they will do to prevent children accessing them,” she confirmed.
Unlike in its broadcasting work, Ofcom’s role is not to assess individual videos. In light of the massive volume of online content, it is impossible to prevent every instance of harmful content. Accordingly, Ofcom’s job is to make sure the platforms are taking effective action to address harmful content.
Over the next twelve months, Ofcom expects companies to set and enforce effective terms and conditions for their users, and to quickly remove or restrict harmful content when they become aware of it. Ofcom will review the tools platforms provide to users for controlling their experience, and expects them to set out clear plans for protecting children from the most harmful online content, including pornography.
Forthcoming online safety laws will give Ofcom wider duties and powers, and many more services will be required to protect their users.
Ofcom says it will move quickly once the Online Safety Bill passes to put these laws into practice. “Tech firms must be ready to meet our deadlines and comply with their new duties,” it asserts. “That work should start now, and companies need not wait for the new laws to make their sites and apps safer for users.”
In particular, Ofcom is encouraging all companies likely to be in scope to review how they assess risks to their users, explore opportunities for improvement, integrate trust and safety across product and engineering teams, and staff up now to be ready for UK online safety laws.