First codes of practice and guidance published on new duties for tech firms under the UK’s Online Safety Act

Source: Ofcom. Published on this site Wednesday 18 December 2024 by Jill Powell

Ofcom has, four months ahead of the statutory deadline, published its first-edition codes of practice and guidance on tackling illegal harms – such as terror, hate, fraud, child sexual abuse and assisting or encouraging suicide – under the UK’s Online Safety Act.

The Act places new safety duties on social media firms, search engines, messaging, gaming and dating apps, and pornography and file-sharing sites. Before Ofcom can enforce these duties, it is required to produce codes of practice and industry guidance to help firms comply, following a period of public consultation.

Ofcom has consulted carefully and widely to inform its final decisions, listening to civil society, charities and campaigners, parents and children, the tech industry, and expert bodies and law enforcement agencies, with over 200 responses submitted to its consultation.

As an evidence-based regulator, Ofcom has carefully considered every response, alongside cutting-edge research and analysis, and has strengthened some areas of the codes since its initial consultation. The result is a set of measures – many of which are not currently being used by the largest and riskiest platforms – that will significantly improve safety for all users, especially children.

What regulation will deliver

The illegal harms codes and guidance mark a major milestone in creating a safer life online, firing the starting gun on the first set of duties for tech companies. Every site and app in scope of the new laws has from today until 16 March 2025 to complete an assessment to understand the risks illegal content poses to children and adults on their platform.

Subject to the codes completing the Parliamentary process by this date, from 17 March 2025, sites and apps will then need to start implementing safety measures to mitigate those risks, and the codes set out measures they can take.[4] Some of these measures apply to all sites and apps, and others to larger or riskier platforms. The most important changes Ofcom expects the codes and guidance to deliver include:

  • Senior accountability for safety. To ensure strict accountability, each provider should name a senior person accountable to their most senior governance body for compliance with their illegal content, reporting and complaints duties. 
  • Better moderation, easier reporting and built-in safety tests. Tech firms will need to make sure their moderation teams are appropriately resourced and trained and are set robust performance targets, so they can remove illegal material quickly when they become aware of it, such as illegal suicide content. Reporting and complaints functions will be easier to find and use, with appropriate action taken in response. Relevant providers will also need to improve the testing of their algorithms to make illegal content harder to disseminate. 
  • Protecting children from sexual abuse and exploitation online. While developing the codes and guidance, Ofcom heard from thousands of children and parents about their online experiences, as well as professionals who work with them. New research, published today, also highlights children’s experiences of sexualised messages online[4], as well as teenage children’s views on Ofcom’s proposed safety measures aimed at preventing adult predators from grooming and sexually abusing children.[5] Many young people Ofcom spoke to felt interactions with strangers, including adults or users perceived to be adults, are currently an inevitable part of being online, and they described becoming ‘desensitised’ to receiving sexualised messages.

Taking these unique insights into account, Ofcom’s final measures are explicitly designed to tackle pathways to online grooming. This will mean that, by default, on platforms where users connect with each other, children’s profiles and locations – as well as friends and connections – should not be visible to other users, and non-connected accounts should not be able to send them direct messages. Children should also receive information to help them make informed decisions around the risks of sharing personal information, and they should not appear in lists of people users might wish to add to their network.