SAFE
CIC
The Safeguarding Specialists
01379 871091

SAFE Newsfeed

Prosecutors are working on the highest ever number of hate crime cases as referrals from police hit record levels.

Source: Crown Prosecution Service (CPS) published on this website Thursday 15 January 2026

The Crown Prosecution Service’s latest performance data for July to September 2025 released today (January 15) shows it received 4,358 cases from police which have been flagged as having a hate crime element. This is a 14.7 per cent increase on the previous quarter – April to June 2025 – and 2.8 per cent more than the same period in 2024.

Prosecutors charged 88.1 per cent of hate crime cases during the three months. In total, 4,079 prosecutions were completed during this period, 85 per cent of which resulted in a conviction. Four out of five convictions received a sentence uplift to reflect the hate crime element of the offence.

Racially motivated hate crimes make up 3,098 of the total hate crime flagged referrals, with homophobic cases at 911 and religiously motivated crimes at 193.

The CPS will continue to monitor data trends and performance, and work with partners to better understand and respond to any shifts in offending patterns.

The CPS has also responded to the government's initial hate crime review, identifying where the law could be strengthened to enhance its ability to prosecute, deter offenders and achieve justice for victims.

Lionel Idan, Hate Crime lead and Chief Crown Prosecutor, said:

“It’s deeply concerning to see that hate crimes are now at record levels as we know just how deeply this affects victims and their wider communities.

“Despite this increase in offences, our conviction rates show that when cases come to us, they result in real consequences for those who perpetrate such crimes. I would urge anyone who is a victim of hate crime to come forward and report to the police.”

Despite completing a total of more than 120,000 prosecutions for all crime in this period – 3.4 per cent more than the previous quarter – the live caseload increased by 3.7 per cent to more than 201,000 cases, the highest number since the pandemic.

Rise in ‘million-pound placements’ as vulnerable children with additional needs are housed illegally in caravans, holiday camps and AirBnBs

Source: Children’s Commissioner published on this website Wednesday 14 January 2026 by Jill Powell

  • Second report into illegal children’s homes by Children’s Commissioner shows one year on, vulnerable children are still being housed in caravans, holiday camps or AirBnBs – some for as long as three years 
  • Number of illegal placements costing £1 million per child rises, despite these settings being unable to provide safety or care – at an estimated total cost of £353 million to the taxpayer 
  • Most children placed in illegal homes have mental health or additional educational needs – more than half have Education, Health and Care Plans 
  • “This is what failure looks like”: Children’s Commissioner calls for specialist foster care, more children’s homes and a new focus for reforming children’s social care 

More than half of children housed illegally by councils have Education, Health and Care Plans, new data from the Children’s Commissioner confirms – as the number of illegal placements costing more than £1 million per child has risen since last year. 

One year on from the Commissioner’s first report into local authorities’ use of illegal homes – including AirBnBs, holiday camps and caravans – to accommodate children in care, data shows very little has changed: on 1st September this year, there were 669 children living in illegal homes, down from 764 on the same day last year.  

Nearly 60% of these children have complex additional needs or disabilities requiring an Education, Health and Care Plan (EHCP), meaning they likely also receive support from other services beyond social care, while more than one third (36%) are receiving support from child and adolescent mental health services (CAMHS). 

Of the 669 children placed illegally, 89 have been living in the same illegal placement for more than one year. While most are over 15, there are some children of pre-school age growing up in illegal children’s homes.  

The average duration of these illegal placements is a little over six months. One child was put in a holiday camp for nearly nine months, another was in a caravan for more than four months, and a handful of children remained in an illegal home for more than three years.

The average weekly cost of a placement was more than £10,000 – equivalent to more than half a million pounds over the course of a year. In total, councils across England have spent an estimated £353 million on illegal children's homes in 2025, and 36 placements had already cost £1 million each by 1st September.

Today’s data underscores the crisis in children’s social care, with children – many extremely vulnerable or with complex needs – placed in poor quality placements at an exorbitant cost to taxpayers.  

A man who abused multiple children has been jailed for six years.

Source: Staffordshire Police published on this website Monday 12 January 2026 by Jill Powell

Adam McLaughlan, 34, of Penkridge, was sentenced at Stafford Crown Court on Tuesday (6 January) after he was previously found guilty of 14 counts of sexual assault of a girl.

The abuse happened over a number of years in Staffordshire before the survivors bravely came forward and told the police about what happened.

Officers from Staffordshire public protection unit (PPU) worked tirelessly to support the survivors and to build the evidence needed to secure a significant number of charges against McLaughlan.

At an earlier court hearing, he denied all of the charges put to him. But, because of the strength of the investigation and the bravery of the survivors, the jury found him guilty on 14 counts.

As part of his sentencing, McLaughlan was also served with an indefinite Sexual Harm Prevention Order (SHPO), placed on the Sex Offenders Register and given a restraining order protecting the survivors.

Detective Constable Patrick Hipwell, from Staffordshire public protection unit (PPU), said:

“I’m pleased that we’ve been able to deliver this outcome for the survivors in this case, who have each shown tremendous courage in coming forward and telling us about what happened.

“I want survivors to know that, no matter how recent or long ago offences may have taken place, we will always work relentlessly to deliver justice and to support you thoroughly throughout this process.”

If you’re a survivor and you feel ready to talk to us, call 101 or use Live Chat on the police website.

Ofcom launches investigation into X over Grok sexualised imagery

Source: Ofcom published on this website Tuesday 13 January 2026 by Jill Powell

The UK’s independent online safety watchdog, Ofcom, has today opened a formal investigation into X under the UK’s Online Safety Act, to determine whether it has complied with its duties to protect people in the UK from content that is illegal in the UK.

There have been deeply concerning reports of the Grok AI chatbot account on X being used to create and share undressed images of people – which may amount to intimate image abuse or pornography – and sexualised images of children that may amount to child sexual abuse material (CSAM).

Ofcom urgently made contact with X on Monday 5 January and set a firm deadline of Friday 9 January for it to explain what steps it had taken to comply with its duties to protect its users in the UK.

The company responded by the deadline, and Ofcom carried out an expedited assessment of the available evidence as a matter of urgency.

Ofcom has decided to open a formal investigation to establish whether X has failed to comply with its legal obligations under the Online Safety Act – in particular, to: 

  • assess the risk of people in the UK seeing content that is illegal in the UK, and to carry out an updated risk assessment before making any significant changes to their service;
  • take appropriate steps to prevent people in the UK from seeing ‘priority’ illegal content – including non-consensual intimate images and CSAM;
  • take down illegal content swiftly when they become aware of it;
  • have regard to protecting users from a breach of privacy laws;
  • assess the risk their service poses to UK children, and to carry out an updated risk assessment before making any significant changes to their service; and
  • use highly effective age assurance to protect UK children from seeing pornography.

The legal responsibility is on platforms to decide whether content breaks UK laws, and they can use Ofcom’s Illegal Content Judgements Guidance when making these decisions. Ofcom is not a censor – it does not tell platforms which specific posts or accounts to take down.

Ofcom’s job is to judge whether sites and apps have taken appropriate steps to protect people in the UK from content that is illegal in the UK and protect UK children from other content that is harmful to them, such as pornography.

The Online Safety Act sets out the process Ofcom must follow when investigating a company and deciding whether it has failed to comply with its legal obligations.

Ofcom’s first step is to gather and analyse evidence to determine whether a breach has occurred. If, based on that evidence, it considers that a compliance failure has taken place, it will issue a provisional decision to the company, which will then have an opportunity to respond to the findings in full, as required by the Act, before Ofcom makes its final decision.

If the investigation finds that a company has broken the law, Ofcom can require platforms to take specific steps to come into compliance or to remedy harm caused by the breach. Ofcom can also impose fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater.

In the most serious cases of ongoing non-compliance, Ofcom can apply to a court for ‘business disruption measures’, through which a court could impose an order, on an interim or full basis, requiring payment providers or advertisers to withdraw their services from a platform, or requiring internet service providers to block access to a site in the UK. The court may only impose such orders where appropriate and proportionate to prevent significant harm to individuals in the UK.

UK jurisdiction

In any industry, companies that want to provide a service to people in the UK must comply with UK laws. The UK’s Online Safety Act is concerned with protecting people in the UK. It does not require platforms to restrict what people in other countries can see.

There are ways platforms can protect people in the UK without stopping their users elsewhere in the world from continuing to see that content.

An Ofcom spokesperson said:

 “Reports of Grok being used to create and share illegal non-consensual intimate images and child sexual abuse material on X have been deeply concerning. Platforms must protect people in the UK from content that’s illegal in the UK, and we won’t hesitate to investigate where we suspect companies are failing in their duties, especially where there’s a risk of harm to children.

“We’ll progress this investigation as a matter of the highest priority, while ensuring we follow due process. As the UK’s independent online safety enforcement agency, it’s important we make sure our investigations are legally robust and fairly decided.”

Ofcom will provide an update on this investigation as soon as possible.

Supporting safeguarding partners to discharge their local child protection arrangements. (England)

Source: Department for Education published on this website Friday 9 January 2026 by Jill Powell

The Children’s Wellbeing and Schools Bill (the bill) is a key step towards delivering the government’s opportunity mission to break the link between children’s background and their future success.

The child protection measures strengthen multi-agency responses to significant harm. Clause 3 creates a new duty for safeguarding partners – local authorities, police and integrated care boards – to establish multi-agency child protection teams (MACPTs).

It allows the Secretary of State for Education to use regulations to further prescribe the:

  • functions of MACPTs
  • qualification and skills requirements for MACPT members
  • relevant agencies that safeguarding partners can approach to facilitate the operation of MACPTs

The purpose of this policy statement is to provide clarity on the intended scope and content of the regulations. It seeks to address concerns raised in the House of Lords about the operation of MACPTs and level of prescription in regulations. The information set out is the first step in developing the regulations.

To develop regulations further, we will:

  • engage with sectors
  • use existing best evidence of effective multi-agency working
  • use emerging evidence from the Families First for Children pathfinder areas (pathfinders)
  • use emerging evidence from the implementation of the Families First Partnership (FFP) programme national rollout

Regulations will be subject to consultation and robust Parliamentary scrutiny, with the aim for them to come into force in late 2027, subject to Royal Assent.

Read the Multi-agency child protection teams: regulation-making powers policy paper, published 7 January 2026.