Six child sexual image offences a day in Wales, new figures show

Iolo Cheung
BBC Wales News
Social media apps on an Apple iPhone (Getty Images)
Half of the recorded offences in which a social media platform was identified took place on Snapchat, according to the NSPCC

Welsh police forces logged six offences a day relating to child sexual abuse images last year, new figures show.

One campaigner has criticised the use of disappearing messages on apps such as Snapchat, after the app was identified in half of the cases where a social media platform was recorded.

Children's charity the NSPCC has called for tougher action from politicians and social media companies, saying there can be "no excuse for inaction or delay".

The UK government said it was "committed to the robust implementation of the Online Safety Act", and had already passed laws to crack down on child sexual abuse online.

Home Office figures showed 2,194 child sexual abuse image crimes were recorded in Wales last year.

South Wales Police recorded the most with 964, followed by North Wales Police with 535.

There were 503 child sexual abuse image crimes reported to Gwent Police, and 192 to Dyfed-Powys Police.

The total covers a wide range of offences, from possessing, making, distributing or publishing child abuse material to sharing indecent images or coercing someone under 18 into sending them.

In a separate Freedom of Information request, the NSPCC found that of the offences in England and Wales where a social media platform used by the perpetrator was recorded, half had taken place on Snapchat.

Among the other social media sites recorded, Instagram accounted for 11% of cases, Facebook 7% and WhatsApp 6%.

Last year Childline delivered 903 counselling sessions to children and young people relating to blackmail or threats to expose or share sexual images online - a 7% increase compared to 2022/23.

Snapchat offences 'a wake-up call'

Mared Parry from Wales, who now lives in London, was 14 when she was groomed online by men who manipulated her into sending sexual images of herself.

The presenter and journalist is also an online safety campaigner who has worked with the NSPCC, and described the scale of the problem as "horrifying".

"Evolving technology seems to have made it easier for groomers to get away with their crimes when it should be the opposite," she said.

"Online abuse has very real consequences."

Mared Parry, presenter, journalist and online safety campaigner (image: Mared Parry)
Mared Parry worked with the NSPCC on their Wild West Web campaign, and was part of efforts to push for the Online Safety Act 2023

Snapchat being the platform used in half the recorded offences should also be a "wake-up call", said Ms Parry.

"Its disappearing messages, lack of accountability, and emphasis on privacy create the perfect conditions for abuse to go undetected.

"It's already difficult enough for victims to prove abuse, and features like this just make it even easier for offenders to cover their tracks.

"Yet, tech companies continue to prioritise user engagement over safeguarding, and the consequences are devastating."

What should you do if you are threatened online?

Being threatened online when it comes to sexual images can be a frightening experience. The Internet Watch Foundation (IWF) has the following advice:

  • Remember that you are not at fault if approached or threatened online. The person trying to blackmail or sexually extort you is the one who is in the wrong, and lots of other young people have been in a similar situation.
  • Stop all contact with anyone who is trying to threaten you, and do not share any more images or videos, or pay any money. If you have been communicating on an app, there should be in-built tools to block and report the user.
  • You will not be in trouble with the police - report what has happened to your local police on 101 or by making a report to the National Crime Agency's CEOP Safety Centre, where a child protection advisor will make sure you get the help you need.
  • You can use an online tool called Report Remove. The IWF will then try to have the sexual images, videos, or links removed from the internet. You can also talk to Childline, who have provided support to others in the same situation.
  • For parents, it is also vital to have open and honest conversations with your children about the risks online and to listen to their concerns.

Tech bosses 'let off the hook'

The NSPCC has issued a joint call with other charities, including Barnardo's and the Marie Collins Foundation, for the UK government to give the regulator Ofcom greater powers.

Currently, user-to-user services are only required to remove illegal content where it is "technically feasible", according to Ofcom - something the charities have criticised as an "unacceptable loophole".

The charities said children will not be protected from the worst forms of abuse on private messaging services under Ofcom's current plans.

With most of the offences taking place on private messaging services, the NSPCC also says companies need to introduce "robust safeguards" so their sites are not "a safe haven for perpetrators of child sexual abuse".

"These offences cause tremendous harm and distress to children, with much of this illegal material being repeatedly shared and viewed online," said NSPCC chief executive Chris Sherwood.

"It is an outrage that in 2025 we are still seeing a blatant disregard from tech companies to prevent this illegal content from proliferating on their sites," he added.

"Having separate rules for private messaging services lets tech bosses off the hook from putting robust protections for children in place.

"This enables crimes to continue to flourish on their platforms even though we now have the Online Safety Act."

Fingers tapping on a smartphone screen (Getty Images)
The NSPCC, Barnardo's and the Marie Collins Foundation are calling on the UK government to give Ofcom greater powers to protect children

In a statement, a Home Office spokesperson described child sexual exploitation and abuse as despicable, and said tech company design choices cannot be used as an excuse not to root out "heinous crimes".

"UK law is clear: child sexual abuse is illegal and social media is no exception, so companies must ensure criminal activity cannot proliferate on their sites.

"We have already introduced four new laws to crack down on child sexual abuse online and we will not hesitate to go further to protect children from vile online predators."

A Snapchat spokesperson condemned any sexual exploitation on the platform and said it works with law enforcement agencies to identify information and content if necessary.

"Whether that's through our proactive detection efforts or confidential in-app reporting, we remove it, lock the violating account, and report to authorities.

"Snapchat is designed to make it difficult for predators to find and interact with young people and has extra safeguards in place to help prevent strangers from connecting with teens.

"Our Family Centre also allows parents to see who their teens are friends with and talking to on Snapchat.

"We work with expert NGOs, and industry peers to jointly attack these problems and don't believe the methodology used in this report reflects the seriousness of our collective commitment and efforts."