NSPCC calls for more social network controls
Social network sites like Facebook should be forced to design extra protections for children into their platforms, the charity's Chief Executive has urged.
Figures obtained by the NSPCC indicate that 3,171 online grooming offences have been recorded in England and Wales across 80 social network platforms since a new anti-grooming law criminalising sexual communication with a child was introduced. This amounts to almost nine grooming offences a day on average, the children's charity said.
The findings come ahead of the NSPCC’s conference ‘How safe are our children? Growing up online’ which starts on 20th June and explores the potential risks the online world poses to children and young people.
All 43 police forces in England and Wales, plus British Transport Police, were asked for the number of offences recorded under s.15A of the Sexual Offences Act 2003 between 3 April 2017 and 2 April 2018. In total, 41 of the 44 forces gave a full or partial response for the 12-month period.
The age and gender of the victim were disclosed in 2,343 instances. Police are not always able to identify the victim, the NSPCC reported, and some cases involve multiple victims.
Where police disclosed the gender and age of the victim, girls aged 12-15 accounted for 62% of cases and under-11s for nearly 25%.
The NSPCC also found that, of the 2,097 offences where police recorded the method used to communicate with a child, Facebook, Snapchat or Instagram was used in 70% of cases, making them the three most-recorded sites.
The NSPCC’s Wild West Web campaign has called on the Secretary of State for Digital, Culture, Media and Sport, Matt Hancock, to regulate social networks by introducing:
- An independent regulator for social networks with fining powers.
- A mandatory code which introduces Safe Accounts for children; grooming alerts using software algorithms; and fast-tracking of reports to moderators which relate to child safety.
- Mandatory transparency reports forcing social networks to disclose how many safety reports they get, and how they deal with those reports.
“These numbers are far higher than we had predicted. Social networks have been self-regulated for a decade, and it’s absolutely clear that children have been harmed as a result,” said Peter Wanless, Chief Executive of the NSPCC. “I urge Digital Secretary Matt Hancock to follow through on his promise and introduce safety rules backed up in law and enforced by an independent regulator with fining powers.”
Wanless added: “Social networks must be forced to design extra protections for children into their platforms, including algorithms to detect grooming to prevent abuse from escalating.”