Parents back charity’s call for tough social media regulation

An NSPCC survey reveals support from parents for the introduction of a social media regulator to ensure tech firms protect children.

Joe Lepper | 15th Feb 19

Nine out of ten parents want to see a social media regulator appointed to make social network firms legally responsible for protecting children, according to a charity’s survey.

The NSPCC survey also found that six out of ten adults do not think social networks protect children from risks such as sexual grooming and inappropriate content around self-harm and suicide.

The survey findings have been released to raise awareness of the charity’s Taming the Wild West Web report, which sets out how a regulator would enforce a legal duty of care to children on social networks.

The charity wants the regulator to have legal powers to demand details from tech firms about their child safety measures, as well as to require them to set minimum safeguarding standards and proactively tackle online risks.

Tougher sanctions

The charity says the regulator should also be able to impose tough sanctions on social networks that fail to protect children, including fines of up to £20m and a new criminal offence for gross breaches.

An online petition has been set up by the NSPCC urging people to back its call for a regulator and other recommendations, including the creation of safe accounts for children. More than 28,000 people have signed the petition to date.

The government is due to publish an Online Harms White Paper, which will set out expectations for companies to keep users, in particular children, safe.

NSPCC Chief Executive Peter Wanless is urging ministers to ensure the paper puts a legal duty of care on social networks.

He said: “It is clear that society will no longer tolerate a free-for-all under which tech firms allow children to operate in a precarious online world with a myriad of preventable risks.

“Social media bosses should be made to take responsibility for the essential protection of children on their platforms and face tough consequences if they don’t. Over a decade of self-regulation has failed, and enough is enough.”
