Public backs tighter social media moderation
Survey of 2,000 adults adds to calls for social media firms to help protect vulnerable users.
The general public wants social media firms to do more to address concerns about the online harm facing young and vulnerable people.
A report by think tank Demos, called Quality Control, found that 59 per cent of adults believe that social media content should be edited by moderators to help tackle harmful material.
Almost half of those who want to see tougher social media moderation believe the move would help reduce self-harm and suicide, as well as the proliferation of ‘fake news’. Around a third believe it would help combat other mental health conditions and tackle terrorism.
Last month a call was issued to government to ensure charities are at the heart of efforts to improve online safety. Charity consultancy New Philanthropy Capital wants ministers to collaborate more closely with the voluntary sector to ensure users are protected online.
The government has unveiled plans in its Online Harms White Paper to set up an independent regulator for the online sector and place a duty of care on firms to protect users.
Earlier this year children’s charity NSPCC revealed that nine out of ten parents back the creation of a regulator for social network firms.
Instagram is among the social media platforms that have already taken action to tackle online bullying, using artificial intelligence to notify users that a comment they are about to post may be considered offensive.
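The feature described above amounts to a pre-posting nudge: a classifier scores a draft comment, and if the score crosses a threshold the user is warned before it is published. The sketch below illustrates the general pattern only; `score_toxicity` is a hypothetical stand-in (approximated here with a keyword list) for the machine-learning model a real platform would use, and the threshold and wording are illustrative assumptions, not Instagram's actual implementation.

```python
from typing import Optional

def score_toxicity(comment: str) -> float:
    """Hypothetical model stand-in: returns a toxicity score in [0, 1].
    A real platform would call a trained ML classifier here; this keyword
    heuristic exists purely to make the example self-contained."""
    offensive_terms = {"idiot", "stupid", "loser"}
    words = [w.strip(".,!?").lower() for w in comment.split()]
    hits = sum(1 for w in words if w in offensive_terms)
    return min(1.0, hits / max(1, len(words)) * 5)

def moderation_prompt(comment: str, threshold: float = 0.5) -> Optional[str]:
    """Return a warning message if the comment scores above the threshold,
    otherwise None (the comment would post without interruption)."""
    if score_toxicity(comment) >= threshold:
        return "Are you sure you want to post this? It may be considered offensive."
    return None
```

The design point is that the comment is not blocked outright: the user sees the warning and can still choose to post, which is why this approach is framed as tackling bullying without hard censorship.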
Demos surveyed 2,000 adults and found an age divide in attitudes to social media moderation. While 71 per cent of those aged 55 and over backed the move, only 45 per cent of younger adults supported moderation.
Training for ‘citizen editors’
The report, which is supported by the Publishers Association, recommends creating a voluntary ‘citizen editors’ training scheme to help current social media moderators manage potentially harmful content more effectively.
“Up until now, too many conversations around online harm and social media have centred around the way in which new content is consumed,” said Andrew Gloag, Demos Research Assistant.
“However, this report shows that we cannot escape the fact that much of the harm is caused by the substance rather than the style of content we’re interacting with.
“There’s also a clear demand for change: the majority of people want content to be edited, and just one in ten trust the information they see on social media. Fundamentally, we need to find a way of incentivising the creation of high quality content, across all mediums.”