Instagram cracks down on online bullying

The social media firm is asking people to think again before posting harmful content and is looking to help users restrict comments from others.

Joe Lepper | 10th Jul 19

Instagram has taken action to combat online bullying with new measures, including using artificial intelligence to warn people when they may be about to post a harmful comment.

The social media platform is using its artificial intelligence technology to notify people that their comment may be considered offensive before they post it.

This builds on technology which has already been able to detect bullying and harmful content within comments, photos and videos.

> See also: Report reveals cyber bullying within charities

The measure has been taken to protect younger users in particular, who are most likely to experience online bullying and least likely to report incidents.

Early tests of the feature have shown that it has encouraged people to undo their comment, reflect on their actions and share less hurtful content instead.

Restrict function

In addition, Instagram is testing a new function called ‘restrict’. This allows people to restrict another user so that any comments that person makes on their posts are visible only to the commenter. Restricted users will also not be able to see when someone is active on Instagram or when their direct messages have been read.

Adam Mosseri, head of Instagram, said: “It’s our responsibility to create a safe environment on Instagram. This has been an important priority for us for some time, and we are continuing to invest in better understanding and tackling this problem.”

Instagram asks ‘are you sure you want to post this?’

This week Action for Children revealed that online and face-to-face bullying is the top concern among young people.

Also this month the NSPCC revealed that nine out of ten children back moves to ensure tech firms have a legal responsibility to keep them safe online.

The government’s Online Harms White Paper has outlined plans to set up an independent regulator for the online sector and place a duty of care for users on tech firms.

Last month a report by the Association of Chief Executives of Voluntary Organisations (ACEVO) and the Centre for Mental Health found that a quarter of bullying victims within charities had experienced cyber bullying.