Instagram has launched a new system that aims to prevent offensive comments from being posted on photos shared on the platform.
The social network says it is a step towards keeping Instagram a “safe place for self-expression”.
The company hopes the filter will help stop bullying and abuse on the platform.
In May 2017 it was rated the worst social media platform for young people’s mental health.
Instagram is now making visible efforts to change that perception.
“We know that toxic comments create negativity and discourage people from sharing. But it’s also a hard problem to solve,” the company says in an official statement.
Of the millions of comments left every day, the statement adds, "only a small fraction of these are inappropriate".
Image caption: Instagram says its process is currently 99% accurate and it hopes this number will improve.
The company is now employing people to train computer systems to identify abusive comments; those systems will then monitor comments in various languages.
“The training process has taken several months, and has involved the review of more than 2 million comments,” says Michelle Napchan, Instagram’s head of public policy.
“We know that this isn’t easy. It’s not just the words themselves — context is incredibly important to determining what’s an offensive comment.”
She says the computers will look not only at the words used in comments but also at their sequence and combinations, as well as at the relationship between the person posting the comment and the person who shared the photo.
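Instagram has not published how its model works, but the idea of combining word-level signals with the commenter–poster relationship can be illustrated with a deliberately simple sketch. Everything here (the word list, the weights, the `commenter_follows_poster` flag) is invented for illustration and is not Instagram's actual system:

```python
# Toy sketch of context-aware comment scoring. The word list, weights and
# threshold are invented examples, not Instagram's real model.

OFFENSIVE_WORDS = {"idiot", "loser", "ugly"}  # placeholder word list

def toxicity_score(comment: str, commenter_follows_poster: bool) -> float:
    """Score a comment using word signals plus a crude context signal.

    The relationship between the commenter and the poster lowers the
    score, mirroring the article's point that context, not just words,
    determines whether a comment is offensive.
    """
    words = comment.lower().split()
    hits = sum(1 for w in words if w.strip(".,!?") in OFFENSIVE_WORDS)
    score = hits / max(len(words), 1)
    if commenter_follows_poster:
        score *= 0.5  # assumption: banter between friends is less likely abuse
    return score

def is_offensive(comment: str, commenter_follows_poster: bool,
                 threshold: float = 0.2) -> bool:
    """Flag the comment if its score crosses the (invented) threshold."""
    return toxicity_score(comment, commenter_follows_poster) >= threshold
```

A real system would replace the word list with a trained classifier over millions of reviewed comments, as the article describes, but the shape of the decision (text features adjusted by relationship context) is the same.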
Instagram previously used a similar system to help remove spam comments from photos and says most people will not notice anything different in their Instagram feed.