They've been under pressure to tackle bullying on the platform.
Instagram is taking steps to stamp out bullying on the app with a new tool that asks users ‘Are you sure?’ before they post potentially offensive comments.
The new feature uses AI to identify harmful words in a comment, then notifies users that their comment may be considered offensive before it’s posted.
It’s hoped that this “intervention” will make people “reflect and undo their comment” before the recipient ever sees it.
Of course, they can ignore the pop-up and post anyway, but Instagram says that early tests found it encourages some users to undo their comment and share something less hurtful.
Instagram is also giving users the option to ‘restrict’ certain accounts, noting that younger people are often reluctant to block, unfollow or report a bully for fear of escalating the situation.
When you ‘restrict’ a user, that person’s comments on your posts will be visible only to them unless you approve them. They will also be unable to see when you’re active on the app or when you’ve read a DM from them.
Instagram’s chief executive Adam Mosseri said in a blog post that the company realised it “could do more” about the issue.
According to the BBC, the app has been under increased pressure to take action after high profile cases of bullying on the platform, including the suicide of British teenager Molly Russell.
“We can do more to prevent bullying from happening on Instagram, and we can do more to empower the targets of bullying to stand up for themselves,” he wrote.
“These tools are grounded in a deep understanding of how people bully each other and how they respond to bullying on Instagram, but they’re only two steps on a longer path,” he added.