Now, Instagram will warn you against posting hate comments
In a major move, Instagram has announced two new tools to combat the pressing problem of bullying. The features are aimed at preventing users from posting hateful/demeaning comments and empowering those targeted by them to stand up for themselves. The options will be available to users around the world in the coming months. Here's all you need to know.
Warnings to prevent offensive, hateful comments
Comments are often used as a weapon to spread hate and offense online. While there's no way to stop the practice entirely, the photo-sharing app will now show pop-ups that give people a chance to reflect on their decision. The pop-up will appear when an offensive message is detected and will ask users whether they really want to say something hurtful on someone else's photo/video.
AI will be used to flag offensive comments
The warnings will show up whenever Instagram's AI detects something offensive in the comments section, CEO Adam Mosseri said while announcing the new tool. He also said that in early tests of the feature, people felt encouraged to write something less hurtful once they had a chance to reflect on what they had originally written.
Also, Instagram will let users 'restrict' bullies
Instagram is also empowering potential targets of bullying to stand up for themselves. You will be able to 'Restrict' your bully - a feature that ensures any comment posted by that person is visible only to the bully themselves. Notably, the capability would also keep them from seeing when you are online or whether you have read their DMs.
When these features will be available
While the comment warning feature has already started rolling out, Restrict will arrive in the coming weeks. Together, the options should make Instagram a safer place to communicate, share, and interact, and reduce the amount of hate we have been seeing on the platform - arguably making it far safer than Twitter in this respect.