Though initially launched as a photo-sharing app, Instagram has grown into a platform for making friends, growing businesses, and championing causes. But, as one of the most widely used social media platforms, Instagram is no stranger to cyber-bullying. To curb the growing number of related reports, Instagram has rolled out a new feature that detects inappropriate or offensive language in comments as you type them. The AI will automatically warn you that the comment may be abusive and offer the option to undo or rephrase it before posting. The feature becomes available once the app is updated to its newest version.
It was only a month ago that Instagram released its ‘Restrict’ feature, which you can use to essentially shadow-ban an account from posting hurtful comments. Any comment from a restricted account is visible only to you and the user who posted it. Significantly, the person you’ve restricted is never notified.
With a rising number of reports linking social media use to depression and suicide, this is a welcome initiative by Instagram to fight cyber-bullying and keep the platform a safe and supportive place for generations to come.