January 1, 2018 (STL.News) Until now, a public Instagram profile offered little protection: all of its content was up for grabs, and anyone who found a photo in search results could view and comment on it. Recently, Instagram announced moderation tools that make it easier to silence potentially offensive users. Marking its milestone of 800 million users, 500 million of whom use the platform daily, Instagram introduced the new tools as a way to combat cyber-bullying and other harmful activity that can affect users' mental health on the app.
With the new comment tools, users with public Instagram accounts can limit comments to specific groups of people, such as their followers or the people they follow. Additionally, both public and private account holders can block individual accounts from commenting on their posts, rather than having to make their accounts private altogether.
The new comment feature is part of Instagram's ongoing effort to curb abuse on the platform, which has been linked to depression, suicidal thoughts, and body-image disorders. Instagram is also expanding its automated filter that blocks especially offensive comments, adding support for Arabic, French, German, and Portuguese-language content; the feature already exists in English. Instagram said the filter will continue to improve with time and use, blocking additional profanity and threatening language as it learns.
Lastly, Instagram has expanded a feature launched in May that is designed to help users in need of mental health support and resources. If you witness someone on Instagram showing signs of a mental health crisis, you can now report it anonymously, and Instagram will send that person a message offering help and support hotlines.