
This week, Instagram announced a new service that helps its users identify troubling behavior within the community and reach out to followers who might need mental health help. The service, which is not yet available, lets users report posts that use certain negative phrases and hashtags, such as “I feel like nobody would miss me if I was gone” and “#selfharm.” The original poster would then get a message from Instagram indicating that one of their followers is concerned about them, along with pointers to mental health resources within the app.

A screenshot of part of the new Instagram mental health tool.

The announcement by Instagram is part of a partnership with Facebook, the National Suicide Prevention Lifeline and other organizations, an effort to help their communities deal with bullying, eating disorders, suicide and other preventable harms. Facebook and Instagram currently offer guides for talking with your friends online about these situations. On Instagram, the guides are buried in the Privacy and Safety Center under Support; Facebook has put its documentation in the Community Standards section.

As I read about these initiatives, I’m heartened by the attention these social media companies have paid to the health and safety of their users. As someone who lives with a mental illness, I pay attention to my friends’ language on social media as a matter of course, and I’d love for everyone to share my mindfulness of trigger words and phrases on Facebook or Instagram. But I’m still not sure how effective the new Instagram mental health tool and its ilk will be at actually getting people help.

We’ve seen that people often hide behind their social media profiles and do things there that they wouldn’t necessarily do in “real life,” like bullying celebrities and making death threats on blog posts. The same anonymity that distances people from each other in certain online circles could possibly keep them from reaching out positively to members of their community. Further, many people have Facebook and Instagram accounts that represent a persona rather than the actual person behind the account. That level of namelessness and distance might not be an appropriate place to encourage people to have difficult conversations about mental health.

We all use social media differently. For example, I only follow real-life friends on Facebook, anyone can follow me on Instagram, and my Twitter is carefully curated for a specific point of view. Not everyone is that picky with their social media connections, and the way one uses the medium can further hamper the usefulness of the Instagram mental health tools. It’s much easier to reach out to friends when you see that they might be in trouble; it’s not as easy to step out of your persona and engage with someone on a serious level. After all, many of us have friend and follower lists full of people we’ve never met who merely like our posts and send us animal pictures. It’s hard to move from a relationship based on kitten videos to one based on a possible mental health concern.

The new Instagram mental health alert process has a failsafe: a human reviews flagged posts to determine whether the post in question is really harmful or is a joke or a misunderstanding. I feel good about that feature, as it should head off uncomfortable confrontations in which the reported user is angry about having been reported. But that still doesn’t address the fear or discomfort of the reporting party, or the annoyance of getting a note from Instagram or Facebook because someone thinks you’re losing your marbles. Social media is made up of people, but social media companies are still seen as nameless, faceless entities that impose new features we don’t like and shut down when we need them. With sensitive subjects like eating disorders and suicide, I can see users feeling violated, as though Instagram has disregarded their privacy.

All in all, I think providing more resources to promote better mental health is a good thing. But our society is still at a point where mental health topics are often verboten, rarely spoken about in public. I think injecting hidden resources and unpromoted services into our social media accounts is too passive an approach to make a huge difference. However, if one suicide is prevented, or one person gets help for an eating disorder or a bullying incident, then the effort to launch these initiatives will have been worth it.
