Lifestyle

Therapists Might Start Using Social Media To Track Your Mental Health

by Emily Arata

In 2012, a woman named Melissa Broder created a Twitter account she modestly calls So Sad Today.

The black comedy account plumbed the depths of Broder's brain, pulling out one-liners that slowly gained a reputation for their relatability. Four years later, Broder is the author of an essay collection and tweets to an audience of nearly 400,000 people, proving depression is a cause we can all get behind. Her so-called “university of existential loneliness” has found an avid audience.

Today, we can laugh at and retweet Broder's quips, but that might not always be the case. A new Fast Company article by Sean Young, a UCLA medical school professor, imagines a future in which intelligent programs screen individual Facebook and Twitter accounts, both text and images, for signs that a mental illness is setting in or that a patient may be experiencing depression.

According to the report, teaching a program to recognize harmful patterns isn't actually that difficult, provided it knows which word combinations and patterns to seek out.

In 2016, social media is an inextricable part of the average person's life. We (sometimes problematically) share stories about the co-worker who really annoys us or the loneliness we feel at 1 am. Being honest is easy, since it often feels as if we're typing into a void.

In fact, a 2015 study from the University of Missouri found that envy triggered by Facebook often leads to symptoms of depression among users, and it's far from the only research to reach that conclusion.

Imagine this future: A patient hospitalized for a suicide attempt hands over his or her Facebook information and Twitter handle. After the patient returns home, the hospital's monitoring software watches his or her posts for buzzwords like “loneliness” and “sadness.” Too many instances, and it triggers an automatic report to a hospital case worker or a therapist.
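To make that mechanism concrete, here is a minimal sketch, in Python, of how such a monitor might count buzzwords and decide when to send a report. The keyword list, the threshold, and every name in it are invented for illustration; they are not drawn from Young's article or from any real hospital system.

# Hypothetical sketch only: the keywords, threshold, and function names below
# are invented for illustration, not taken from any real screening tool.

FLAGGED_TERMS = {"loneliness", "lonely", "sadness", "sad", "hopeless"}
ALERT_THRESHOLD = 3  # flagged words across recent posts before a report is sent


def count_flagged_terms(posts):
    """Count how many flagged terms appear across a list of post strings."""
    hits = 0
    for post in posts:
        for word in post.lower().split():
            if word.strip(".,!?") in FLAGGED_TERMS:
                hits += 1
    return hits


def should_alert(posts):
    """Return True if the posts contain enough flagged terms to notify a case worker."""
    return count_flagged_terms(posts) >= ALERT_THRESHOLD


if __name__ == "__main__":
    recent_posts = [
        "Feeling so sad today, the loneliness is getting heavy.",
        "Another sleepless night. Sad and hopeless at 1 am.",
    ]
    if should_alert(recent_posts):
        print("Report sent to case worker.")  # stand-in for a real notification
    else:
        print("No alert triggered.")

A real screening program would, as Young suggests, look for word combinations and broader patterns rather than a raw word count, but the basic watch-count-report loop would look much like this.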

The only hindrance, as Fast Company points out, is the expectation that mental health professionals should be responsible for monitoring their patients at all times. Would everyone take a shift following up on social media warning reports, or would the responsibility fall on the same person all the time? Would action be mandated, should signs of depression appear?

Responsibility aside, Young has a point that's backed by data. In 2013, a study of first-year college students revealed that the majority of those who'd previously expressed depression on social media would be open to an in-person intervention by a peer or someone in a position of authority.

For once, social media might produce a result that benefits us all.

Citations: Soon Your Doctor Will Check Your Tweets For Signs You're Depressed (Fast Company)