What happens when an algorithm labels you as mentally ill?
SAINT JEROME, Canada — While you might think that your mental health is known only to you, your health practitioner or those closest to you, you might be unwittingly revealing it to strangers online. A series of emojis, words, actions or even inactions can communicate how you feel at a given moment; collected over time, these signals comprise your "socionome" — a digital catalogue of your mental health, much as your genome provides a picture of your physical health.
Today, a number of efforts are underway to design algorithms that scan online behavior for markers of mental illness. Crisis Text Line, a messaging service that connects users to crisis counselors, uses a chatbot to flag texters at risk of suicide and move them to the front of the queue. The service's data scientists have found, for example, that when someone texts the word "Advil" or "Ibuprofen" to Crisis Text Line, his or her risk of attempting suicide is up to 14 times higher than that of the average texter.
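To make that kind of triage concrete, here is a minimal, purely illustrative sketch of keyword-based risk flagging: messages containing certain terms receive a higher risk multiplier and are pushed into a priority queue so the riskiest texters are served first. The term list, the weights, and the TriageQueue class are assumptions for illustration only, not Crisis Text Line's actual model.

```python
# Illustrative sketch only: a toy keyword-based triage queue.
# The risk terms and multipliers below are hypothetical.
import heapq
import itertools

RISK_TERMS = {"advil": 14.0, "ibuprofen": 14.0}  # hypothetical multipliers
BASELINE_RISK = 1.0


def risk_score(message: str) -> float:
    """Return a crude risk multiplier based on flagged keywords."""
    words = message.lower().split()
    return max((RISK_TERMS.get(w, BASELINE_RISK) for w in words), default=BASELINE_RISK)


class TriageQueue:
    """Priority queue that serves the highest-risk texters first."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # tie-breaker preserves arrival order

    def add(self, texter_id: str, message: str) -> None:
        score = risk_score(message)
        # heapq is a min-heap, so negate the score to pop the highest risk first.
        heapq.heappush(self._heap, (-score, next(self._counter), texter_id))

    def next_texter(self) -> str:
        _, _, texter_id = heapq.heappop(self._heap)
        return texter_id


queue = TriageQueue()
queue.add("texter-1", "I had a rough day")
queue.add("texter-2", "I took a whole bottle of ibuprofen")
print(queue.next_texter())  # texter-2 is moved to the front
```

A production system would of course rely on a trained statistical model over far richer features rather than a hand-written keyword table; the sketch is only meant to show how a risk score can reorder a waiting line.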