Facebook, Twitter, Instagram, texting—we’re on our smartphones all the time, expressing how we feel and what’s going on in our lives. And for every text or status update, a person is typing… a person who could be happy or manic, depressed or schizophrenic, struggling with substance abuse or not. Most of the time, for most people, we’d never know whether they needed help or could benefit from a mental health professional. But in the age of data and machine learning, we can start to.
Natural language processing, or NLP, is the set of tools that computer scientists use to analyze day-to-day human communication and text. When you talk to Siri, have your email automatically classified as spam, or have your words changed by autocorrect, you’re seeing NLP methods at work. NLP is a broad class of tools that can be used in any domain containing human language (English is a “natural language,” as is Chinese, but programming languages like Python or Java are not). As such, NLP is ripe to be applied to an age-old problem: mental health.
Psychotherapy, or talk therapy, has evolved since the armchair age, and clients today may be found Skyping their therapists, texting them, texting crisis lines, or sharing on online forums. While most of these types of therapy haven’t been scientifically tested, researchers have recently run studies of specific web-based mental health interventions. Many chat-based interventions, for example, were found to be beneficial to patients, especially compared to receiving no treatment at all.
As text-based therapist-client interactions increase, and as people express their thoughts more and more on social media, there is a great opportunity to use NLP methods to help people who are suffering. We already see real-life applications of NLP with preliminary screening, or determining if someone has a mental health condition. NLP also could shine in therapist assistance, or suggesting appropriate tools for therapists to use with patients. And chatbots are in some ways the most intuitive applications of NLP, demonstrating another dimension of its promise and challenges.
Oftentimes, part of the problem of treating mental health disorders is identifying the people who may need help. NLP has great potential here, and Facebook has already been taking action in this direction: it now uses NLP to detect suicidal language and provides a list of supportive resources to flagged users. Another study developed a neural network model to analyze Instagram photos, captions, and comments, and found that the model could detect alcohol abuse, but not other substance abuse, solely through this Instagram data. (The researchers had participants fill out an independent substance use screener to establish the ground truth of whether participants were abusing substances.) A nonprofit, the Crisis Text Line, uses NLP text analysis to identify and respond first to the people who are most at risk of suicide.
Analyzing information from social media, smartwatches, or texts is an example of passive data collection; NLP can also be applied to active data collection, wherein participants are directly asked questions and their answers are analyzed. Researchers have demonstrated, for example, that they can determine which participants have depression by analyzing the text of diagnostic screening conversations between participants and a chatbot. NLP may have an important role in screening people for mental health conditions in the future, whether by analyzing passively collected data or by actively administering and interpreting screening interviews.
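To make the idea concrete, here is a toy sketch of text-based screening: a tiny bag-of-words naive Bayes classifier that labels short messages by comparing word frequencies against labeled examples. Every message, label, and word below is invented for illustration; real screening systems are trained on clinically validated labels and vastly larger datasets, and no keyword list can diagnose anyone.

```python
import math
from collections import Counter

# Toy training data: (message, label). Entirely invented for illustration;
# real screening studies rely on validated clinical labels, not keyword guesses.
TRAIN = [
    ("i feel hopeless and tired all the time", "at_risk"),
    ("nothing matters anymore i cant sleep", "at_risk"),
    ("i feel so alone and empty lately", "at_risk"),
    ("had a great day hiking with friends", "ok"),
    ("excited about my new job next week", "ok"),
    ("dinner was fun we laughed a lot", "ok"),
]

def train(examples):
    """Count word frequencies per label for a naive Bayes model."""
    counts = {"at_risk": Counter(), "ok": Counter()}
    totals = {"at_risk": 0, "ok": 0}
    for text, label in examples:
        for word in text.split():
            counts[label][word] += 1
            totals[label] += 1
    return counts, totals

def score(text, counts, totals):
    """Return the more likely label, using add-one smoothing."""
    vocab = len(set(w for c in counts.values() for w in c))
    best_label, best_logp = None, -math.inf
    for label in counts:
        logp = 0.0
        for word in text.split():
            p = (counts[label][word] + 1) / (totals[label] + vocab)
            logp += math.log(p)
        if logp > best_logp:
            best_label, best_logp = label, logp
    return best_label

counts, totals = train(TRAIN)
```

The add-one smoothing keeps unseen words from zeroing out a label’s probability; beyond that, everything the model “knows” is just word counts.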
However, care will have to be taken to ensure the appropriate levels of privacy and data security are maintained if these systems become more automated. If Facebook is concerned about you, should they contact your friends (which ones?) and family? If your texts with a therapist become public through a data breach, what are the implications for your health insurance and résumé? Deep structural and ethical questions remain. Yet, if properly implemented, automated NLP systems could ensure that many more people who need help are connected to the resources to get it.
Once people have access to therapy, it is important that they are provided quality help, and NLP has the potential to improve psychotherapy by assisting therapists. The main method researchers use to determine whether a particular therapy benefits patients is the randomized controlled trial, which often involves a head-to-head comparison of a psychotherapy technique against a control (such as another psychotherapy technique, or a group receiving no therapy). Researchers in the NLP world, however, have been trying something different—analyzing large volumes of therapist-client interactions to determine whether better therapists have specific patterns that other therapists could learn from.
Transcripts of therapist-client interactions are hard to come by. One large study solved this problem by analyzing a large corpus of text conversations between volunteers and people texting a crisis line. The researchers first selected only the conversations with a ground-truth label of whether they were helpful (texters answered a follow-up question afterward, stating whether they felt worse, the same, or better), leaving 15,000 conversations that qualified. They then compared two groups of volunteer helpers: those with the highest rate of success (texters saying they felt better) and those who were less successful. The study found actionable differences between the two: the more successful volunteers wrote more, spent more time in the problem-solving stage after quickly understanding the situation, and used more non-templated responses, among other differences. Another set of researchers, unsatisfied by the lack of transcripts, developed an automated speech recognition system that analyzes therapist-patient audio recordings and gives therapists rapid feedback on their skill in a specific psychotherapy approach. As NLP improves and more datasets become available, researchers will be able to extract even better insights from large volumes of text.
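The kind of comparison in the crisis-line study can be sketched with a few simple conversation features. The mini-corpus below is entirely invented; the actual study computed far richer features over thousands of labeled conversations, but even average message length and the share of repeated ("templated") messages can separate toy groups.

```python
from collections import Counter
from statistics import mean

# Invented mini-corpus: each entry is one volunteer's outgoing messages plus
# whether the texter later reported feeling better. Purely illustrative.
conversations = [
    {"helped": True, "messages": [
        "That sounds really heavy. What happened tonight?",
        "You mentioned your sister - could she stay with you?"]},
    {"helped": True, "messages": [
        "I'm hearing a lot of exhaustion. What would rest look like for you?"]},
    {"helped": False, "messages": [
        "I'm here for you.", "I'm here for you."]},
    {"helped": False, "messages": [
        "I'm here for you."]},
]

def features(group):
    """Average words per message, and share of non-unique (template-like) messages."""
    msgs = [m for conv in group for m in conv["messages"]]
    msg_counts = Counter(msgs)
    avg_len = mean(len(m.split()) for m in msgs)
    templated = sum(c for c in msg_counts.values() if c > 1) / len(msgs)
    return avg_len, templated

helped = [c for c in conversations if c["helped"]]
not_helped = [c for c in conversations if not c["helped"]]
```

On this invented data, the "helpful" group writes longer, entirely non-templated messages, mirroring the direction of the study's findings.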
In some ways, the most comprehensive application of NLP to psychotherapy is the chatbot: an automated therapist that can be accessed anytime, anywhere, for free. The most famous early example of a therapy chatbot was ELIZA in the 1960s, which simulated a Rogerian psychotherapist by asking non-directive questions like “How does it feel to hate your father?” or “Are you unhappy often?”, where phrases like “hate your father” and “unhappy” were filled in by ELIZA based on the user’s previous statements. ELIZA was meant as a parody of human communication, but users found it surprisingly valuable.
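ELIZA’s trick can be sketched in a few lines: hand-written regular-expression rules capture a fragment of the user’s sentence and reflect it back inside a question template. The rules below are invented stand-ins; Weizenbaum’s original script was far larger, but the mechanism was essentially this.

```python
import random
import re

# Each rule pairs a regex that captures part of the user's sentence with
# response templates that reflect the captured fragment back as a question.
RULES = [
    (re.compile(r"i hate (.+)", re.IGNORECASE),
     ["How does it feel to hate {0}?", "What made you start hating {0}?"]),
    (re.compile(r"i am (.+)", re.IGNORECASE),
     ["How long have you been {0}?", "Why do you say you are {0}?"]),
    (re.compile(r"i feel (.+)", re.IGNORECASE),
     ["Do you often feel {0}?"]),
]

# Fallbacks when no rule matches, another ELIZA hallmark.
DEFAULT = ["Please tell me more.", "How does that make you feel?"]

def reply(user_text):
    """Return a reflective question built from the first matching rule."""
    for pattern, templates in RULES:
        match = pattern.search(user_text)
        if match:
            fragment = match.group(1).rstrip(".!?")
            return random.choice(templates).format(fragment)
    return random.choice(DEFAULT)
```

Because every output comes from a fixed template, the bot can never say anything its authors did not write, which is exactly the restrictedness described above.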
It’s hard to do chatbots right, because they need to accommodate the diversity of human responses without saying anything nonsensical or damaging to their users. ELIZA was developed with carefully constructed rules, and was thus very restricted but sensitive in its responses. These days, artificial intelligence systems are often much more flexible: they ingest vast amounts of data from the internet and learn how to do their tasks without hand-coded rules. This means that when a user messages a chatbot, the chatbot can draw on all of the previous conversations it has seen and deliver a response similar to what it has seen before. However, with increased flexibility comes increased danger—for example, Microsoft’s chatbot Tay learned to be racist from Twitter users within hours.
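The learned, “deliver a response similar to what it has seen” style of chatbot can be caricatured as retrieval: find the past message most similar to the new one and reuse its reply. The corpus and similarity measure below (word-set overlap) are toy assumptions; modern systems use learned representations rather than raw word overlap, but the failure mode is the same, since the bot can only be as good as what it has seen.

```python
def word_overlap(a, b):
    """Jaccard similarity between the word sets of two messages."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

# Invented corpus of past (user message, reply) pairs. A real system would
# have millions of these, which is exactly where unsafe replies sneak in.
CORPUS = [
    ("i had a rough day at work", "Work stress is hard. What happened?"),
    ("i can't sleep at night", "Sleep trouble wears you down. How long has it been?"),
    ("my friend isn't talking to me", "That silence must hurt. Have you reached out?"),
]

def retrieve_reply(user_text):
    """Reply with the response paired to the most similar past message."""
    best = max(CORPUS, key=lambda pair: word_overlap(user_text, pair[0]))
    return best[1]
```

Unlike the rule-based sketch, nothing here constrains what the retrieved reply contains; if the corpus holds a harmful response, the bot can surface it.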
One of the most famous modern-day chatbots, Woebot, uses scripted responses to ensure that its users aren’t harmed by learned conversations. Woebot is currently integrated into Facebook Messenger, and in a randomized controlled trial, participants using Woebot reduced their depressive symptoms significantly more than participants who read a self-help eBook. Chatbots have a lot of latent potential because they can be scaled widely at low cost. This is especially important for patients who lack access to traditional mental health care: those living in rural areas without nearby facilities, those whose disorders are heavily stigmatized, those who can’t afford care, or those who can’t take time off work to attend in-person appointments. On the other hand, humanity’s best researchers are not close to creating a chatbot with human-level conversational skills, since solving “human language” is equivalent to creating a general artificial intelligence. For now, chatbots’ conversational abilities are limited, but people can find them quite helpful, and they may play assistive roles, providing an on-call resource to patients before in-person sessions with a therapist.
Natural language processing is a broadly applicable tool with large potential impacts for the field of mental health. NLP is already being deployed to screen for possible mental health conditions on social media, and as a therapy-assistance tool. It also has the potential to extract lessons from successful therapists that other therapists could use, especially as more data becomes available. Outside of psychotherapy, NLP can be used to comb through doctors’ notes for information on whether specific drugs would be good for a patient, a potentially life-saving tool. Psychiatrists and psychologists could also use NLP to highlight important information from the voluminous medical literature. This would help clinicians keep up to date on the best treatments and practices, which can be difficult given the daily requirements of providing care.
However, NLP systems in the future will have to carefully navigate privacy and data security. One of the reasons NLP systems have not yet been widely applied to psychotherapy is the lack of available data, partly for reasons of patient confidentiality. (Additionally, most psychotherapy has been done in person, and transcribing sessions is labor-intensive.) The Crisis Text Line’s current policy is to release data only to researchers who can state how their research proposal will help future people in crisis and who can come to its headquarters in person. Moreover, the Crisis Text Line decided to release anonymized data only once it had enough text conversations to ensure that individual people couldn’t be identified, according to CEO Nancy Lublin. This is one strategy that NLP systems might adopt in the future, but further concerns about privacy and security will undoubtedly arise.
Current NLP is not near the performance of a professional therapist, or even a listening friend. Siri can’t interpret your significant pause, or grasp the intricacies of your internal mental state. But NLP can be applied to questions that are at too big a scale for humans to answer alone (who needs help right now? how can we support everyone?) and that benefit from conglomerating experience across many people (when do successful therapists say encouraging comments, and is it helpful in most cases?). Keep an eye out for NLP—it is poised to be important for scaling good mental health care, and for tackling the age-old problem of getting help to people whenever and wherever it is needed.
Featured image: Hand holding a phone.
Abbe, A., Grouin, C., Zweigenbaum, P., & Falissard, B. (2016). Text mining applications in psychiatry: a systematic literature review. International journal of methods in psychiatric research, 25(2), 86-100.
Alhanai, T., Ghassemi, M., & Glass, J. (2018). Detecting Depression with Audio/Text Sequence Modeling of Interviews. In Proc. Interspeech (pp. 1716-1720).
Althoff, T., Clark, K., & Leskovec, J. (2016). Large-scale analysis of counseling conversations: An application of natural language processing to mental health. Transactions of the Association for Computational Linguistics, 4, 463.
Calvo, R. A., Milne, D. N., Hussain, M. S., & Christensen, H. (2017). Natural language processing in mental health applications using non-clinical texts. Natural Language Engineering, 23(5), 649-685.
Fitzpatrick, K. K., Darcy, A., & Vierhile, M. (2017). Delivering cognitive behavior therapy to young adults with symptoms of depression and anxiety using a fully automated conversational agent (Woebot): a randomized controlled trial. JMIR mental health, 4(2).
Hassanpour, S., Tomita, N., DeLise, T., Crosier, B., & Marsch, L. A. (2018). Identifying substance use risk based on deep neural networks and Instagram social media data. Neuropsychopharmacology, 1.
Hoermann, S., McCabe, K. L., Milne, D. N., & Calvo, R. A. (2017). Application of synchronous text-based dialogue systems in mental health interventions: systematic review. Journal of medical Internet research, 19(8).
Pace, B., Tanana, M., Xiao, B., Dembe, A., Soma, C., Steyvers, M., & Imel, Z. E. (2016). What about the words? Natural language processing in psychotherapy. Psychotherapy Bulletin, 51(1), 17-18.