In November 2022, OpenAI, an artificial intelligence (AI) research company founded in 2015, released a service called ChatGPT. ChatGPT is built on a large language model trained on text from a wide range of sources (books, the web, etc.), and it produces automated text responses based on what you ask it to do. The company has made the service freely available to the public on its website. In only a few months, ChatGPT has made big waves across a number of industries, so let’s take a look at how behavioral health services can leverage it.
What Are ChatGPT and Other AI Chat Software?
OpenAI describes ChatGPT as “a cutting-edge AI software that can generate human-like text based on input it receives,” suggesting uses such as customer service chatbots, virtual assistants, and automated messaging systems. ChatGPT is a language system that can write whole sentences, or even letters and essays, in response to a natural-language prompt. The interface has an area where you type a query of any length, and ChatGPT’s response then appears word by word in real time. For example, you could ask ChatGPT to write an essay about the history of psychotherapy. The output it produces will be in whole sentences and typically does not match any exact sentences found on other websites or in books. Since it burst onto the scene, ChatGPT has set off alarm bells in high schools, higher education, and other settings, as people can use it to circumvent Turnitin and other plagiarism-detection systems without writing an original essay themselves.
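For the technically curious, here is a minimal sketch of how a developer might send a similar natural-language prompt to one of OpenAI’s language models through the company’s separate, paid API. Note the assumptions: this uses the pre-1.0 `openai` Python package and the `text-davinci-003` completion model, both of which are illustrative choices distinct from the free ChatGPT website itself.

```python
# Minimal sketch: sending a natural-language prompt to an OpenAI language model.
# Assumes the `openai` Python package (pre-1.0 interface) is installed and an
# API key is set in the OPENAI_API_KEY environment variable. The model name and
# parameters are illustrative; this is not how the ChatGPT website itself works.
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.Completion.create(
    model="text-davinci-003",  # illustrative model choice
    prompt="Write an essay about the history of psychotherapy.",
    max_tokens=400,            # cap on the length of the generated text
    temperature=0.7,           # moderate randomness in word choice
)

print(response.choices[0].text.strip())
```

As with the website, the same prompt can return different wording on each run, which is part of why generated text rarely matches existing sources verbatim.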
In many ways, ChatGPT is like Google and other search engines, which accumulate a gigantic corpus of data and then slice and dice it so it can be ranked and searched. But AI chat software takes this a step further by producing dynamic output that is tailored to your specific request and hard to distinguish from genuine, original writing. ChatGPT is currently one of the most promising AI chat platforms, as shown by extensive media coverage and a multi-billion-dollar investment by Microsoft in January 2023.
Benefits of ChatGPT for Behavioral Health Services
When we asked ChatGPT how it could improve behavioral health services, the software suggested that it can “automate repetitive tasks, freeing up valuable time for mental health professionals” and that it “can also be programmed to provide mental health assessments, health screenings, telehealth prompts, and other essential services, making mental health care more accessible and convenient for patients.” These sounded like promising functions.
However, we were a bit surprised that ChatGPT went on to claim it could provide “24/7 mental health care,” which it suggested would be of particular use to those who lack access to a mental health care provider. This claim is false: ChatGPT is not a replacement for a trained mental health practitioner, and it often presents shallow or incorrect information that may even be inferior to static webpages such as the Mayo Clinic’s. It was clear, at least as of this writing (early February 2023), that OpenAI had not yet programmed ChatGPT to avoid making false claims about its capabilities. That said, the company has dubbed the software a “Free Research Preview” and warns all users that it may “occasionally generate incorrect information” and even “occasionally produce harmful instructions.”
Nevertheless, ChatGPT may be useful to behavioral health practitioners for behind-the-scenes tasks such as writing first drafts of checklists, letters to insurance companies, telehealth prompts, and health screenings. Furthermore, AI chat software is expected only to improve over time.
Cons of ChatGPT
Most notably, ChatGPT is not a replacement for a trained behavioral health practitioner, and even mentioning it to patients runs the risk of causing harm. ChatGPT could give incorrect advice or even be chosen over sensible resources such as the 988 suicide prevention hotline, with potentially tragic results for patients. Another issue is that conversations with ChatGPT are not subject to HIPAA or other patient privacy protections; the company warns that personal or sensitive information should not be entered into the system. OpenAI is admittedly making ChatGPT free to the public in order to harvest data for training and improving the system, which is incompatible with patient privacy.
Because of the substantial risk of harm to patients, as well as violations of their privacy, it is ill-advised to recommend ChatGPT to patients, and doing so may even open your practice to legal liability. In addition, depending on what it is used for, ChatGPT may be less useful and accurate than what you would find on a regular, static website. For example, a recent popular press piece lamented that it recommended medication for a disorder that should generally be treated with cognitive behavioral therapy. For these types of queries, it may actually be better to search Google (or, better yet, ask a qualified practitioner). Although most behavioral health practitioners will quickly spot such misinformation, ChatGPT poses a particular risk to laypersons, who will probably not be able to distinguish accurate from inaccurate information.
How ChatGPT Can Be Used in Behavioral Health
Although it is ill-advised to recommend ChatGPT to patients, the software can have many behind-the-scenes uses in behavioral health services. For example, we asked it to write a series of telehealth prompts using this request:
“Write a series of 10 telehealth prompts for a behavioral health practice specializing in young adults.”
What ChatGPT produced could be a useful starting point for practitioners:
- How has your mood been since our last session?
- Can you describe any challenges or stressors you have faced recently?
- Have you been practicing the coping skills we discussed in our previous session?
- Can you share how your relationships with friends and family have been going?
- Have there been any changes in your daily routine or sleep patterns?
- Can you talk about how you have been managing any negative thoughts or feelings?
- How have you been handling any relationship conflicts or communication issues?
- Have there been any changes in your use of substances or other unhealthy coping mechanisms?
- Can you reflect on any positive self-care activities you have engaged in since our last session?
- What would you like to focus on or discuss in this session?
Beyond this, ChatGPT can also be asked to write items such as letters to insurance companies justifying certain treatments, appointment cancellation policies, and any number of other documents with broad applicability in behavioral health services and numerous other fields. The output it produces tends to be grammatically accurate and to follow good writing principles, although it sometimes uses circular language and lacks depth. However, ChatGPT can be prompted with specific requests and details that make its output more helpful, and when asked to write letters, it even inserts placeholders for needed information such as your practice’s name.
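To illustrate that kind of specific prompting, here is a hedged sketch, using the same illustrative `openai` package and model as above, of how a practice might request a first-draft insurance letter that leaves placeholders for identifying details. The prompt wording is made up for this example, and, consistent with the privacy concerns above, no real patient information should ever be entered.

```python
# Sketch: requesting a first-draft insurance letter with placeholders.
# The prompt, model, and parameters are illustrative assumptions; any draft
# produced this way still needs review and editing by a qualified practitioner.
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

prompt = (
    "Write a letter to an insurance company justifying continued weekly "
    "cognitive behavioral therapy for a young adult patient. Use placeholders "
    "such as [PRACTICE NAME], [PATIENT INITIALS], and [DATE] instead of any "
    "real identifying details."
)

response = openai.Completion.create(
    model="text-davinci-003",  # same illustrative model as the earlier sketch
    prompt=prompt,
    max_tokens=500,
)

print(response.choices[0].text.strip())
```

The placeholders are then filled in manually, keeping patient information out of the system entirely.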
At the very least, what ChatGPT offers can be a good starting point, particularly for practitioners and staff who lack confidence in their writing or experience writer’s block. Furthermore, AI chatbots are only expected to improve as the technology advances, which may make ChatGPT and similar services even more useful in behavioral health. Eventually, AI chat software may even be used directly in behavioral health care to improve accessibility, although at this time, ChatGPT is neither safe nor efficacious for such use.