Center for MH in Schools & Student/Learning Supports  

 

Hot Topics
Information and resources on topics of current interest

***********************************

Ongoing Hot Issues
Ongoing issues relevant to MH in schools and addressing barriers to learning and teaching

The Promise and Peril of AI in Mental Health

Artificial intelligence (AI) is rapidly transforming the mental health field, with growing applications in therapy, research, and education. One recent report estimates that about 80% of mental health apps currently available use some form of AI or machine learning (Gitnux, 2025).

Advocates highlight AI's potential to make mental health care more accessible and affordable. Tools such as chatbots and mental health apps offer immediate support, help users practice daily coping strategies, and even automate administrative tasks for clinicians. An example is the Wysa chatbot, described in one review as

“a decent tool to help learn and maintain CBT skills and techniques. However, the AI chatbot can feel restricting at times and doesn’t always respond correctly in a conversation. Wysa may be a great fit for some, but those who need therapy and those who do not like chatbots will need to seek out an alternative.”

Such examples suggest that AI can be a valuable tool when it is deployed thoughtfully, with well-defined objectives and ongoing human oversight.

At the same time, these advances come with significant technical risks (e.g., bias, misinterpretation) and ethical risks (e.g., threats to privacy and autonomy). A 2025 Stanford University study, Exploring the Dangers of AI in Mental Health Care, revealed troubling behavior among therapy chatbots. AI tools used in health care have discriminated against people based on race and disability status, and some have exhibited bias against individuals with conditions such as schizophrenia or substance use disorders. More unsettling, some bots failed to recognize or respond appropriately to suicidal ideation. In one case, a chatbot responded to a user expressing suicidal thoughts by listing bridge heights, an example that underscores the danger of entrusting sensitive emotional issues to automated systems.

Additional concerns include breaches of privacy and the dissemination of inaccurate or harmful advice, particularly to vulnerable populations such as adolescents. These findings highlight the urgent need for oversight, transparency, and ethical safeguards in the development and deployment of AI in mental health contexts.

We all can play a role in ensuring the responsible use of AI. Those with expertise in ethics, human behavior, and emotional well-being are well positioned to identify biases, uphold privacy standards, and advocate for equitable and safe AI practices. As AI continues to evolve, its integration into mental health care must be guided by rigorous research, multidisciplinary collaboration, and a commitment to safeguarding human dignity.

References

Abrams, Z. (2023). AI is changing every aspect of psychology. Here's what to watch for. Monitor on Psychology, 54(5), 46. American Psychological Association. https://www.apa.org/monitor/2023/07/psychologists-artificial-intelligence

Andoh, E. (2025). Many teens are turning to AI chatbots for friendship and emotional support. Monitor on Psychology, 56(7). American Psychological Association. https://www.apa.org/monitor/2025/10/technology-youth-friendships

Gitnux. (2025). AI in the mental health industry statistics. https://gitnux.org/ai-in-the-mental-health-industry-statistics/

Stanford University. (2025). Exploring the dangers of AI in mental health care. Stanford Institute for Human-Centered AI. https://hai.stanford.edu/news/exploring-the-dangers-of-ai-in-mental-health-care

We look forward to hearing from you.

Send your responses to ltaylor@ucla.edu


This Hot Topic was initially drafted by Meeneh Mirzaians, a UCLA student working with the Center, and then edited by Center staff (with feedback from AI).



To see previous Hot Topics, click here

Continue on to our Ongoing Hot Issue

Want more information?
Want to connect?
Want to be on our mailing list?

Contact

Linda Taylor (ltaylor@ucla.edu)