A collaborative team from Stanford, Carnegie Mellon, and Georgia Tech has created an artificial intelligence model that provides feedback to novice peer counsellors.

In a study accepted for the 2024 Association for Computational Linguistics conference, researchers have unveiled an AI-based model designed to improve the training of peer counsellors. The innovation comes at a time when nearly one in five American adults struggles with mental illness, leaving a large share of the demand for support unmet.

The project, supported by the Stanford Institute for Human-Centered AI, brings together expertise from both computer science and psychology. Stanford computer scientist Diyi Yang and psychologist Bruce Arnow led the interdisciplinary effort.

Dr. Arnow emphasised the importance of this collaboration: "AI has enormous potential to help improve both the quality and efficiency of psychotherapy training, but the mental health community is not well equipped to develop an AI-assisted training model and the computer science community is not grounded in counselling intervention skills."

The AI model's feedback framework, developed with input from three Stanford psychotherapists, provides three crucial pieces of information: a clear definition of the counsellor's objective in the conversation, suggestions for improving the counsellor's response, and a specific suggested response aligning with the conversation's goal. To create the dataset, the researchers collected feedback from 400 emotional support conversations, co-annotated by GPT-4 and domain experts. The model incorporates an innovative self-checking mechanism to minimise poor feedback, and human experts have reviewed its output, confirming its value in coaching peer counsellors with limited formal training.
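The three-part feedback structure and the self-checking step described above could be sketched as a simple data object with a validity filter. This is purely an illustrative assumption, not the authors' implementation: the field names, the check logic, and the example feedback are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class CounsellorFeedback:
    """Hypothetical container mirroring the paper's three feedback components."""
    goal: str          # definition of the counsellor's objective in the conversation
    suggestions: str   # how the counsellor's response could be improved
    rewrite: str       # a specific suggested response aligned with the goal

def self_check(fb: CounsellorFeedback) -> bool:
    """Illustrative stand-in for the model's self-checking mechanism:
    reject feedback with any empty component, or a suggested response
    that merely repeats the improvement suggestions verbatim."""
    parts = (fb.goal, fb.suggestions, fb.rewrite)
    if any(not p.strip() for p in parts):
        return False
    return fb.rewrite.strip() != fb.suggestions.strip()

fb = CounsellorFeedback(
    goal="Validate the support-seeker's feelings before problem-solving.",
    suggestions="Reflect the emotion instead of jumping straight to advice.",
    rewrite="It sounds like this has been really overwhelming for you.",
)
print(self_check(fb))  # → True
```

In the actual system the components are produced by a language model and the self-check is likewise model-driven; the filter above only illustrates the idea of discarding incomplete or degenerate feedback before it reaches a trainee.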

Alicja Chaszczewicz, a Ph.D. student in computer science at Stanford and co-author of the paper, explained the model's potential applications: "We're trying to mimic this one part of it and, that way, provide both a pedagogical tool and a practical tool to support organisations that don't have enough instructors to give their counsellors feedback."

The researchers envision several potential uses for the AI model. In educational settings, it could enhance supervision where instructors face challenges providing detailed feedback on every counselling conversation. It could also create a "safe sandbox" where novice counsellors can practise with AI-generated patients and receive feedback without privacy concerns. Additionally, it offers a scalable training resource for organisations with limited resources for counsellor training.

Ryan Louie, a postdoctoral researcher at Stanford and co-author of the paper, emphasised that the goal is not to replace clinical supervision but to provide a complementary tool: "We are by no means attempting to replace the clinical supervision process, which is very complex."

As mental health support becomes increasingly critical, this AI-powered approach to peer counsellor training could significantly enhance the quality and accessibility of emotional support services. By leveraging artificial intelligence to provide targeted feedback and practice opportunities, this innovation has the potential to help address the growing demand for these crucial services.
