Overview
Mental health support isn’t always available when people need it most. During moments of crisis, the gap between needing help and accessing it can feel insurmountable. ELI (Empathetic Listening Interface) is a voice-enabled AI mental health companion designed to provide immediate, empathetic support during anxiety attacks and depressive episodes. By combining natural voice interaction with therapeutic techniques, ELI offers a bridge between crisis moments and professional care.
The bot serves as an always-available first responder, offering personalized support through voice interaction that feels natural and comforting. It provides immediate assistance through breathing exercises, grounding techniques, and emotional support while maintaining strict privacy standards.
The Challenge
Mental health crises don’t operate on a schedule. Traditional support systems, while valuable, have inherent limitations:
I was contracted to create a solution that could provide immediate, consistent support while maintaining the empathetic quality of human interaction.
Note: The ElevenLabs platform still has bugs, and I have a support ticket open because the AI often doesn’t start the call correctly. If you select “Call AI Agent” and it defaults to listening mode, it’s having problems; if it opens by asking how you’re doing, it’s working.
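For readers who want to try the agent from their own page, here is a minimal sketch of how the call could be started and the “stuck in listening mode” symptom spotted. It assumes the ElevenLabs browser SDK (@elevenlabs/client) as I understand its documented API; the agent ID, callback shapes, and the five-second window are placeholders, not ELI’s production setup.

```ts
// Sketch: start an ElevenLabs Conversational AI session and flag the case
// where the agent never speaks first (the bug described in the note above).
// AGENT_ID and the timeout window are placeholder assumptions.
import { Conversation } from "@elevenlabs/client";

const AGENT_ID = "YOUR_ELI_AGENT_ID"; // placeholder

async function callEli(): Promise<void> {
  let hasSpoken = false;

  const conversation = await Conversation.startSession({
    agentId: AGENT_ID,
    onConnect: () => console.log("Connected to ELI"),
    onError: (message) => console.error("Conversation error:", message),
    // A healthy call opens with the agent asking how you're doing, i.e. the
    // session enters "speaking" mode shortly after connecting.
    onModeChange: ({ mode }) => {
      if (mode === "speaking") hasSpoken = true;
    },
  });

  // If the agent hasn't spoken within a few seconds, it is likely the
  // reported listening-mode bug; restarting the call usually helps.
  setTimeout(() => {
    if (!hasSpoken) {
      console.warn("ELI stayed in listening mode; try restarting the call.");
    }
  }, 5000);

  // When the session is over: await conversation.endSession();
}
```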
In-depth interviews with five individuals who experience anxiety and depression revealed crucial insights about crisis support needs. Key findings included:
I really liked that it would tell you about a breathing or meditation exercise, then actively walk you through it if you want. Personally, I feel like this is one of the most useful features.
Collaboration with healthcare professionals, including a medical doctor and a behavioral therapist, helped shape the bot’s therapeutic approach. Their input was crucial in:
I like that it’s starting a new session every time and not saving anything, even though it can be nice to have it remember things. One of my biggest gripes with how AI is being developed right now is reckless handling of copyrighted and personal information, so I appreciate that it had a good answer for that.
Research insights guided the development of ELI’s personality and voice characteristics:
I like the specific exercise such as breathing together.
The conversation architecture was designed to:
I like that it validates the emotions that are expressed in its responses. Sometimes people need that so badly.
The development process revealed important insights about voice AI technology:
I felt like I was almost talking directly to an actual person.
Privacy and safety were paramount in ELI’s design:
Initial user testing showed promising results across key metrics:
Areas identified for improvement included:
The responses it gave were very natural and used a lot of “affirming” words, by acknowledging the things I said and asking good follow up questions. Much like if I went to an actual therapist/counselor!
The next phase of development focuses on:
It gave me a number for a crisis text line, and the way it read it was a little off and made it hard to understand.
With the breathing exercises, it doesn’t know when to pause and let you breathe. Same with the meditations. So it would just keep going and talking without giving you the time to complete the actual exercise.
I couldn’t tell if it was saying 17 or 70, and when I asked it to repeat the number, it didn’t slow down or read it any differently.
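Both of these pacing issues are text-level problems rather than voice problems, so one likely direction is pre-processing the agent’s replies before they reach text-to-speech. The sketch below is illustrative, not the shipped fix: the `<break time="x.xs" />` pause tag is the syntax ElevenLabs documents for its voices, while the digit grouping, timings, and use of the US Crisis Text Line number (741741) as an example are my assumptions.

```ts
// Illustrative helpers for the two tester complaints: numbers read as
// quantities ("seventeen"/"seventy") and exercises with no time to breathe.

// Read a phone/text-line number digit by digit ("7 4 1, 7 4 1"), with a short
// pause between groups so it stays intelligible even at normal speed.
function speakableNumber(raw: string): string {
  const digits = raw.replace(/\D/g, "").split("");
  const groups: string[] = [];
  for (let i = 0; i < digits.length; i += 3) {
    groups.push(digits.slice(i, i + 3).join(" "));
  }
  return groups.join(' <break time="0.6s" /> ');
}

// Insert real pauses into a guided breathing script so the agent stops
// talking long enough for the user to complete each breath.
function breathingScript(cycles: number): string {
  const oneCycle =
    'Breathe in slowly. <break time="3.0s" /> ' +
    'Hold it. <break time="3.0s" /> ' +
    'And breathe out. <break time="3.0s" />';
  return Array(cycles).fill(oneCycle).join(" ");
}

// Example: assemble the reply text before sending it to text-to-speech.
const reply =
  `You can reach the Crisis Text Line at ${speakableNumber("741741")}. ` +
  `Let's try a short breathing exercise together. ${breathingScript(3)}`;
console.log(reply);
```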
ELI started with a clear mission: be there when other mental health support isn’t available. Through user research, testing, and collaboration with mental health professionals, we created a voice AI companion that people actually trust and use in their moments of need. The project showed that with the right approach, AI can meaningfully complement traditional mental health support while maintaining strong privacy and ethical boundaries.
Key achievements include:
While ELI continues to evolve, the core lesson stands: thoughtful AI implementation can help bridge critical gaps in mental health support. The next phase focuses on expanding personalization and refining interactions, always keeping our main goal in focus – providing compassionate, accessible support when it matters most.