Brief
Mentation is an AI-powered mental health companion built to reduce stigma and provide accessible, affordable support. It bridges the gap between self-help and professional care through journaling, analytics, and an empathetic chatbot.
The Problem
"In India, where 85% of individuals with mental health concerns never seek professional care" – WHO
Mental health remains taboo, especially in India. Mentation addresses this by offering:
- Culturally intelligent AI support
- DSM-5 aligned emotional guidance
- Stigma-free escalation to professionals

Goal
To design and validate an AI-powered mental health companion that reduces stigma, improves accessibility, learns each user's behaviour and patterns over time, and provides clinically grounded emotional support, seamlessly bridging the gap between self-help and professional care.
Secondary Research
Stigma:
- WHO (2021): India has only 0.75 psychiatrists per 100,000 people, far below the global average.
- National Mental Health Survey (2016): 80% of Indians with mental health issues avoid treatment due to stigma and cost.
Tech Adoption:
- NIMHANS (2021): Most digital mental health tools lack regional language support and cultural sensitivity.
- McKinsey (2022): Mental health app downloads surged 300% during COVID-19, but retention remains low.
AI in Mental Health:
- Nature Digital Medicine (2021): AI tools can bridge care gaps but require clinical oversight for safety.
- Harvard Business Review (2022): Culturally adapted AI chatbots reduce burden on professionals.
Primary Research
Interviews & Surveys
Methodology:
- User Interviews (16 participants, 18–35 years, urban India)
- Survey (150 respondents)
- Expert Consultation (Clinical psychologists)
Key Insights:
- 62% avoided therapy due to societal judgment.
- 78% cited cost as a barrier to professional help.
- Users value journaling and reflecting on past thoughts and actions: 55% used journaling or self-help apps but wanted more personalised support.
- Privacy is a top concern: 65% worry about data security in mental health apps.
- Users need motivation and visible metrics: 57% abandon journaling apps within a week or two for lack of insights and support.

Psychologist Consultations
Collaborated with three mental health professionals to:
1. Identify red-flag emotions (e.g., prolonged sadness, anxiety spikes); a detection sketch follows this list.
2. Validate conversational flows for therapeutic effectiveness.
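To make the first point concrete, here is a minimal illustrative sketch of how red-flag patterns could be surfaced from self-reported mood scores. The thresholds, field names, and 0–10 scoring scale are hypothetical placeholders, not the clinicians' validated criteria.

```python
from dataclasses import dataclass

# Hypothetical daily mood entry; scores are self-reported on a 0-10 scale.
@dataclass
class MoodEntry:
    day: int
    sadness: float
    anxiety: float

def red_flags(entries: list[MoodEntry],
              sadness_threshold: float = 7.0,
              prolonged_days: int = 5,
              anxiety_spike: float = 4.0) -> list[str]:
    """Return human-readable flags; all thresholds are illustrative."""
    flags = []
    # Prolonged sadness: high sadness on N consecutive days.
    streak = 0
    for e in entries:
        streak = streak + 1 if e.sadness >= sadness_threshold else 0
        if streak >= prolonged_days:
            flags.append(f"prolonged sadness through day {e.day}")
            break
    # Anxiety spike: a large jump between consecutive days.
    for prev, curr in zip(entries, entries[1:]):
        if curr.anxiety - prev.anxiety >= anxiety_spike:
            flags.append(f"anxiety spike on day {curr.day}")
    return flags
```

In Mentation, a flag like this would only prompt a gentle, stigma-free suggestion to escalate to a professional, never an automated diagnosis.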
Cultural Context
- Users preferred Hindi/regional languages (especially Hinglish) for deeper emotional expression.
- Privacy was critical: "My family can't know I'm struggling" (a 24-year-old participant).
User Persona
Anika | 22
College Student (MUMBAI)
Goals: Build a structured way to track stress and emotional swings and improve her mental well-being.
Frustrations: Anxious about inconsistent study habits; journals often but loses track.
Tech Use: Comfortable with apps and explores many of them to fit different aspects of her life.
Rohan Shah | 28
IT Professional (INDORE)
Goals: Balance pressure from work and personal life, and find time-efficient support.
Frustrations: Finds therapy expensive and his schedule unpredictable.
Tech Use: Comfortable with apps but wary of privacy violations and data leaks.
Priya Sharma | 35
Homemaker (BHOPAL)
Goals: Seek emotional support without scrutiny from family or anyone else.
Frustrations: Unable to share her feelings; feels lonely and cornered at times.
Tech Use: Not comfortable exploring complicated apps, but well familiar with social media.
Aarav | 25
IT Professional (MUMBAI)
Goals: Vent, manage excess stress and emotional baggage, and get a handle on them.
Frustrations: Stress from work, fear of judgment, lacks time for therapy.
Tech Use: Loves exploring and building new apps, but is tired of needing too many apps for too many things.
User Journey Map
Anika's Journey | 22
Over long-term, regular use of Mentation.
Aarav's Journey | 25
Over immediate and repeated use of Mentation.
Key Features
Prototype & Testing
Initial Prototypes:
- Developed low-fidelity wireframes to validate core user flows.
- Iterated on chatbot interactions to ensure empathetic, natural-sounding responses.
- Designed A/B tests for journaling prompts and the analytics UI.
- Conducted remote usability testing with 12 participants.
Findings:
- 85% found the chatbot engaging but suggested more varied responses.
- 70% wanted better visual representations of mood trends.
- 60% were concerned about long-term data security.
Iterations Based on Feedback:
- Refined chatbot responses using diverse NLP training datasets.
- Enhanced analytics UI with easy-to-understand mood graphs.
- Strengthened security measures by implementing end-to-end encryption (see the sketch after this list).
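As a sketch of the encryption direction, the snippet below encrypts a journal entry on the client before it leaves the device, using the Python cryptography library's Fernet. This shows only the symmetric-encryption core; the app's actual key exchange and management (what makes a scheme truly end-to-end) are out of scope here, and the entry text is illustrative.

```python
from cryptography.fernet import Fernet

# Assumption: the key is generated and kept only on the user's device
# (e.g., in the OS keystore), so the server stores ciphertext it cannot read.
key = Fernet.generate_key()          # done once, at account setup
cipher = Fernet(key)

entry = "Felt anxious before the exam; journaling helped."
ciphertext = cipher.encrypt(entry.encode("utf-8"))        # what the server stores
plaintext = cipher.decrypt(ciphertext).decode("utf-8")    # only possible on-device
assert plaintext == entry
```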
Challenges
Challenge: We needed to balance privacy with personalisation.
Solution: Onboarding asks for minimal data; users control what's stored.
Challenge: Balancing AI assistance with human-like empathy, along with Hindi NLP accuracy; Lana, Mentation's chatbot, had to feel supportive rather than robotic.
Solution: Fine-tuned a Bedrock-hosted model with Hinglish and Hindi datasets.
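For flavour, here is a minimal sketch of calling a Bedrock-hosted model with an empathetic, Hinglish-aware system prompt via boto3's Converse API. The model ID, region, and prompt wording are placeholder assumptions; the fine-tuning itself happens offline and is not shown.

```python
import boto3

# Placeholder model ID; in production this would point at the fine-tuned model.
MODEL_ID = "anthropic.claude-3-haiku-20240307-v1:0"

client = boto3.client("bedrock-runtime", region_name="us-east-1")

system_prompt = (
    "You are Lana, a warm, non-judgmental mental health companion. "
    "Mirror the user's mix of Hindi, Hinglish, and English, validate "
    "feelings first, and never end the conversation abruptly."
)

response = client.converse(
    modelId=MODEL_ID,
    system=[{"text": system_prompt}],
    messages=[{
        "role": "user",
        "content": [{"text": "Aaj bahut stress ho raha hai yaar."}],
    }],
)
print(response["output"]["message"]["content"][0]["text"])
```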
Challenge: User drop-off; users often abandon journaling apps.
Solution: Added gamification and tuned prompts for human-like empathy and engagement (an illustrative streak sketch follows).
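As one illustration of the gamification direction, here is a small hypothetical streak counter that feeds the visible metrics users asked for; the function name and rules are assumptions, not Mentation's shipped logic.

```python
from datetime import date, timedelta

def journaling_streak(entry_dates: set[date], today: date) -> int:
    """Consecutive days journaled, ending today or yesterday
    (so the streak isn't shown as broken before the user writes today)."""
    day = today if today in entry_dates else today - timedelta(days=1)
    streak = 0
    while day in entry_dates:
        streak += 1
        day -= timedelta(days=1)
    return streak

# Example: entries on the three days before today -> streak of 3.
today = date(2024, 11, 12)
entries = {today - timedelta(days=i) for i in range(1, 4)}
print(journaling_streak(entries, today))  # 3
```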
Challenge: Frictionless user flow.
Solution: Simplified onboarding and navigation to reduce cognitive load and improve accessibility.
Challenge: The hackathon timeline was compressed; we had to prototype rapidly while maintaining quality.
Challenge: AWS Bedrock was down for 3 days (Nov 9–12), delaying testing and improvements.
Results & Impact
Well-functioning therapy agent: Users said Lana felt human, nurturing, and safe to open up to, never terminating conversations abruptly.
POC Launch: 60 users, 80% retention at day 50.
Future Roadmap
- AI Twin: Adapts to the user over time; shareable with trusted contacts.
- Language Expansion: Hindi & regional languages.
- Reports: Therapist-ready emotional summaries.
