AYA

Created an AI-powered mental health tool that meets Gen Z where they are, fostering healthier practices while reinforcing, but never replacing, the role of human therapists.

Role

Product Designer

Tools/Skills

Figma, Miro, AI, Qualtrics, Procreate

Team

Solo Project

Duration

12 weeks (2025)

PROBLEM

AI is no longer just a tool for convenience; for younger generations especially, it has become a confidant and a first stop for mental health support

From 2024 to 2025, Harvard Business Review reported that “therapy/companionship” rose from the #2 to the #1 most common use case for AI. The Wall Street Journal highlighted “Sonny,” an AI chatbot being adopted by school districts to meet rising demand for student mental health support. On social media and Reddit threads, Gen Z openly discusses turning to ChatGPT for therapeutic conversations.

This trend reflects both opportunity and risk. On one hand, AI can help address the shortage of mental health providers and offer immediate, accessible support. On the other, over-reliance on unregulated chatbots risks harmful outcomes and undermines the nuanced care only human therapists can provide.

SOLUTION

Meet AYA

A tool that leverages AI as a bridge to healthier mental health practices while reinforcing, rather than replacing, the role of human therapists.

PRIMARY RESEARCH

Turning to my own community to understand how people are really interacting with AI in the context of mental health

I surveyed 15 people about their experiences with AI for mental health. Only 5 of the 15 reported feeling comfortable sharing personal thoughts with AI, and 67% expressed distrust, especially around sensitive topics. Yet all acknowledged that AI is here to stay and emphasized the need for full transparency when interacting with these tools. Among the 5 who do use platforms like ChatGPT to process emotions, none considered it a replacement for human therapy; they cited convenience, fear of judgment, and affordability as their main reasons for turning to AI.

SECONDARY RESEARCH

Understanding the broader context and potential of AI in mental health

COMPETITIVE ANALYSIS

Uncovering market gaps

Competitors like ChatGPT, Abby, TherapyAI, and Myentries often encourage ongoing reliance on AI instead of connecting users with real therapists. A lack of clinical oversight and safety-focused design, combined with engagement-driven incentives, can create inconsistent experiences and even promote misuse.

DESIGN GOALS

Build a tool that empowers users to seek real connection rather than replacing it

PERSONAS & THEIR JOURNEY

Imagining the Gen Z user

Chloe is a bright and ambitious 22-year-old university student majoring in graphic design. She's passionate about her art and very social, but often finds herself overthinking social interactions and personal relationships, leading to anxiety. She's proactive about her mental well-being and is open to tools that help her process her emotions.

"I wish I could just hit pause on my brain sometimes and figure out what I actually want to say."

Frustrations/Pain Points:

  • Difficulty expressing her feelings in the moment.

  • Fear of conflict or misunderstanding with friends.

  • Sometimes feels overwhelmed by academic and social pressures.

USER JOURNEY

Placing the support-seeking persona in a context scenario helps identify the key actions the platform needs to support

After each conversation with the support model, the user can access a session summary of key highlights and important topics to discuss. They can then choose to share these notes with their chosen therapist or counselor and review them together before their session.
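
To make this flow concrete, here is a minimal sketch of how a session summary and its opt-in sharing step could be modeled. The data shape and the shareWithTherapist helper are illustrative assumptions for this write-up, not AYA's actual implementation.

```typescript
// Illustrative sketch only: field names and types are assumptions, not AYA's real data model.
interface SessionSummary {
  sessionId: string;
  createdAt: Date;
  keyHighlights: string[];        // short recap of what came up in the conversation
  topicsToDiscuss: string[];      // items the user may want to raise with their therapist
  sharedWithTherapistId?: string; // set only after the user explicitly chooses to share
}

// Sharing is opt-in: a summary never leaves the app unless the user triggers this step.
function shareWithTherapist(summary: SessionSummary, therapistId: string): SessionSummary {
  return { ...summary, sharedWithTherapistId: therapistId };
}
```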

WIRE-FRAMING & USABILITY TESTING

Understanding how users engage with AI support

FINAL OUTCOME

A refined AI journaling tool that encourages emotional reflection while guiding users toward real human support

SOLUTION HIGHLIGHTS

AYA evolved into an AI companion designed with empathy, boundaries, and purpose, creating a bridge between digital comfort and human connection

Mindful Onboarding

A simple and engaging start that only asks the most essential questions, ensuring the process feels easy and intentional.

Journaling & Chat

A space to log thoughts and feelings that carry into future sessions, helping users feel prepared and capturing the emotions they want to remember for their therapist. To prevent overreliance, gentle blockers are built in to discourage excessive use, and all advice emphasizes the importance of seeking real-life support.
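
To illustrate, a gentle blocker can be as simple as a daily session check like the sketch below. The two-session limit and the nudge copy are assumptions made for this example, not AYA's final rules.

```typescript
// Hypothetical sketch of a gentle usage blocker; the limit and wording are example values.
const DAILY_SESSION_LIMIT = 2;

function checkSessionAllowance(sessionsToday: number): { allowed: boolean; message?: string } {
  if (sessionsToday < DAILY_SESSION_LIMIT) {
    return { allowed: true };
  }
  // Rather than a hard lock, the app nudges the user toward real-life support.
  return {
    allowed: false,
    message:
      "You've reflected a lot today. Consider reaching out to a friend or your therapist to talk it through.",
  };
}
```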

Smart Matching & Resources

Matches users with therapists based on their concerns and personal profile, while also offering personalized resources drawn from journal entries and user data.
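
As a simplified sketch of how concern-based matching could work: rank therapists by how much their specialties overlap with the user's stated concerns. The TherapistProfile fields and the overlap scoring are illustrative assumptions, not the production matching logic.

```typescript
// Illustrative matching sketch; profile fields and scoring are assumptions.
interface TherapistProfile {
  name: string;
  specialties: string[]; // e.g. "anxiety", "relationships", "academic stress"
}

// Rank therapists by how many of the user's stated concerns they cover.
function matchTherapists(userConcerns: string[], therapists: TherapistProfile[]): TherapistProfile[] {
  const score = (t: TherapistProfile) =>
    t.specialties.filter((s) => userConcerns.includes(s)).length;
  return [...therapists].sort((a, b) => score(b) - score(a));
}
```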

TAKEAWAYS

AYA highlights the importance of a healthier relationship with AI tools and emphasizes the need to seek real-life care for well-being

Project outcomes

Of the 10 participants, 80% felt that AYA’s limited chat sessions encouraged reflection and prevented over-reliance on AI, and 70% found the prompts to connect with friends or therapists supportive and helpful. These insights reinforced the importance of setting clear boundaries for AI interaction and designing prompts that guide users toward real-life human connection. They informed the final iteration of AYA, ensuring the tool empowers users while promoting healthier, more authentic support networks.

Why this matters

Reflecting on my own therapy experiences, I realized how easy it is to over-rely on AI, even when friends and loved ones are available. Cultural factors in my community, including mental health stigma and norms around handling conflict independently, make AI a safe space to share feelings without judgment. Designing for this context requires fostering ethical AI use and encouraging balanced human-AI relationships.

Looking ahead

Working on AYA revealed the tension between perceived empathy and performative support, highlighting the societal and ethical limits of AI-mediated care. These insights inspire my Major Studio 1 thesis, which explores authenticity, trust, and boundaries in human-AI relationships through interactive design. Future iterations could integrate biofeedback from mobile devices or smartwatches to detect stress and offer personalized support, extending human-centered AI care responsibly.
