
Editors’ note: This piece is from Nonprofit Quarterly Magazine’s winter 2024 issue, “Health Justice in the Digital Age: Can We Harness AI for Good?”
Amara sits on her couch in the heart of Harlem, staring at her phone in disbelief. Gratitude? The mindfulness app she’s been using is suggesting her struggle with workplace racism is just stress, and that some thankfulness might be in order. But Amara knows better.
As both a Black woman and a clinical social worker, Amara spends her days helping other Black women navigate a mental healthcare system that often fails them. But lately, it’s Amara who could use some support. Years of advocating for patients and battling stress and anxiety from not feeling valued or seen at her workplace have left her feeling worn down. Finding a therapist for herself is no small feat—those who truly resonate with her experience don’t accept her insurance, and those who do lack the cultural competence to see her beyond a checklist. Seeking short-term solutions, Amara turned to mindfulness apps, hoping they could offer relief amid the chaos of her workdays.
A colleague had recommended MindfulVibe, an AI-based app designed to help manage anxiety and stress through mindfulness. Cautiously, Amara gave it a try.
At first, MindfulVibe seemed promising. The app checks in daily, offering affirmations like, “You are stronger than your struggles” and “Focus on the positive.” But when Amara types, “I’m tired of dealing with microaggressions and racism. It’s constant at work, and I feel invisible!” into the AI chat feature, MindfulVibe responds with, “Everyone experiences stress at work. Try focusing on gratitude today.”
Amara’s anger flares. Her pain is not just workplace stress. The emotional toll of navigating microaggressions and systemic racism can’t be smoothed over with a generic prompt. MindfulVibe sees her feelings as data points that can be matched with a scripted reply. MindfulVibe fails to recognize her experience as a Black woman navigating the double burden of being both a professional and a patient in a system that overlooks her experience—a system that perpetuates disparities in mental healthcare.
This is not the first time that technology has failed her, and she doubts it will be the last. MindfulVibe, like much of the healthcare system, is designed for the masses—people who don’t carry the weight of systemic oppression. For Amara and countless women like her, many mental health apps fall short. They lack the empathy and understanding required to address not just anxiety in general but anxiety shaped by structures and systems of injustice.
Amara knows all too well that many of today’s technological innovations, while marketed as tools for better care for humanity, often deepen existing disparities. Behind the sleek interfaces and promises of efficiency lie hidden biases—algorithms that reinforce the same racist, sexist, and capitalist agendas that have marginalized Black communities for centuries. She reflects on how mental health apps like MindfulVibe are no exception. These platforms claim to offer help, but their data sets are trained on populations that rarely include Black experiences. Their algorithms reflect the worldviews of the developers who built them—often White, often male, often disconnected from the struggles faced by women like her. As a result, the recommendations, responses, and solutions they offer tend to feel hollow or irrelevant, reinforcing the idea that healing is a one-size-fits-all process detached from the realities of racism, implicit bias, and socioeconomic disparities.
She thinks of the women she counsels, who have also turned to apps like MindfulVibe in desperation, only to be met with cookie-cutter solutions. They, too, have been told to “Focus on the positive” or “Just breathe,” as if experiencing racism could be washed away with a few deep breaths. But healing doesn’t happen in a vacuum; real healing requires justice, empathy, and love—none of which MindfulVibe seems capable of delivering.
Amara sees that this bias isn’t just an oversight—it’s a symptom of a larger problem in how we define innovation. Capitalist incentives drive mental health startups to prioritize scale, speed, and profit over depth, compassion, and justice. These mindfulness apps commodify wellness, selling mental health as a product while ignoring the systemic inequalities that contribute to poor mental health in the first place. In this framework, resilience is framed as the individual’s responsibility to cope rather than a collective effort to dismantle the structures that perpetuate harm. To Amara, true innovation must not only provide users with tools for personal healing but also confront the very systems that cause their distress. It must be an act of care, love, and justice, and create spaces for Black women to shape their own futures.
Questions begin forming in Amara’s mind: What if AI could be different? What if it didn’t just treat symptoms but also understood the deeper causes of mental health challenges? What if it could embody the justice, empathy, and love that MindfulVibe lacks?
As her vision for a new kind of AI mental health tool takes shape, Amara understands that true innovation cannot simply focus on symptom management—it must address the lived realities shaping emotional and mental wellbeing. The system isn’t broken by accident; it was built to serve certain groups while excluding others. For Amara, advancing justice through innovation means designing technology that doesn’t just reflect the experiences of marginalized communities but actively works to dismantle the oppressive systems that harm them. Her AI wouldn’t just send affirmations or encourage mindfulness practices; it would recognize the mental toll of systemic racism, generational trauma, and inequality, and offer responses that validate lived experiences and guide users toward meaningful self-empowerment.
This kind of AI would refuse to perpetuate the biased frameworks that dominate tech development. Instead of framing the individual as the problem to be fixed, it would shed light on how societal systems—like workplace discrimination, healthcare inaccessibility, gender inequality, and economic barriers—are key contributors to mental health decline. By incorporating the lived experiences of Black women and communities into its design, this technology would amplify voices that are often silenced, holding space for healing rooted in justice and collective resilience.
When users report workplace discrimination, the AI wouldn’t simply suggest that they focus on gratitude—it would validate their experience and provide resources for addressing microaggressions or seeking legal support. Instead of generic affirmations like “Focus on the positive,” the AI could respond with, “Your experience of feeling invisible at work is valid. Here are some strategies for addressing microaggressions and advocating for yourself in hostile environments, and some resources on workplace rights. Just know you are seen, heard, and valued while you overcome this experience.”
For Black women struggling with the toll of systemic racism, the AI could offer empathy-oriented responses and direct users to a directory of racial-healing circles, trauma-informed therapists centering Black women’s experiences, and community-led movements for justice. To ensure its exercises resonate with this community, the app could feature guided meditations and affirmations that reflect the ancestral trauma and the social and cultural norms of its users. For instance, a guided visualization could center on embracing cultural heritage by inviting the app’s users to connect with the wisdom and resilience passed down through generations.
When a user struggles to find a therapist who understands their racial and cultural background, the app could recommend culturally competent healthcare providers. It could say: “It can be hard to find therapists who truly understand your lived experience. Here’s a directory of mental health professionals who specialize in serving Black women and accept sliding-scale or low-cost insurance options.” It could also offer guidance on navigating healthcare systems that perpetuate exclusion or bias.
More than just a tool for personal coping, Amara’s AI app would encourage collective action. It would also connect users to mutual aid networks, financial literacy resources, and advocacy groups, empowering them to address the broader socioeconomic forces contributing to their mental health. By linking individual healing with societal change, this AI would offer trauma-informed, justice-driven solutions rooted in resilience, advocacy, and empowerment.
Amara’s vision is clear: technology should not only reflect lived experiences but also challenge the oppressive systems shaping them, transforming innovation into a tool for equity, justice, and healing. “We must create technologies that serve not just efficiency but also equity,” she thinks. Amara closes her eyes, feeling the weight of the day ease from her shoulders. MindfulVibe may have failed her, but the AI she envisions will not. It will reflect her and her community, and hold their pain with the care and understanding they have always deserved.
This is the future of innovation: justice in action. And it starts with us.