
In late December 2024, New York State Senator Andrew Gounardes’s legislation to protect domestic violence survivors was signed into law. Focused on the intersection of tech and domestic violence advocacy, the law is an attempt to ensure that abusers can no longer use tracking technology installed in vehicles to stalk survivors. The legislation comes as law enforcement struggles to address the evolving role of technology in domestic violence cases.
Yet, beyond legal measures, a lack of awareness and resources continues to leave survivors vulnerable. The World Health Organization (WHO) estimates that roughly one in three women worldwide have experienced some form of domestic violence in their lifetime. In the United States, 3 in 10 women (29 percent) and 1 in 10 men (10 percent) report enduring rape, physical violence, and/or stalking by a partner, with significant effects on their wellbeing.
Despite the prevalence of abuse, only 2.5 percent to 15 percent of these cases are reported, often due to stigma, shame, financial constraints, or immigration status. Survivors also face barriers to accessing online resources and financial independence. Research suggests that financial abuse occurs in 99 percent of domestic violence cases, further limiting access to resources.
In response to these barriers, technology activists in the United States and worldwide are leveraging generative artificial intelligence (AI) and chatbots, even virtual reality (VR), to provide accessible, innovative solutions to help survivors where traditional resources may fall short.
AI Chatbots in Survivor Support
Anne Wintemute cofounded Aimee Says, an online abuse awareness platform whose main feature is Aimee, an AI chatbot designed to help survivors recognize abuse and guide them on what steps to take next. With such high rates of abuse in the United States alone, Wintemute and her cofounder Steven Nichols created Aimee as a way to reach domestic violence survivors unable to get through to overburdened human services and resources.
“There are advocates who are amazing with some people and some who don’t do so well with others,” Wintemute tells NPQ. “We got feedback just today,” she adds, “where someone said, ‘I prefer dealing with Aimee over advocates.’ Advocates can be rushed or not connected enough, even though none of these things are intentional.”
And Aimee isn’t the only AI chatbot in the domestic violence realm. DomesticShelters.org, a program of Alliance for HOPE International, recently launched Hope, another chatbot aimed at guiding survivors through identifying abuse and taking actionable steps.
Ashley Rumschlag, National Director of DomesticShelters.org, explains that they created this chatbot to support the countless people who come onto their website but don’t reach out for help.
“So many people…never seek formal services from a domestic violence provider [because] they don’t want to talk to someone,” Rumschlag tells NPQ, pointing out that the culture of “victim blaming” that is common in the United States has a lot to do with their reluctance. And there are many people, Rumschlag notes, that think, He hasn’t hit me yet, so it’s not abuse. But DomesticShelters.org wants people to be able to validate their own feelings and overcome a sense of It’s not really happening.
So, why AI? Survivors often fear judgment, retaliation, or burdening others. Many hesitate to disclose abuse even to close friends, let alone advocates. AI offers anonymity, non-judgmental guidance, and round-the-clock availability.
“When someone talks to AI [about] what they feel, they have all the attention, plus they don’t feel guilty as it can be difficult for survivors to lean on someone because they feel guilty for taking those resources,” Wintemute says, adding, “whereas with AI you can say anything, and it’s never going to say ‘My kids shouldn’t come over,’ or ‘I’ll call your mom.’”
Of course, tech interventions carry the added risk of false or misleading information. Wintemute agrees that no system is foolproof in that regard. “We have had users who say Aimee got [the] phone number to this place wrong…so we’ll go in and fix that,” she says, adding that if users allow them to access the chat, they can go in and verify that the information is accurate.
Technology as a Complement, Not a Replacement
While AI provides a valuable first line of support, it shouldn’t replace human resources.
Rhiana Spring, founder of Spring ACT, created Sophia, the world’s first AI chatbot for abuse survivors, emphasizing that AI should work alongside human advocates.
“Sophia isn’t better [than humans],” Spring tells NPQ, “it’s filling a gap. It’s a support, an additional resource for [domestic violence] support services.…Sophia’s aim is to bring the survivor to a support service and to gather evidence.”
She also points out that chatbots like Sophia help streamline and improve the quality of support that activists and advocacy organizations can provide: “So many NGOs are underfunded, and so technology helps support them by automating the resources survivors have. We improve resources by increasing time spent [with human advocates] qualitatively, not quantitatively.”
Spring has worked with vulnerable communities worldwide and views AI as a scalable, cross-cultural tool. She notes that as of February, Sophia speaks more than 85 languages, making survivor support more inclusive for those facing language barriers. And in each country where Sophia is available, Spring ACT works with local domestic violence activists and advocates to supply the chatbot with the local resources and cultural nuance survivors in that region may need.
Virtual Reality for Empathy and Rehabilitation
AI isn’t the only tech being used in domestic violence advocacy. French startup Reverto uses virtual reality (VR) to help create empathy and conduct behavior training for people who have engaged in abuse.
Originally, the company developed VR simulations to help companies educate people on sexual harassment. Guillaume Clere, president of Reverto, shared that the government eventually reached out to them to create VR situations to help rehabilitate abusers reentering society.
“Before they come back [from prison], there’s a program with psychologists that aims to make them understand what survivors went through to develop empathy, but they didn’t have the tools they needed. They did role play, but it’s very difficult to do role play with abusers because most of them don’t have cognitive empathy,” Clere tells NPQ.
Using VR headsets, Clere shifted abusers from imagining being in someone else’s shoes to experiencing domestic violence from the survivor’s perspective, forcing them to confront the emotional and psychological damage they inflicted.
Though Reverto is currently only available in France, the concept raises important questions for the United States about criminal justice reforms and court-ordered rehabilitation programs, especially at a time when, as Wintemute points out, family courts don’t necessarily favor women looking to leave their abusive partners.
Navigating Digital Challenges
While AI and tech solutions hold promise, using them to fill in the gaps in domestic violence advocacy comes with its own kinds of challenges, particularly around privacy and survivor safety.
In 2024, multiple US states enacted legislation that strengthened protections for survivors of domestic violence. Twenty states now allow survivors to take what is known as “safe leave” or “safe time”—the right to time off in relation to sexual and domestic violence.
Under New York’s Melanie’s Law, all family and household members of a survivor are afforded the same process in court, including the ability to obtain an order of protection. And last year, California’s governor signed several new gun control measures, which expanded the state’s restrictions on gun ownership and introduced new protections for survivors of domestic violence.
As Wintemute notes, navigating security issues with online resources in the domestic violence space is complex, but the first concern should be the physical security of the survivors themselves.
“A lot of guidelines around security are not great. Calling 911 is not always a good option, so we’ve trained Aimee to identify the emergency system available and which route is best—and then try to help the survivor emotionally overcome those barriers,” says Wintemute. In other words, if a survivor were to tell Aimee they couldn’t call 911, she’d ask them if they could call someone else.
Data, privacy, and anonymity are other big issues. Rumschlag shares that their model doesn’t allow users to store data anywhere: “We’re not allowing users to store the chat. Once they leave the website and end the chat, they cannot store [it] anywhere on their device.”
Similarly, Wintemute points out that the team behind Aimee cannot read any of the chats Aimee is having with those reaching out to her. “If someone wants extra support and they reach out to me and give me access to their account, only then can I see it,” she says.
Sites like Aimee Says and DomesticShelters.org also feature quick-exit buttons that allow users to leave the site immediately if their abuser enters the room.
As Spring notes, privacy is the foremost concern: “Privacy and survivor safety is at the absolute core of everything we do. I’m a human rights lawyer, so when it comes to the ultimate decision of the organization, I’m always saying survivor safety first.”
However, these privacy policies create funding challenges that limit the platforms’ ability to scale: because they retain so little data, the organizations are often unable to demonstrate their impact to donors in the way other organizations can.
Expanding Advocacy Through Tech
Still, these organizations continue working with advocates to bring their chatbots and resources to more people who need them. Spring encourages volunteers to get on board with Spring ACT and other such initiatives, helping to raise awareness around tech solutions and abuse in general.
For many advocates and service providers, Wintemute shares, Aimee has allowed an additional form of care for the survivors they are helping. “Service providers are a finite resource. Time with them is limited, so they can send survivors home with Aimee to provide continuity of care,” she says.
Wintemute also points out that Aimee is available to any domestic violence resource organization to embed on their website as an additional way to help any survivors who may be looking for resources.
By investing in chatbots like Aimee and Sophia, which cater to survivors in multiple languages, organizations can also overcome cross-cultural barriers.
“Tech should be there to lower the barrier to human support and help with overburdening and underfunding,” Spring says.
With domestic violence remaining an urgent, endemic global issue, solutions like AI and other tech-driven tools offer new hope. Though they are only a first step in addressing the problem, by combining innovation with survivor-centered advocacy, advocates may be able to reach more people, break cycles of abuse, and build safer futures for all.