October 6, 2015; Boston Globe
The state has a public contract to safeguard the well-being of all citizens, especially the vulnerable. But fulfilling this contract is difficult for an overwhelmed system that must deal with overlapping issues of poverty, mental health, income, and ethnicity. NPQ has reported many times on the shortcomings of the national child welfare system, the enforcement of child welfare laws, and even the controversial practice of rehoming.
Around the world, child welfare authorities come under pressure to solve the guessing game of whom to save, especially when a tragic case of negligence hits the media. One example is the 2012 UK case of Daniel Pelka, who died from starvation and abuse even though authorities had contact with the family multiple times. In the U.S., a similar trend of “missing deadly patterns” has been documented by the Austin American-Statesman. A striking case here is Christopher Berry, deemed “lower risk” by social workers despite a history of arrests and violence, who went on to shake his son to death.
In the Pelka case, the social care department stated that it couldn’t rule out a similar event happening again in the future because “social care is not a science. All we can do is be clear about our requirements and ensure professionals are properly supported.” However, in the past couple of years, authorities and nonprofits in the U.S. have been using the science of predictive analytics to save abused children.
Hillsborough County, Florida, saw a dramatic improvement after the launch of its analytics system in January 2013; no children there have died from abuse since that time. “We’ve been able to narrow down which cases are high risk, and stop doing cookie-cutter supervision,” said Paul Penhale, a case management supervisor at Gulf Coast Jewish Family & Community Services.
But detractors say that predictive analytics is just another way for Big Brother to interfere in people’s lives. The data may be wrong and single out families for harassment. “It scares the bejeezus out of me,” said Witold Walczak, legal director of the American Civil Liberties Union of Pennsylvania. “That should scare anybody. It’s like putting a name into a machine and making a determination about whether that person can have their child or not.”
While these fears are valid, nonprofits are showing positive outcomes. For example, the Child Welfare and Policy Practice Group (CWG) lent support to an analytics project for the Florida Abuse Hotline. “Our research showed the tremendous positive effect of a visit from a caseworker,” said analytics expert Albert Blackmon. “But child protective services agencies across the country are overburdened. Analytics can help caseworkers identify the most at-risk kids as well as pinpoint the services that can lead to the most positive outcomes.”
The National Council on Crime and Delinquency (NCCD) works with agencies and jurisdictions on predictive analytics for child welfare. During an NCCD webinar on October 8th, Chief Program Officer Jesse Russell, PhD, gave examples of how predictive analytics improves services, specifically in the child welfare to juvenile justice crossover, hotline screening analysis, and the commercial sexual exploitation of children. He sees predictive analytics simply as a learning tool and “a way of looking at past experiences and how they might apply to the future.”
In response to a question about false positives, Dr. Russell said that “although these are innocuous when, say, Netflix predicts the wrong movie a person will like, they are obviously less so in interventions with families. If an agency is already intervening, analytics can help their decision. The idea isn’t to rely on analytics alone but to ask how we can engage lots of stakeholders in the community. What are our values and how are we responding to families? What are our responses, and how invasive should we be?”
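Dr. Russell’s caution about false positives reflects a basic property of any screening model: when the condition being predicted is rare, even an accurate model can produce more false alarms than true hits. A minimal arithmetic sketch, using entirely hypothetical numbers (no agency’s actual model or rates), illustrates why human judgment must stay in the loop:

```python
# Illustrative only: all figures below are hypothetical assumptions,
# not statistics from any child welfare agency or model.
population = 1000      # families screened
base_rate = 0.05       # assume 5% are truly high-risk
sensitivity = 0.90     # assume the model flags 90% of true high-risk cases
specificity = 0.90     # assume it correctly clears 90% of the rest

true_cases = population * base_rate                               # 50 families
true_positives = true_cases * sensitivity                         # 45 flagged correctly
false_positives = (population - true_cases) * (1 - specificity)   # 95 flagged in error

# Precision: what share of flagged families are actually high-risk?
precision = true_positives / (true_positives + false_positives)
print(f"Families flagged: {true_positives + false_positives:.0f}")
print(f"Share of flags that are true cases: {precision:.0%}")  # about 32%
```

Under these assumed numbers, roughly two out of three flagged families would be false positives, which is why Russell frames analytics as one input for caseworkers rather than a decision-maker on its own.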
“More data is better,” Dr. Russell went on to explain. “The biggest question is not the amount of data but the diversity of data, not just information on the child or family. Diversity makes the data as powerful as possible.”
Although mounds of data sound scary, the image of an all-knowing and infallible “Person of Interest” machine that spits out names is not the reality. A lack of data is actually a much scarier proposition. Without more data, overwhelmed case workers are more likely to make snap judgments based on their own biases and assumptions, and more children will be at risk of dying.—Amy Butcher