What Would a Predictive Analytics Algorithm Say About This?

Let’s take a trip into the near future. Just a couple of years.

Child Protective Services has just received a child maltreatment report concerning a father of five. With a few keystrokes, CPS workers find out the following about him:

He’s married, but the family lives in deep poverty. He has a criminal record: a misdemeanor conviction. He and his wife also have had the children taken away from them once before; they were returned after six months.

These data are immediately entered into a computer running the latest predictive analytics software. And quicker than you can say “danger, Will Robinson!” the computer warns CPS that this guy is high-risk.
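
To make the hypothetical concrete, here is a toy sketch in Python of the kind of arithmetic such a tool performs. Every feature name, weight and threshold below is invented for illustration – it is not any vendor’s or agency’s actual model – but it shows the essential move: the inputs are proxies for poverty and prior contact with the system, and the output is a single number the caseworker inherits.

    # Toy illustration only: a made-up logistic "risk score."
    # The features, weights and threshold are invented for this example
    # and do not describe any real CPS screening tool.
    import math

    WEIGHTS = {
        "lives_in_deep_poverty": 1.2,   # hypothetical weight
        "has_misdemeanor_record": 0.8,  # hypothetical weight
        "prior_removal": 1.5,           # hypothetical weight
        "number_of_children": 0.1,      # hypothetical weight, per child
    }
    BIAS = -2.0
    HIGH_RISK_THRESHOLD = 0.7

    def risk_score(family):
        """Weighted sum of case features squashed into a 0-1 'probability'."""
        z = BIAS + sum(WEIGHTS[k] * family.get(k, 0) for k in WEIGHTS)
        return 1 / (1 + math.exp(-z))

    # The father in the hypothetical: deep poverty, a misdemeanor,
    # one prior removal, five children.
    father = {
        "lives_in_deep_poverty": 1,
        "has_misdemeanor_record": 1,
        "prior_removal": 1,
        "number_of_children": 5,
    }

    score = risk_score(father)
    label = "HIGH RISK" if score > HIGH_RISK_THRESHOLD else "below threshold"
    print(f"risk score: {score:.2f} -> {label}")   # e.g. 0.88 -> HIGH RISK

Nothing in that arithmetic asks what, if anything, actually happened to the children.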

When the caseworker gets to the home, she knows the risk score is high, so if she leaves those children at home and something goes wrong, she’ll have even more than usual to answer for.

No matter what the actual report says – since in this new age of “pre-crime,” making determinations based on what may actually have happened is so passé – those children are likely to be taken away, again.

So, now let’s return to the present and meet the family at the center of the actual case on which this hypothetical is based:

Photo: www.fox26houston.com

In the hypothetical, I changed two things about this story. First, the story mentions no criminal charges, and, in fact, panhandling is illegal only in some parts of Houston. But predictive analytics tends not to factor in Anatole France’s famous observation that “the law, in its majestic equality, forbids rich and poor alike to sleep under bridges, to beg in the streets, and to steal bread.”

So had there been a criminal conviction, or even a charge, it almost certainly would have added to the risk score.

And second, I’m assuming Dennison and his wife actually will get their children back. In fact, there’s no telling what will happen, and the family is under the impression that CPS is pursuing termination of parental rights.

What we do know is that in the brave new world of predictive analytics, if Dennison’s children are ever returned, and if Dennison is ever reported again, the children are likely to be removed again. And, what with it then being the second time and all, they’re more likely to stay removed forever.

For now, the parents don’t know where their children are. But given that this is Texas foster care we’re talking about, odds are it’s nowhere good.

I can hear the predictive analytics evangelists now: “You don’t understand,” they’ll say. “We would just use this tool to help people like Dennison and his family.”

And yes, there are a very few enlightened child protective services agencies that would do that. But when Houston CPS encountered the Dennison family, that’s not what they did. They did not offer emergency cash assistance. They did not offer assistance to Dennison to find another job, or train for a new one.

They took the children and ran. Just as Houston CPS did in another case, where they rushed to confuse poverty with neglect.

An algorithm won’t make these decisions any better. It will just make it easier to take the child and run.


About Richard Wexler
Richard Wexler is Executive Director of the National Coalition for Child Protection Reform, www.nccpr.org. His interest in child welfare grew out of 19 years of work as a reporter for newspapers, public radio and public television. During that time, he won more than two dozen awards, many of them for stories about child abuse and foster care. He is the author of Wounded Innocents: The Real Victims of the War Against Child Abuse (Prometheus Books: 1990, 1995).

Comments

  1. I have never been a fan of predictive analytics for direct delivery in social work. But I recently heard of a project out of Fort Worth that layers a number of factors, then analyzes the data over a three-block capture area to determine needed community services.

    Now I would have no problem with looking at that data to help tailor services for a family who falls within that capture area. That would be the kind of stuff needed for prevention programs. As these programs prove to be effective, it would be easier to apply improvements to the complete community.

    But never do I want to see a “Minority Report” style of service.

  2. Why are you defending the actions of these workers, Joshua? It’s so obviously true! This is modern-day slave trade! If there is a reason to take, ANY REASON AT ALL, they take! Look at the Justina Pelletier case! HELLO! Kids aren’t safe anymore! Now a complete stranger can come into your life and determine something is wrong with you because things aren’t perfect with the world we live in? How does this man being poor make him a physical threat to his children?! Was there food in a locked cabinet? Was there a Mercedes he wasn’t selling?! Was he panhandling while his kids were wearing OshKosh?
    These people make me sick and so do you for defending them! 😡

  3. Richard,

    I’ve written about this issue for this site and I’ve detailed why I think your argument exhibits a fundamental misunderstanding of what this kind of technology does. Predictive analytics systems show incredible potential to eliminate the kind of uninformed decision-making you lament. These are decision-support systems, giving social workers more objective and reliable information faster than ever before. It’s not a “Minority Report”-style dystopia where people would act without evidence.

    If your concern is that these systems don’t paint a complete picture of a child’s situation at this stage in their development, then you should be advocating for more robust analytics systems that can do so, not lamenting their existence. And if your concern is that Child Protective Services “did not offer assistance to Dennison to find another job, or train for a new one” (which sounds like a fine argument), then why are you attacking the use of a technology designed to identify children at risk of abuse?

    More here: https://chronicleofsocialchange.org/opinion/15852/15852
