There doesn’t seem to be any depth to which Oregon’s government won’t stoop in the name of “equity.” I’d find this unbelievable if it weren’t part of an ongoing pattern of racist behavior. This time it’s the Oregon Department of Human Services.

In 2018, Oregon’s Department of Human Services implemented its Safety at Screening Tool, an algorithm that generates a “risk score” for abuse hotline workers, recommending whether a social worker needs to further investigate the contents of a call. This AI was based on the lauded Allegheny Family Screening Tool, designed to predict the risk of a child ending up in foster care based on a number of socioeconomic factors.

But after the Allegheny tool was found to be flagging a disproportionate number of black children for “mandatory” neglect investigations, and after a subsequent AP investigative report into the issue, Oregon officials now plan to shutter their derivative AI by the end of June in favor of an entirely new, and specifically less automated, review system.

As I never stop saying, disparity is not discrimination. If the Allegheny tool indicates more black children are being neglected, isn’t it possible that they are? Shouldn’t the welfare of children be the overriding concern, with race playing no part whatsoever?

If the Allegheny tool were racist—if it used race as a factor in determining who should be flagged for review, for example—then it should never have been used in the first place. But that’s not how it works.

The Allegheny Family Screening Tool is specifically designed to predict the risk that a child will be placed in foster care in the two years after they are investigated. Using a trove of detailed personal data collected from birth, Medicaid, substance abuse, mental health, jail and probation records, among other government data sets, the algorithm calculates a risk score of 1 to 20: The higher the number, the greater the risk.
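To make the mechanics concrete: that description suggests a predictive model that turns features drawn from administrative records into a probability, which is then mapped onto a 1-to-20 scale. The sketch below is purely illustrative. The feature names, weights, and binning are my own assumptions, not the Allegheny tool’s actual model; it only shows the general shape of this kind of scoring.

```python
# Hypothetical sketch of a screening-score pipeline: administrative-record
# features -> predicted probability -> score of 1 to 20.
# All feature names and weights are invented for illustration; they are
# NOT the Allegheny Family Screening Tool's actual inputs or coefficients.
from dataclasses import dataclass
import math


@dataclass
class Referral:
    prior_cps_referrals: int   # counts from child-welfare records (illustrative)
    months_on_medicaid: int    # from Medicaid enrollment data (illustrative)
    parent_jail_bookings: int  # from jail records (illustrative)
    # Note: race is not an input anywhere in this sketch.


# Invented logistic-regression-style weights, for the sketch only.
WEIGHTS = {
    "prior_cps_referrals": 0.6,
    "months_on_medicaid": 0.01,
    "parent_jail_bookings": 0.4,
}
INTERCEPT = -3.0


def predicted_risk(r: Referral) -> float:
    """Probability-like estimate that the child enters foster care within two years."""
    z = (INTERCEPT
         + WEIGHTS["prior_cps_referrals"] * r.prior_cps_referrals
         + WEIGHTS["months_on_medicaid"] * r.months_on_medicaid
         + WEIGHTS["parent_jail_bookings"] * r.parent_jail_bookings)
    return 1.0 / (1.0 + math.exp(-z))


def screening_score(probability: float) -> int:
    """Map a probability onto a 1-20 scale (here: simple equal-width bins)."""
    return min(20, max(1, 1 + int(probability * 20)))


call = Referral(prior_cps_referrals=2, months_on_medicaid=18, parent_jail_bookings=1)
p = predicted_risk(call)
print(f"risk score: {screening_score(p)} (higher number = greater predicted risk)")
```

The real tool’s feature set and its mapping onto the 1-to-20 scale are far more elaborate than this, but the point stands: the score is driven by record-level history, not by an explicit race variable.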

And it’s not like the stakes aren’t incredibly high:

[Because] skipping a report of neglect could end with a child’s death but scrutinizing a family’s life could set them up for separation – the county and developers have suggested their tool can help “course correct” and make the agency’s work more thorough and efficient by weeding out meritless reports so that social workers can focus on children who truly need protection.

The developers have described using such tools as a moral imperative, saying child welfare officials should use whatever they have at their disposal to make sure children aren’t neglected.

“There are children in our communities who need protection,” said Emily Putnam-Hornstein, a professor at the University of North Carolina at Chapel Hill’s School of Social Work who helped develop the Allegheny tool, speaking at a virtual panel held by New York University in November.

Oregon has decided that race should be a factor in all this, and it’s shameful.