
An algorithm that screens for child neglect raises concerns

Apr 28, 2022, 10:10 PM | Updated: Apr 29, 2022, 10:59 AM


The Family Law Center in Pittsburgh is seen on Wednesday, March 16, 2022. Around the country, as child welfare agencies use or consider algorithmic tools like in Allegheny County, an Associated Press review has identified a number of concerns about the technology, including questions about its reliability and its potential to harden racial disparities in the child welfare system. (AP Photo/Matt Rourke)


For family law attorney Robin Frank, defending parents at one of their lowest points — when they risk losing their children — has never been easy.

But in the past, she at least knew what she was up against when squaring off against child protective services in family court. Now, she worries she’s fighting something she can’t see: an opaque algorithm whose statistical calculations help social workers decide which families should be investigated in the first place.

“A lot of people don’t know that it’s even being used,” Frank said. “Families should have the right to have all of the information in their file.”

From Los Angeles to Colorado and throughout Oregon, as child welfare agencies use or consider tools similar to the one in Allegheny County, Pennsylvania, an Associated Press review has identified a number of concerns about the technology, including questions about its reliability and its potential to harden racial disparities in the child welfare system. Related issues have already torpedoed some jurisdictions’ plans to use predictive models, such as the tool notably dropped by the state of Illinois.

According to new research from a Carnegie Mellon University team obtained exclusively by AP, Allegheny’s algorithm in its first years of operation showed a pattern of flagging a disproportionate number of Black children for a “mandatory” neglect investigation, when compared with white children. The independent researchers, who received data from the county, also found that social workers disagreed with the risk scores the algorithm produced about one-third of the time.
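To make concrete what a risk-score flag and a worker disagreement mean in practice, here is a minimal, purely hypothetical sketch. The threshold, the records and the field names are illustrative assumptions for this story, not Allegheny County’s actual model, cutoff or data.

```python
# Hypothetical illustration only: not Allegheny County's model, threshold or data.
# Shows how a numeric risk score can translate into a "mandatory" screen-in flag,
# and how a worker/algorithm disagreement (override) rate would be tallied.

MANDATORY_THRESHOLD = 17  # assumed cutoff on a 1-20 style risk scale; illustrative

referrals = [
    # (risk_score, worker_screened_in) -- toy records, not real cases
    (19, False),  # score would mandate an investigation; worker disagreed
    (18, True),
    (9, True),    # worker screened in despite a low score
    (5, False),
    (20, True),
]

flagged = [r for r in referrals if r[0] >= MANDATORY_THRESHOLD]
disagreements = sum(
    1 for score, screened_in in referrals
    if (score >= MANDATORY_THRESHOLD) != screened_in
)

print(f"Flagged as mandatory: {len(flagged)} of {len(referrals)}")
print(f"Worker/algorithm disagreement rate: {disagreements / len(referrals):.0%}")
```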

County officials said that social workers can always override the tool, and called the research “hypothetical.”

Child welfare officials in Allegheny County, the cradle of Mister Rogers’ TV neighborhood and the icon’s child-centric innovations, say the cutting-edge tool – which is capturing attention around the country – uses data to support agency workers as they try to protect children from neglect. That nuanced term can include everything from inadequate housing to poor hygiene, but is a different category from physical or sexual abuse, which is investigated separately in Pennsylvania and is not subject to the algorithm.

“Workers, whoever they are, shouldn’t be asked to make, in a given year, 14, 15, 16,000 of these kinds of decisions with incredibly imperfect information,” said Erin Dalton, director of the county’s Department of Human Services and a pioneer in implementing the predictive child welfare algorithm.

____

This story, supported by the Pulitzer Center for Crisis Reporting, is part of an ongoing Associated Press series, “Tracked,” that investigates the power and consequences of decisions driven by algorithms on people’s everyday lives.

____

Critics say the county’s approach gives a program powered by data mostly collected about poor people an outsized role in deciding families’ fates, and they warn against local officials’ growing reliance on artificial intelligence tools.

If the tool had acted on its own to screen in a comparable rate of calls, it would have recommended that two-thirds of Black children be investigated, compared with about half of all other children reported, according to another study published last month and co-authored by a researcher who has audited the county’s algorithm.
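The disparity described in that study is a difference in group-level screen-in rates. A short sketch of how such rates can be computed from screening records might look like the following; the records are fabricated (chosen only to mirror the roughly two-thirds versus one-half figures reported above) and this is not the auditors’ actual code, data or method.

```python
# Hypothetical sketch: screen-in rates by group from toy screening records.
# The records and the notion of "the tool alone would screen in" are illustrative
# stand-ins for the study's counterfactual, not a reproduction of it.

records = [
    # (group, tool_would_screen_in) -- fabricated examples
    ("Black", True), ("Black", True), ("Black", False),
    ("other", True), ("other", False), ("other", False), ("other", True),
]

def screen_in_rate(group):
    subset = [flag for g, flag in records if g == group]
    return sum(subset) / len(subset)

for group in ("Black", "other"):
    print(f"{group}: {screen_in_rate(group):.0%} of reported children screened in")
```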

Advocates worry that if similar tools are used in other child welfare systems with minimal or no human intervention — akin to how algorithms have been used to make decisions in the criminal justice system — they could reinforce existing racial disparities in the child welfare system.

“It’s not decreasing the impact among Black families,” said Logan Stapleton, a researcher at Carnegie Mellon University. “On the point of accuracy and disparity, (the county is) making strong statements that I think are misleading.”

Because family court hearings are closed to the public and the records are sealed, AP wasn’t able to identify first-hand any families who the algorithm recommended be mandatorily investigated for child neglect, nor any cases that resulted in a child being sent to foster care. Families and their attorneys can never be sure of the algorithm’s role in their lives either because they aren’t allowed to know the scores.

Child welfare agencies in at least 26 states and Washington, D.C., have considered using algorithmic tools, and at least 11 have deployed them, according to the American Civil Liberties Union.

Larimer County, Colorado, home to Fort Collins, is now testing a tool modeled on Allegheny’s and plans to share scores with families if it moves forward with the program.

“It’s their life and their history,” said Thad Paul, a manager with the county’s Children Youth & Family Services. “We want to minimize the power differential that comes with being involved in child welfare … we just really think it is unethical not to share the score with families.”

Oregon does not share risk score numbers from its statewide screening tool, which was first implemented in 2018 and was inspired by Allegheny’s algorithm. The Oregon Department of Human Services – currently preparing to hire its eighth new child welfare director in six years – explored at least four other algorithms while the agency was under scrutiny by a crisis oversight board ordered by the governor.

It recently paused a pilot algorithm built to help decide when foster care children can be reunified with their families. Oregon also explored three other tools – predictive models to assess a child’s risk for death and severe injury, whether children should be placed in foster care and if so, where.

For years, California explored data-driven approaches to the statewide child welfare system before abandoning a proposal to use a predictive risk modeling tool in 2019.

“During the project, the state also explored concerns about how the tool may impact racial equity. These findings resulted in the state ceasing exploration,” department spokesman Scott Murray said in an email.

Los Angeles County’s Department of Children and Family Services is being audited following high-profile child deaths, and is seeking a new director after its previous one stepped down late last year. It’s piloting a “complex-risk algorithm” that helps to isolate the highest-risk cases that are being investigated, the county said.

In the first few months that social workers in the Mojave Desert city of Lancaster started using the tool, however, county data shows that Black children were the subject of nearly half of all the investigations flagged for additional scrutiny, despite making up 22% of the city’s child population, according to the U.S. Census.

The county did not immediately say why, but said it will decide whether to expand the tool later this year.

___

Associated Press reporter Camille Fassett contributed to this report.

___

Follow Sally Ho and Garance Burke on Twitter at @_sallyho and @garanceburke.

___

Contact AP’s global investigative team at Investigative@ap.org or https://www.ap.org/tips/

Copyright © The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.
