PITTSBURGH
For the two weeks that the Hackneys’ baby girl lay in a Pittsburgh hospital bed, weak from dehydration, her parents rarely left her side, sometimes sleeping on the fold-out sofa in the room.
They stayed with their daughter around the clock when she was moved to a rehab center to regain her strength. Finally, the 8-month-old stopped batting away her bottles and started putting on weight again.
“She was doing well, and we started to ask when can she go home,” Lauren Hackney said. “And then from that moment on, at the time, they completely stonewalled us and never said anything.”
The couple was stunned when child welfare officials showed up, told them they were negligent and took their daughter away.
“They had custody papers and they took her right there and then,” Lauren Hackney recalled. “And we started crying.”
More than a year later, their daughter, now 2, remains in foster care. The Hackneys, who have developmental disabilities, are struggling to understand how taking their daughter to the hospital when she refused to eat could be seen as so neglectful that she’d need to be taken from her home.
They wonder if an artificial intelligence tool that the Allegheny County Department of Human Services uses to predict which children could be at risk of harm singled them out because of their disabilities.
The U.S. Justice Department is asking the same question. The agency is investigating the county’s child welfare system to determine whether its use of the influential algorithm discriminates against people with disabilities or other protected groups, The Associated Press has learned. Later this month, federal civil rights attorneys will interview the Hackneys and Andrew Hackney’s mother, Cynde Hackney-Fierro, the grandmother said.
Lauren Hackney has attention-deficit hyperactivity disorder that affects her memory, and her husband, Andrew, has a comprehension disorder and nerve damage from a stroke suffered in his 20s. Their baby girl was just 7 months old when she began refusing to drink her bottles. Facing a nationwide shortage of formula, they traveled from Pennsylvania to West Virginia looking for some and were forced to change brands. The baby didn’t seem to like it.
Her pediatrician first reassured them babies sometimes can be fickle with feeding and offered ideas to help her get back her appetite, they said.
When she grew lethargic days later, they said, the same doctor told them to take her to the emergency room. The Hackneys believe medical staff alerted child protective services after they showed up with a baby who was dehydrated and malnourished.
That’s when they believe their information was fed into the Allegheny Family Screening Tool, which county officials say is standard procedure for neglect allegations. Soon, a social worker appeared to question them, and their daughter was sent to foster care.
Over the past six years, Allegheny County has served as a real-world laboratory for testing AI-driven child welfare tools that crunch reams of data about local families to try to predict which children are likely to face danger in their homes. Today, child welfare agencies in at least 26 states and Washington, D.C., have considered using algorithmic tools, and jurisdictions in at least 11 have deployed them, according to the American Civil Liberties Union.
Child welfare algorithms plug vast amounts of public data about local families into complex statistical models to calculate what they call a risk score. The number that’s generated is then used to advise social workers as they decide which families should be investigated, or which families need additional attention — a weighty decision that can sometimes mean life or death.
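As a rough illustration of how such a score can be assembled, the sketch below shows a hypothetical, heavily simplified scoring model. The features, weights and the 1-to-20 scale are assumptions made for illustration only, not the actual variables or formula of any county’s tool.

```python
# Hypothetical, simplified risk-scoring model of the general kind
# described above. The features, weights and 1-20 scale are illustrative
# assumptions, not any county's actual variables or formula.
from dataclasses import dataclass

@dataclass
class FamilyRecord:
    prior_referrals: int          # past hotline calls about the family
    parent_age_at_birth: int
    receives_public_benefits: bool

# Assumed weights; real tools fit coefficients to years of outcome data.
WEIGHTS = {
    "prior_referrals": 0.8,
    "young_parent": 0.5,
    "public_benefits": 0.6,
}

def risk_score(record: FamilyRecord) -> int:
    """Combine a family's administrative data into one advisory number."""
    raw = (
        WEIGHTS["prior_referrals"] * record.prior_referrals
        + WEIGHTS["young_parent"] * (record.parent_age_at_birth < 21)
        + WEIGHTS["public_benefits"] * record.receives_public_benefits
    )
    # Clamp to a 1-20 advisory scale of the sort a screener might see.
    return max(1, min(20, round(1 + raw)))

# The screener sees only the final number, not the weighted inputs.
print(risk_score(FamilyRecord(2, 19, True)))  # -> 4
```

Real tools differ mainly in scale: they fit hundreds of variables drawn from government records to years of historical outcomes, but the person deciding whether to investigate still sees only the final number.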
A number of local leaders have tapped into AI technology while under pressure to make systemic changes, such as in Oregon during a foster care crisis and in Los Angeles County after a series of high-profile child deaths in one of the nation’s largest county child welfare systems.
Los Angeles County’s Department of Children and Family Services Director Brandon Nichols says algorithms can help identify high-risk families and improve outcomes in a deeply strained system. Yet he could not explain how his agency’s screening tool works.
“We’re sort of the social work side of the house, not the IT side of the house,” Nichols said in an interview. “How the algorithm functions, in some ways is, I don’t want to say is magic to us, but it’s beyond our expertise and experience.”
In Larimer County, Colo., one official acknowledged she didn’t know what variables were used to assess local families.
In Pennsylvania, California and Colorado, county officials have opened up their data systems to the two academic developers who select data points to build their algorithms. Rhema Vaithianathan, a professor of health economics at New Zealand’s Auckland University of Technology, and Emily Putnam-Hornstein, a professor at the University of North Carolina at Chapel Hill’s School of Social Work, said in an email their work is transparent and that they make their computer models public.
“In each jurisdiction in which a model has been fully implemented we have released a description of fields that were used to build the tool, along with information as to the methods used,” they said by email.
Through tracking their work across the country, the AP found their tools can set families up for separation by rating their risk based on personal characteristics they cannot change or control, such as race or disability, rather than just on their actions as parents.
In Allegheny County, a sprawling county of 1.2 million near the Ohio border, the algorithm has accessed an array of external data, including jail, juvenile probation, Medicaid, welfare, health and birth records, all held in a vast countywide “data warehouse.” The tool uses that information to predict the risk that a child will be placed in foster care in the two years after a family is first investigated.
County officials have told the AP they’re proud of their cutting-edge approach. They have said they monitor their risk scoring tool closely and update it over time.
Vaithianathan and Putnam-Hornstein declined repeated requests to discuss how they choose the specific data that powers their models.
Oregon’s Department of Human Services built an algorithm inspired by Allegheny’s that factored in a child’s race when predicting a family’s risk, and applied a “fairness correction” to mitigate racial bias. Last June, the agency dropped the tool entirely over equity concerns.
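The mechanics of Oregon’s correction are not detailed here. One common technique behind such corrections is to flag the top-scoring share of families within each group rather than applying one global cutoff; the sketch below is a hypothetical illustration of that general approach, with invented function names and parameters, not Oregon’s actual method.

```python
# Hypothetical sketch of one common style of "fairness correction":
# flagging the top share of scores within each group instead of using a
# single global cutoff. Every name and parameter below is invented for
# illustration; this is not Oregon's actual method.
from collections import defaultdict

def group_adjusted_flags(scores: list[float], groups: list[str],
                         flag_rate: float = 0.2) -> set[int]:
    """Return indices of families flagged for extra scrutiny, taking the
    top `flag_rate` fraction within each demographic group so no group
    is over-flagged simply because its scores run higher overall."""
    by_group: dict[str, list[int]] = defaultdict(list)
    for i, g in enumerate(groups):
        by_group[g].append(i)
    flagged: set[int] = set()
    for idxs in by_group.values():
        idxs.sort(key=lambda i: scores[i], reverse=True)
        k = max(1, round(flag_rate * len(idxs)))
        flagged.update(idxs[:k])
    return flagged

# A single global cutoff would flag only group "B" families here; the
# per-group rule flags the highest scorer in each group instead.
scores = [8, 7, 6, 5, 4, 18, 17, 16, 15, 14]
groups = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]
print(sorted(group_adjusted_flags(scores, groups)))  # -> [0, 5]
```

Per-group thresholds are only one approach; other corrections reweight training data or recalibrate scores, and none addresses bias already baked into the underlying records.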
Historical bias against people with disabilities
With no answers on when they could get their daughter home, the Hackneys’ lawyer in October filed a federal civil rights complaint on their behalf that questioned how the screening tool was used in their case.
Over time, Allegheny’s tool has tracked whether members of the family have diagnoses of schizophrenia or mood disorders. It has also measured whether parents or other children in the household have disabilities, by noting whether any family members received Supplemental Security Income, a federal benefit for people with disabilities. The county said it factors in SSI payments in part because children with disabilities are more likely to be abused or neglected.
The county also said disability-related data can be “predictive of the outcomes” and that it “should come as no surprise that parents with disabilities … may also have a need for additional supports and services.”
The Hackneys have been ordered to take parenting classes and say they have been taxed by all of the child welfare system’s demands, including IQ tests and downtown court hearings.
People with disabilities are overrepresented in the child welfare system, yet there’s no evidence that they harm their children at higher rates, said Traci LaLiberte, a University of Minnesota expert on child welfare and disabilities.
Including data points related to disabilities in an algorithm is problematic because it perpetuates historical biases in the system and focuses on people’s physiological traits rather than the behavior that social workers are brought in to address, LaLiberte said.
The Los Angeles tool weighs whether any children in the family have ever received special education services, had prior developmental or mental health referrals, or used drugs to treat mental illness.
“This is not unique to caseworkers who use this tool; it is common for caseworkers to consider these factors when determining possible supports and services,” the developers said by email.
Long before algorithms were in use, the child welfare system distrusted parents with disabilities. Into the 1970s, they were regularly sterilized and institutionalized, LaLiberte said. A landmark federal report in 2012 noted that parents with psychiatric or intellectual disabilities lost custody of their children as much as 80% of the time.
Across the U.S., it’s extremely rare for child welfare agencies to require disability training for social workers, LaLiberte’s research has found. The result: Parents with disabilities are often judged by a system that doesn’t understand how to assess their capacity as caregivers, she said.
The Hackneys experienced this firsthand. When a social worker asked Andrew Hackney how often he fed the baby, he answered literally: two times a day. The worker seemed appalled, he said, and scolded him, saying babies must eat more frequently. He struggled to explain that the girl’s mother, grandmother and aunt also took turns feeding her each day.
Meanwhile, foster care and the separation of families can have lifelong developmental consequences for the child.
The Hackneys’ daughter already has been placed in two foster homes and has now spent more than half of her short life away from her parents as they try to convince social workers they are worthy.
They say they are also running out of money in the fight for their daughter. With barely enough left for food from Andrew Hackney’s wages at a local grocery store, he had to shut off his monthly cellphone service. They’re struggling to pay the legal fees and the gas money needed to attend the appointments required of them.
In February, their daughter was diagnosed with a disorder that can disrupt her sense of taste, according to Andrew Hackney’s lawyer, Robin Frank, who added that the girl has continued to struggle to eat, even in foster care.
All they have for now are twice-weekly visits that last a few hours before she’s taken away again. Lauren Hackney’s voice breaks as she worries her daughter may be adopted and soon forget her own family. They say they yearn to do what many parents take for granted — put their child to sleep at night in her own bed.
“I really want to get my kid back. I miss her, and especially holding her. And of course, I miss that little giggly laugh,” Andrew Hackney said, as his daughter sprang toward him with excitement during a recent visit. “It hurts a lot. You have no idea how bad.”