AI Denial by Default: Centene’s Silent Shift to Automated Healthcare Gatekeeping

Late last year, a mother in Ohio opened a denial letter from her son’s Medicaid provider. The document looked ordinary: clinical codes, technical terms, a vague line about “not meeting criteria.” What she didn’t know, and what few patients ever learn, was that no doctor had made the decision. No nurse. No human at all.

Behind the scenes, Centene Corporation, America’s largest Medicaid managed care company, has reportedly begun expanding its use of artificial intelligence to handle care decisions. Not necessarily to improve outcomes or speed but, in many cases, to issue rejections before a human ever gets involved.

And those denials are quietly increasing.

The Shift to Silent Automation

In 2023, Centene began scaling up AI-based decision systems across multiple state Medicaid programs. These systems do not merely assist human reviewers. They score prior authorization requests and, in many instances, issue initial decisions on coverage, often without clinical review.

What sounds like innovation has, in practice, introduced a growing gap between patients and the care they are entitled to.

According to a former subcontractor who spoke on condition of anonymity, the system’s core purpose appeared to be cost containment. “The software doesn’t flag approvals,” they said. “From what I saw, it seemed designed to prioritize denial triggers.”

Rubber Stamps in Scrubs

Several former Centene nurses have described a disturbing pattern. Instead of reviewing medical cases independently, they were often given pre-filled denial forms generated by the AI system. Their role, they said, was not to assess but to sign off.

“I was told I didn’t need to understand the decision. Just to confirm the form was sent,” said a utilization nurse from Texas who has since filed a wrongful termination claim.

Another nurse based in California recalled a case involving a child with complex neurological needs. When she questioned the denial of occupational therapy, which was labeled “not medically necessary,” a supervisor allegedly told her, “That’s not our job.”

Opaque Letters, No Explanation

For patients, the process can be frustrating and disorienting. The denial letters offer little detail and no indication of who, or what, made the decision. Appeals procedures are often buried in bureaucratic language, leaving families with few clear options.

In one case, a father in Missouri struggled for weeks to appeal the denial of a prosthetic limb for his son. The original and follow-up decisions, he later learned, had been issued by software.

“It’s like trying to argue with a black box,” he said. “And the box never talks back.”

The Legal Gray Area

While other insurers have also adopted automation, what sets Centene apart is the scale and opacity of its implementation. Health policy experts warn that such systems may undermine patients’ legal rights, especially when automated decisions replace licensed medical judgment.

“When algorithms make life-and-death decisions without oversight or transparency, that’s not innovation,” said Dr. Reema Shah, a health law professor at Columbia. “That’s abdication of responsibility.”

Legal scholars also point out that the use of automation in Medicaid decisions raises due process concerns, particularly when patients are not informed of how the decisions are made or how to challenge them effectively.

What the Numbers Show

Centene reported $6.5 billion in Medicaid-related profit in 2024, an 8 percent increase from the prior year. Analysts attributed this growth in part to “administrative efficiency improvements,” a phrase often used to describe automation.

Behind that efficiency, however, are real human consequences.

Public complaint data shows a 22 percent increase in grievances related to denial of care across Centene’s five largest Medicaid markets. Several states, including New Mexico and Texas, have reportedly launched quiet inquiries into the volume of what they term “automated denials.”

Burnout from Within

The effects extend beyond patients. Inside Centene’s review teams and call centers, current and former employees describe growing burnout. They say automation has created an “assembly line” atmosphere, where complexity is discouraged and speed is prioritized.

“We were evaluated not on how well we served patients, but on how many decisions we processed per hour,” said one former medical reviewer. “It felt dehumanizing. And when I raised concerns, I was reassigned away from clinical reviews.”

What Comes Next?

Lawmakers have taken notice. Senator Ron Wyden and others have called for full transparency around AI in healthcare decision-making. A bipartisan effort is underway to push for federal standards that would require all AI-based denials to undergo human review and include clear, plain-language explanations for affected patients.

Whether such legislation passes remains to be seen. But what is clear is that the current system, left unchecked, risks further eroding trust in Medicaid’s ability to deliver care.

Conclusion

Centene’s shift toward AI-driven denials is not just a technology story. It is a human one. It raises urgent questions about whether software should be making decisions in a system designed to serve society’s most vulnerable.

Automation in healthcare can be powerful, but only when it is used to support care, not to silently deny it.

The door to care is closing for many. The question now is: who’s watching?
