Racial bias is pervasive in medical guidelines for care


A new study has found that racial inequities are unintentionally baked into clinical algorithms, the flowcharts and scoring tools providers use to assess patient risk and decide who will benefit from what care. An earlier study found that one widely used algorithm, which predicts which patients will benefit from care management, was unintentionally biased against black patients. The new study, published in the New England Journal of Medicine, finds that such biases are common across clinical algorithms.

For example, the American Heart Association’s Heart Failure Risk Score assigns three extra points to “non-black” patients, signaling that white patients are at higher risk and making them more likely to be referred to additional resources. Another algorithm predicts that black and Hispanic women are less likely to have a successful vaginal birth after a previous cesarean delivery, steering them toward repeat c-sections, despite clear evidence that vaginal births have fewer complications and amid growing concern about higher rates of maternal mortality among black women. The paper outlines ten other examples of racial bias in clinical algorithms.
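To make the mechanism concrete, here is a minimal sketch of how a race term can tilt an additive risk score. Only the three-point adjustment for “non-black” patients comes from the study described above; every other input, threshold, and point value is a hypothetical placeholder, not the actual AHA scoring table.

```python
# A minimal sketch of a race-adjusted additive risk score. The 3-point
# adjustment for non-black patients is the one reported in the study;
# the other inputs and point values are hypothetical placeholders.

def heart_failure_risk_score(age: int, systolic_bp: int, is_black: bool) -> int:
    score = 0

    # Hypothetical clinical inputs (illustrative thresholds only).
    if age >= 65:
        score += 2
    if systolic_bp < 110:
        score += 2

    # The race adjustment described in the study: non-black patients
    # receive 3 extra points, raising their apparent risk and making
    # them more likely to be referred to additional resources.
    if not is_black:
        score += 3

    return score

# Two patients with identical clinical presentations receive different
# scores purely because of the race term.
print(heart_failure_risk_score(age=70, systolic_bp=105, is_black=False))  # 7
print(heart_failure_risk_score(age=70, systolic_bp=105, is_black=True))   # 4
```

Because the race term is simply added to the total, two patients who are clinically identical end up with different scores, and potentially different referrals, based on race alone.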