One of the algorithms used in American hospitals to guide decisions about patient care turned out to be racially biased: it assigns Black and white patients the same risk score for developing certain conditions even though Black patients with that score are, on average, significantly sicker. The reason, it turned out, is that the algorithm uses spending on medical services as a proxy for patients' health.
One obvious advantage of using automated systems for data analysis is the absence of the human factor, which very often affects the result. A vivid example is the "satisfaction of search" error common in medical diagnostics: if, say, one fracture is found on an X-ray, the specialist may fail to notice a second. Computer vision algorithms are not subject to this error.
Another example is the analysis of people's profiles, for instance when hiring or when deciding what medical care to provide. Ideally, algorithms that specialize in such analysis should be unbiased and free of discrimination on any grounds. In practice, however, these algorithms are still trained on data collected by people, so bias cannot be fully avoided: recall the story of Amazon's résumé-screening algorithm, which was accused of sexism.
Moreover, bias can arise even when the algorithm does not directly use parameters such as gender or race. This was shown in a new study by scientists led by Ziad Obermeyer of the University of California, Berkeley. They analyzed data on more than 50,000 patients used by a medical computer program that automatically suggests treatment options for chronic diseases.
The researchers found that, at the same risk score assigned by the algorithm, Black patients had 26.3 percent more chronic diseases than white patients: in other words, their actual health in the sample was much worse. As for specific diseases and conditions, Black patients had higher blood pressure and cholesterol levels, and their diabetes was less well controlled. Notably, this pattern held at every risk level: in other words, given the same predicted risk of developing various conditions, diseases, or death, Black patients were significantly sicker. This, in turn, means that white patients who are relatively healthier than Black patients receive more specialized medical care.
Interestingly, the algorithm does not take race into account at all, yet the bias still arises. After analyzing how the algorithm works, the scientists found that when calculating the risk of disease and the need for treatment, the system primarily relies on the cost of care. On the one hand, this is reasonable: serious conditions require more expensive treatment. On the other hand, lower medical expenses also reflect patients' socioeconomic status and standard of living, which may be lower for Black patients.
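The mechanism can be illustrated with a toy sketch (this is not the actual Optum model, and the numbers and function are invented for illustration): if past spending is used as the risk label, a patient who spends less on care because of barriers to access looks "lower risk" than an equally sick patient who spends more, even though race never appears as an input.

```python
# Toy illustration of cost-as-proxy bias (hypothetical score, assumed
# numbers; not the real algorithm from the study).

def cost_based_risk(past_cost, max_cost=10_000):
    """Hypothetical risk score: scales past healthcare spending to [0, 1]."""
    return min(past_cost / max_cost, 1.0)

# Two hypothetical patients with an identical chronic-illness burden.
# The second spends 30% less on care due to barriers to access,
# not because of better health.
patient_a = {"chronic_conditions": 4, "past_cost": 8_000}
patient_b = {"chronic_conditions": 4, "past_cost": 5_600}

risk_a = cost_based_risk(patient_a["past_cost"])  # 0.8
risk_b = cost_based_risk(patient_b["past_cost"])  # 0.56

# Equally sick, but the lower-spending patient is scored as lower risk
# and would be deprioritized for extra care.
assert patient_a["chronic_conditions"] == patient_b["chronic_conditions"]
assert risk_b < risk_a
```

The point of the sketch is that the disparity comes entirely from the choice of label (cost instead of health), so removing race from the inputs does nothing to prevent it.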
The paper itself does not say which particular algorithm is involved. However, The Washington Post reported that it is a program from the medical company Optum; this is also indicated in an editorial note in the journal Science. In the article, the authors note that they had no contact with company representatives before publication, but they clarify that they were subsequently able to work with the developers and reduce the racial bias in the algorithm's output by 84 percent.
Algorithmic bias that stems from bias in the sample itself can sometimes even be useful: last year, for example, scientists used a large corpus of texts to track how attitudes toward women and Asian people have changed over time.