Published on October 16th, 2022
How AI could be used to make life and death decisions
By the 2000s, an algorithm had been developed in the US to identify recipients for donated kidneys. But some people were unhappy with how the algorithm had been designed. In 2007, Clive Grawe, a kidney transplant candidate from Los Angeles, told a room full of medical experts that their algorithm was biased against older people like him. The algorithm had been designed to allocate kidneys in a way that maximized years of life saved. This favored younger, wealthier, and whiter patients, Grawe and other patients argued.
Such bias in algorithms is common. What's less common is for the designers of those algorithms to agree that there is a problem. After years of consultation with laypeople like Grawe, the designers found a less biased way to maximize the number of years saved: by, among other things, considering overall health in addition to age. One key change was that the majority of donors, who are often people who have died young, would no longer be matched only to recipients in the same age bracket. Some of those kidneys could now go to older people if they were otherwise healthy. As with Scribner's committee, the algorithm still wouldn't make decisions that everyone would agree with. But the process by which it was developed is harder to fault.
"I didn't want to sit there and give the injection. If you want it, you press the button."
Philip Nitschke
Nitschke, too, is asking hard questions.
A former doctor who burned his medical license after a years-long legal dispute with the Australian Medical Board, Nitschke has the distinction of being the first person to legally administer a voluntary lethal injection to another human. In the nine months between July 1996, when the Northern Territory of Australia brought in a law that legalized euthanasia, and March 1997, when Australia's federal government overturned it, Nitschke helped four of his patients to kill themselves.
The first, a 66-year-old carpenter named Bob Dent, who had suffered from prostate cancer for five years, explained his decision in an open letter: "If I were to keep a pet animal in the same condition I am in, I would be prosecuted."
Nitschke wanted to support his patients' decisions. Even so, he was uncomfortable with the role they were asking him to play. So he made a machine to take his place. "I didn't want to sit there and give the injection," he says. "If you want it, you press the button."
The machine wasnât much to look at: it was essentially a laptop hooked up to a syringe. But it achieved its purpose. The Sarco is an iteration of that original device, which was later acquired by the Science Museum in London. Nitschke hopes an algorithm that can carry out a psychiatric assessment will be the next step.
But there's a good chance those hopes will be dashed. Creating a program that can assess someone's mental health is an unsolved problem, and a controversial one. As Nitschke himself notes, doctors do not agree on what it means for a person of sound mind to choose to die. "You can get a dozen different answers from a dozen different psychiatrists," he says. In other words, there is no common ground on which an algorithm could even be built.