End-of-life decisions are difficult and distressing. Could AI help?

Wendler has been working on ways to help surrogates make these kinds of decisions. Over ten years ago, he developed the idea for a tool that would predict a patient’s preferences based on a set of characteristics, such as their age, gender and insurance status. That tool would have been based on a computer algorithm trained on survey results from the general population. It may seem crude, but these characteristics do seem to influence how people feel about medical care. A teenager is more likely to opt for aggressive treatment than a 90-year-old, for example. And research suggests that predictions based on averages can be more accurate than the guesses made by family members.
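As a rough illustration of what such a population-trained predictor might look like, the sketch below fits a simple classifier on made-up survey data. The column names and the yes/no label about wanting aggressive treatment are assumptions for the example; this is not the actual algorithm Wendler developed.

```python
# Illustrative sketch only: a population-level preference predictor trained on
# hypothetical survey data. The column names and the "prefers_aggressive_care"
# label are assumptions for this example, not the tool described in the article.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

# Toy stand-in for survey responses from the general population.
survey = pd.DataFrame({
    "age": [19, 45, 67, 90, 33, 78],
    "gender": ["f", "m", "f", "m", "f", "m"],
    "insurance_status": ["private", "public", "public", "none", "private", "public"],
    "prefers_aggressive_care": [1, 1, 0, 0, 1, 0],  # 1 = would want aggressive treatment
})

features = ["age", "gender", "insurance_status"]
model = Pipeline([
    ("encode", ColumnTransformer(
        [("categorical", OneHotEncoder(handle_unknown="ignore"),
          ["gender", "insurance_status"])],
        remainder="passthrough",  # keep age as a numeric feature
    )),
    ("classifier", LogisticRegression()),
])
model.fit(survey[features], survey["prefers_aggressive_care"])

# Estimate the probability that a new, incapacitated patient would want
# aggressive treatment, based only on their demographic characteristics.
patient = pd.DataFrame([{"age": 90, "gender": "m", "insurance_status": "public"}])
print(model.predict_proba(patient[features])[0, 1])
```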

In 2007, Wendler and his colleagues built a “very basic” preliminary version of this tool based on a small amount of data. That simplistic tool did “at least as well as next of kin surrogates” in predicting what kind of care people would want, says Wendler.

Now, Wendler, Earp and their colleagues are working on a new idea. Instead of relying on crude population-level characteristics, the team plans to build a tool that is personalized to the individual. The team proposes using AI and machine learning to predict a patient’s treatment preferences based on their personal data, such as their medical history, along with emails, personal messages, web browsing history, social media posts or even Facebook likes. The result would be a “digital psychological twin” of a person: a tool that doctors and family members could consult, and one that could guide a person’s medical care. It’s not yet clear what this would look like in practice, but the team hopes to build and test the tool before refining it.

The team calls their tool a personalized patient preference predictor, or P4 for short. In theory, if it works as they hope, it could be more accurate than the previous version of the tool, and more accurate than human surrogates, says Wendler. It could also be more reflective of a patient’s current thinking than an advance directive, which might have been signed a decade beforehand, says Earp.

A better bet?

A tool like the P4 could also relieve surrogates of some of the emotional burden of making such significant life-or-death decisions about their family members, decisions that can sometimes leave surrogates with symptoms of post-traumatic stress disorder, says Jennifer Blumenthal-Barby, a medical ethicist at Baylor College of Medicine in Texas.

Some surrogates experience “decisional paralysis,” and they might opt to use the tool to help steer them through a decision-making process, says Kaplan. In cases like these, the P4 could relieve surrogates of some of the burden they might be feeling, without necessarily giving them a black-and-white answer. It might, for example, suggest that a person was “likely” or “unlikely” to feel a certain way about a treatment, or give a percentage score indicating how likely the prediction is to be right.
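As a rough illustration of that kind of hedged output, the sketch below turns an assumed predicted probability into “likely,” “unlikely” or uncertain wording with a percentage attached. The thresholds and phrasing are arbitrary choices made for this example; they are not part of the P4’s design, which has not yet been settled.

```python
# Illustrative sketch of how a probabilistic prediction could be phrased for
# surrogates without forcing a black-and-white answer. The thresholds and
# wording are arbitrary choices for the example, not part of the P4 design.
def describe_prediction(p_wants_treatment: float) -> str:
    """Turn a predicted probability into hedged, surrogate-facing language."""
    pct = round(p_wants_treatment * 100)
    if p_wants_treatment >= 0.7:
        return f"The patient is likely to have wanted this treatment ({pct}% estimated chance)."
    if p_wants_treatment <= 0.3:
        return f"The patient is unlikely to have wanted this treatment ({pct}% estimated chance)."
    return f"The prediction is uncertain ({pct}% estimated chance they would want it)."

print(describe_prediction(0.82))
print(describe_prediction(0.15))
print(describe_prediction(0.55))
```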