Transplantation Direct (Feb 2022)

Building a Utility-based Liver Allocation Model in Preparation for Continuous Distribution

  • Catherine E. Kling, MD, MPH,
  • James D. Perkins, MD, MS,
  • Scott W. Biggins, MD, MS,
  • Anji E. Wall, MD, PhD,
  • Jorge D. Reyes, MD

DOI: https://doi.org/10.1097/TXD.0000000000001282
Journal volume & issue: Vol. 8, No. 2, p. e1282

Abstract

Background. The current Model for End-Stage Liver Disease (MELD)-based liver allocation system in the United States prioritizes the sickest patients first at the expense of long-term graft survival. In a continuous distribution model, a measure of posttransplant survival will also be included. We aimed to use mathematical optimization to match donors and recipients based on quality to examine the potential impact of an allocation system designed to maximize long-term graft survival.

Methods. Cox proportional hazards models using Organ Procurement and Transplantation Network (OPTN) data from 2008 to 2012 were used to place donors and waitlist candidates into 5 groups of increasing risk for graft loss (1 = lowest to 5 = highest). A mixed integer programming optimization model was then used to generate allocation rules that maximized graft survival at 5 and 8 y.

Results. Allocation based on mathematical optimization improved 5-y survival by 7.5% (78.2% versus 70.7% in the historic cohort), avoiding 2271 graft losses, and 8-y survival by 9% (71.8% versus 62.8%), avoiding 2725 graft losses. Long-term graft survival for recipients within a quality group is highly dependent on donor quality. All candidates in groups 1 and 2 and 43% of group 3 were transplanted, whereas none of the candidates in groups 4 and 5 were transplanted.

Conclusions. Long-term graft survival can be improved using a model that allocates livers based on both donor and recipient quality, and the interaction between donor and recipient quality is an important predictor of graft survival. Considerations for incorporation into a continuous distribution model are discussed.
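
To illustrate the general form of the optimization step described in the Methods, the sketch below sets up a small mixed integer program that pairs donors and candidates by quality group so as to maximize total expected graft survival. This is not the authors' implementation: the donor and candidate group assignments and the survival matrix are hypothetical placeholders, and the PuLP solver is an assumed tool; in the study, the survival estimates at 5 and 8 y would come from the Cox proportional hazards models fitted to OPTN data.

```python
# Minimal sketch of a mixed integer programming allocation model, assuming the
# PuLP library. All group assignments and survival probabilities are
# hypothetical placeholders, not values from the study.
import pulp

# Hypothetical quality groups (1 = lowest risk of graft loss, 5 = highest).
donor_groups = {"D1": 1, "D2": 3, "D3": 5}
candidate_groups = {"C1": 2, "C2": 1, "C3": 4}

# Hypothetical expected 5-y graft survival for each (donor group, recipient
# group) pair; in the study these would be derived from the Cox models.
survival = {(dg, cg): 0.95 - 0.05 * dg - 0.04 * cg
            for dg in range(1, 6) for cg in range(1, 6)}

# Binary decision variable: x[(d, c)] = 1 if donor d's liver goes to candidate c.
pairs = [(d, c) for d in donor_groups for c in candidate_groups]
x = pulp.LpVariable.dicts("assign", pairs, cat="Binary")

model = pulp.LpProblem("graft_survival_maximization", pulp.LpMaximize)

# Objective: total expected graft survival over all chosen pairings.
model += pulp.lpSum(
    survival[(donor_groups[d], candidate_groups[c])] * x[(d, c)] for d, c in pairs
)

# Each donor liver goes to at most one candidate, and each candidate
# receives at most one liver.
for d in donor_groups:
    model += pulp.lpSum(x[(d, c)] for c in candidate_groups) <= 1
for c in candidate_groups:
    model += pulp.lpSum(x[(d, c)] for d in donor_groups) <= 1

model.solve(pulp.PULP_CBC_CMD(msg=False))

for d, c in pairs:
    if x[(d, c)].value() == 1:
        print(f"{d} (group {donor_groups[d]}) -> {c} (group {candidate_groups[c]})")
```

A full allocation model of this kind would replace the toy survival matrix with model-based estimates at 5 and 8 y and add policy constraints (for example, bounds on which candidate groups may be left untransplanted), which is where the trade-offs discussed in the Conclusions would enter.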