Frontiers in Immunology (Jun 2023)
Maximizing utility of nondirected living liver donor grafts using machine learning
Abstract
Objective
There is an unmet need for optimizing hepatic allograft allocation from nondirected living liver donors (ND-LLD).
Materials and methods
Using OPTN living donor liver transplant (LDLT) data (1/1/2000-12/31/2019), we identified 6328 LDLTs (4621 right, 644 left, and 1063 left-lateral grafts). Random forest survival models were constructed to predict 10-year graft survival for each of the three graft types.
Results
Donor-to-recipient body surface area ratio was an important predictor in all three models. Other predictors common to all three models were malignant diagnosis, medical location at LDLT (inpatient/ICU), and moderate ascites. Biliary atresia was important in the left and left-lateral graft models, and re-transplantation was important in the right graft model. C-indices for 10-year graft survival predictions were 0.70 (left-lateral), 0.63 (left), and 0.61 (right); similar C-indices were found for 1-, 3-, and 5-year graft survival. Comparing model predictions with actual 10-year graft survival demonstrated that the predicted upper-quartile survival group in each model had significantly better actual 10-year graft survival than the lower quartiles (p<0.005).
Conclusion
When applied in a clinical context, our models assist with the identification and stratification of potential recipients for hepatic grafts from ND-LLD based on predicted graft survival, while accounting for complex donor-recipient interactions. These analyses highlight the unmet need for granular data collection and machine learning modeling to identify the potential recipients with the best predicted transplant outcomes for ND-LLD grafts.
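To illustrate the kind of modeling and evaluation the abstract describes, the sketch below fits a random survival forest and computes Harrell's C-index. It is a minimal, hypothetical example only: it uses scikit-survival's RandomSurvivalForest on synthetic data with made-up covariates, not the OPTN cohort, variable set, or tuning used by the authors.

```python
# Minimal illustrative sketch (not the authors' pipeline): random survival forest
# plus C-index evaluation, on synthetic data with hypothetical covariates.
import numpy as np
from sksurv.ensemble import RandomSurvivalForest
from sksurv.metrics import concordance_index_censored
from sksurv.util import Surv

rng = np.random.default_rng(0)
n = 500  # toy cohort size, not the 6328 LDLTs analyzed in the study

# Hypothetical donor/recipient covariates (e.g., BSA ratio, ascites grade).
X = rng.normal(size=(n, 5))
time = rng.exponential(scale=8.0, size=n)   # years to graft failure or censoring
event = rng.random(n) < 0.4                 # True = graft failure observed
y = Surv.from_arrays(event=event, time=time)

# The paper builds one model per graft type; a single toy model is shown here.
rsf = RandomSurvivalForest(n_estimators=200, min_samples_leaf=15, random_state=0)
rsf.fit(X, y)

# Harrell's C-index, analogous to the 0.61-0.70 values reported in the abstract.
risk = rsf.predict(X)
cindex = concordance_index_censored(event, time, risk)[0]
print(f"C-index: {cindex:.2f}")
```

In practice, predicted risk scores like these could be split into quartiles and compared against observed graft survival, mirroring the upper- versus lower-quartile comparison reported above.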