Scientific Reports (Nov 2024)
Predicting early mortality in hemodialysis patients: a deep learning approach using a nationwide prospective cohort in South Korea
Abstract
Early mortality after hemodialysis (HD) initiation significantly impacts the longevity of HD patients. This study aimed to quantify the effect sizes of risk factors on mortality using various machine learning approaches. A cohort of 3284 HD patients from the CRC-ESRD (2008–2014) was analyzed. Mortality risk models were validated using logistic regression, ridge regression, lasso regression, and decision trees, as well as ensemble methods such as bagging and random forest. To better handle missing data and time-series variables, a recurrent neural network (RNN) with an autoencoder was also developed. Additionally, survival models predicting hazard ratios were built using survival analysis techniques. The analysis included 1750 prevalent and 1534 incident HD patients (mean age 58.4 ± 13.6 years, 59.3% male). Over a median follow-up of 66.2 months, the overall mortality rate was 19.3%. Random forest models achieved an AUC of 0.8321 for first-year mortality prediction, which was further improved by the RNN with autoencoder (AUC 0.8357). The survival bagging model had the highest hazard ratio predictability (C-index 0.7756). A shorter dialysis duration (< 14.9 months) and high modified Charlson comorbidity index scores (7–9) were associated with hazard ratios up to 7.76 (C-index 0.7693). Comorbidities were more influential than age in predicting early mortality. Monitoring dialysis adequacy (Kt/V), RAAS inhibitor use, and urine output is crucial for assessing early prognosis.
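As a minimal illustrative sketch (not the paper's actual pipeline, which used the CRC-ESRD cohort and an RNN with autoencoder), the random-forest first-year mortality prediction evaluated by AUC could look like the following. All feature names, coefficients, and data below are synthetic assumptions, loosely inspired by predictors named in the abstract (dialysis duration, comorbidity index, age):

```python
# Illustrative sketch only: synthetic data stands in for the CRC-ESRD cohort.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 1000
# Hypothetical predictors: dialysis duration (months),
# modified Charlson comorbidity index, age (years).
X = np.column_stack([
    rng.exponential(30.0, n),     # dialysis duration
    rng.integers(0, 10, n),       # comorbidity index
    rng.normal(58.4, 13.6, n),    # age
])
# Synthetic first-year mortality label driven by the same features
# (assumed effect directions: higher comorbidity up, longer duration down).
logit = -3.0 + 0.3 * X[:, 1] - 0.02 * X[:, 0]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"first-year mortality AUC: {auc:.3f}")
```

The same discrimination metric (AUC) is what the abstract reports for the random forest (0.8321) and the RNN with autoencoder (0.8357); the C-index quoted for the survival models is the analogous concordance measure for time-to-event predictions.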
Keywords