Alexandria Engineering Journal (Jul 2024)

Enhancing AI interpretation and decision-making: Integrating cognitive computational models with deep learning for advanced uncertain reasoning systems

  • Franciskus Antonius Alijoyo,
  • S. Janani,
  • Kathari Santosh,
  • Safa N. Shweihat,
  • Nizal Alshammry,
  • Janjhyam Venkata Naga Ramesh,
  • Yousef A. Baker El-Ebiary

Journal volume & issue
Vol. 99, pp. 17–30

Abstract


Healthcare systems must handle the uncertainty inherent in patient data, which calls for sophisticated decision-making tools such as Uncertain Reasoning Systems (URS) to navigate ambiguity effectively. Conventional techniques, including statistical approaches and rule-based systems, often prove inadequate because their rigid frameworks offer only limited means of managing this inherent ambiguity, so advancing AI interpretation within URS beyond traditional methods is crucial. This paper proposes a methodology that integrates Min-Max normalization and robust missing-data handling with a hybrid of Fuzzy Rule-Based Systems and Neural Networks, supplemented by Game Theory for model refinement. Through Game Theory, the system can dynamically adjust its strategies to uncertainties in healthcare data, enhancing its resilience and efficacy. Implemented using Python tools, the proposed system achieves 99.4 % accuracy, surpassing baseline methods such as FNN (88.1 %) and Naïve Bayes (90 %), highlighting its superior performance in healthcare decision-making. These findings represent significant strides in AI interpretation and decision-making within Uncertain Reasoning Systems and underscore the practical relevance of the proposed approach.
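The preprocessing step named in the abstract can be sketched in a few lines of Python. This is a minimal illustration only: the paper does not specify its imputation strategy, so mean imputation is an assumption here, and the function names (`impute_mean`, `min_max_normalize`) are hypothetical.

```python
def impute_mean(column):
    """Replace None entries with the mean of the observed values
    (an assumed, simple missing-data handling strategy)."""
    observed = [v for v in column if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in column]

def min_max_normalize(column):
    """Min-Max normalization: scale values linearly into [0, 1]
    via x' = (x - min) / (max - min)."""
    lo, hi = min(column), max(column)
    if hi == lo:
        # Constant column: map everything to 0 to avoid division by zero.
        return [0.0 for _ in column]
    return [(v - lo) / (hi - lo) for v in column]

# Example: a vital-sign feature column with one missing reading.
raw = [120.0, None, 140.0, 160.0]
clean = min_max_normalize(impute_mean(raw))  # [0.0, 0.5, 0.5, 1.0]
```

Min-Max normalization is commonly paired with neural-network inputs because bounded, comparable feature scales tend to stabilize training; the fuzzy rule-based and game-theoretic components described in the paper would operate downstream of this step.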

Keywords