Middle East Journal of Cancer (Jul 2024)
Can Dietary Iron Bioavailability Influence Colorectal Cancer Risk and Prognosis?
Abstract
Colorectal cancer (CRC) stands apart from other malignancies because of its pronounced association with dietary patterns. Approximately 70% of all CRC cases arise sporadically, and suboptimal dietary and lifestyle choices can override certain predisposing factors, including a family history of the disease. To date, the most compelling dietary evidence for CRC risk has been attributed to heme iron, found predominantly in red and processed meats, although this form of iron constitutes only about 20% of total dietary iron. The human body maintains a remarkably intricate and tightly regulated iron homeostasis system owing to the deleterious consequences of both excessive and deficient serum iron levels, and dietary sources remain the sole means of replenishing iron losses. Despite the abundance of iron in many foods, its absorption, commonly referred to as bioavailability, is notably restricted by an array of dietary inhibitors and homeostatic mechanisms. Consequently, roughly 80% of ingested dietary iron is excreted in feces, resulting in fecal iron concentrations approximately tenfold higher than those found in most body tissues. Prolonged exposure of the colorectum to excessive fecal iron, combined with concurrent physiological alterations, can instigate oncogenic processes leading to CRC. Notably, despite their recognized significance in CRC pathology, dietary habits and lifestyle factors have been only sporadically integrated into predictive models, primarily those concerning CRC recurrence. Moreover, these models differ in the dietary components they include, rendering them not universally applicable. In light of these disparities, we postulate that incorporating bioavailable iron, in conjunction with hepcidin levels, may offer superior predictive value for CRC risk assessment, and herein we elucidate the scientific foundation supporting this hypothesis.