Journal of Pipeline Science and Engineering (Mar 2021)

Evolution of metal-loss severity criteria: Gaps and a path forward

  • Brian N. Leis

Journal volume & issue
Vol. 1, no. 1
pp. 51–62

Abstract

This paper considered integrity management (IM) based on the convergence of inspection technologies with those for defect assessment, as occurred circa 2000. A reformulation of the assessment criterion PCORRC, termed R-PCORRC, was presented, which embeds a function of the steel's strain-hardening response. Relative to full-scale tests on grades from B up through and beyond X100 (L690), this criterion was shown to be accurate to within 1%, with a low coefficient of variation of just 0.077. In contrast, B31G and Modified B31G showed significant bias and scatter. The highly selective, sometimes significant effect of defect width was shown to depend on the structural stiffness within the metal loss relative to that of the pipe surrounding it. Defect assessment based on ILI of real corrosion was considered next, to identify gaps in the practical application of this technology. It was apparent that improvements in MFL signal interpretation will help to resolve the issues with boxing criteria. While such concerns can currently be managed using the concept of plausible paths, they will diminish as feature size and interaction are better characterized. In such cases the predictive outcome of the plausible-paths concept will in the limit tend toward that of B31G. The need for and benefits of accurate, precise predictive schemes for metal-loss severity were then illustrated. It was shown that a conservative prediction of failure pressure gives rise to a non-conservative prediction of defect size, which in turn leads to a non-conservative re-inspection interval for a pipeline. Finally, it was shown that an accurate scheme can significantly reduce maintenance, with the related benefit of a reduced scope of field digs needed to demonstrate the viability of inspection-based IM.
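To make the severity-criterion arithmetic concrete, the sketch below implements the published Modified B31G (0.85dL) criterion and then inverts it to recover a critical defect depth, illustrating how a conservative failure-pressure model propagates into defect sizing and hence the re-inspection interval. This is a minimal sketch, not the paper's method: the pipe geometry, grade, and hydrotest pressure are illustrative assumptions, and the R-PCORRC formulation introduced in the paper is not reproduced here.

```python
import math

def modified_b31g_failure_pressure(D, t, L, d, smys):
    """Predicted failure pressure (MPa) of a metal-loss defect per the
    Modified B31G (0.85dL) criterion. D, t, L, d in mm; smys in MPa."""
    z = L**2 / (D * t)
    if z <= 50.0:
        m = math.sqrt(1.0 + 0.6275 * z - 0.003375 * z**2)  # Folias bulging factor
    else:
        m = 0.032 * z + 3.3
    s_flow = smys + 69.0  # flow stress: SMYS + 69 MPa (10 ksi)
    s_fail = s_flow * (1.0 - 0.85 * d / t) / (1.0 - 0.85 * (d / t) / m)
    return 2.0 * s_fail * t / D

def critical_depth(D, t, L, smys, p_target, tol=1e-6):
    """Invert the criterion: deepest defect of length L predicted to
    survive pressure p_target (bisection; pf falls monotonically with d)."""
    lo, hi = 0.0, t
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if modified_b31g_failure_pressure(D, t, L, mid, smys) > p_target:
            lo = mid
        else:
            hi = mid
    return lo

# Illustrative case (assumed values): X65 pipe, 610 mm OD, 9.5 mm wall,
# 200 mm-long defect at several depths.
D, t, L, smys = 610.0, 9.5, 200.0, 448.0
for d in (2.0, 4.0, 6.0):
    pf = modified_b31g_failure_pressure(D, t, L, d, smys)
    print(f"depth {d:.1f} mm -> predicted failure pressure {pf:.1f} MPa")

# A conservative criterion under-predicts failure pressure, so inverting it
# (e.g., after a hydrotest) yields a largest-surviving defect smaller than
# what could actually remain. Growth-to-critical is then computed from too
# small a starting flaw, over-stating the safe re-inspection interval --
# the non-conservative outcome described in the abstract.
p_hydrotest = 11.0  # MPa, illustrative
print(f"deepest surviving defect: {critical_depth(D, t, L, smys, p_hydrotest):.2f} mm")
```

The bisection relies on the fact that the predicted failure pressure decreases monotonically with defect depth, so any target pressure between the intact and through-wall limits maps to a unique critical depth.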

Keywords