The Journal of Pathology: Clinical Research (Mar 2022)

Semantic annotation for computational pathology: multidisciplinary experience and best practice recommendations

  • Noorul Wahab,
  • Islam M Miligy,
  • Katherine Dodd,
  • Harvir Sahota,
  • Michael Toss,
  • Wenqi Lu,
  • Mostafa Jahanifar,
  • Mohsin Bilal,
  • Simon Graham,
  • Young Park,
  • Giorgos Hadjigeorghiou,
  • Abhir Bhalerao,
  • Ayat G Lashen,
  • Asmaa Y Ibrahim,
  • Ayaka Katayama,
  • Henry O Ebili,
  • Matthew Parkin,
  • Tom Sorell,
  • Shan E Ahmed Raza,
  • Emily Hero,
  • Hesham Eldaly,
  • Yee Wah Tsang,
  • Kishore Gopalakrishnan,
  • David Snead,
  • Emad Rakha,
  • Nasir Rajpoot,
  • Fayyaz Minhas

DOI
https://doi.org/10.1002/cjp2.256
Journal volume & issue
Vol. 8, no. 2
pp. 116–128

Abstract

Recent advances in whole‐slide imaging (WSI) technology have led to the development of a myriad of computer vision and artificial intelligence‐based diagnostic, prognostic, and predictive algorithms. Computational Pathology (CPath) offers an integrated solution to utilise information embedded in pathology WSIs beyond what can be obtained through visual assessment. For automated analysis of WSIs and validation of machine learning (ML) models, annotations at the slide, tissue, and cellular levels are required. The annotation of key visual constructs in pathology images is therefore a central component of CPath projects. Improper annotations can result in algorithms that are hard to interpret and can potentially produce inaccurate and inconsistent results. Despite the crucial role of annotations in CPath projects, there are no well‐defined guidelines or best practices on how annotations should be carried out. In this paper, we address this shortcoming by presenting the experience and best practices acquired during the execution of a large‐scale annotation exercise involving a multidisciplinary team of pathologists, ML experts, and researchers as part of the Pathology image data Lake for Analytics, Knowledge and Education (PathLAKE) consortium. We present a real‐world case study along with examples of different types of annotations, a diagnostic algorithm, an annotation data dictionary, and annotation constructs. The analyses reported in this work highlight best practice recommendations that can be used as annotation guidelines over the lifecycle of a CPath project.
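
The abstract refers to annotations at the slide, tissue, and cellular levels and to an annotation data dictionary. As a purely illustrative aid, the following Python sketch shows one way such a structured annotation record with a controlled label vocabulary could be represented; all class, field, and label names are hypothetical assumptions and not the schema used in the paper.

```python
# Hypothetical sketch of a structured CPath annotation record; field names and
# label codes are illustrative assumptions, not the PathLAKE data dictionary.
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Tuple


class AnnotationLevel(Enum):
    SLIDE = "slide"    # whole-slide label, e.g. overall diagnosis
    TISSUE = "tissue"  # region-level outline, e.g. tumour area
    CELL = "cell"      # point or small polygon marking a single cell


# Example controlled vocabulary (data dictionary) mapping label codes to terms.
LABEL_DICTIONARY = {
    "TUM": "invasive tumour",
    "DCIS": "ductal carcinoma in situ",
    "NORM": "normal epithelium",
}


@dataclass
class Annotation:
    slide_id: str
    level: AnnotationLevel
    label_code: str  # must be a key of LABEL_DICTIONARY
    polygon: List[Tuple[float, float]] = field(default_factory=list)  # (x, y) vertices in slide pixels
    annotator: str = ""      # who drew the annotation
    reviewed: bool = False   # set True after pathologist verification

    def __post_init__(self) -> None:
        if self.label_code not in LABEL_DICTIONARY:
            raise ValueError(f"Unknown label code: {self.label_code}")


# Usage: a tissue-level tumour region drawn by one annotator, awaiting review.
region = Annotation(
    slide_id="case_001",
    level=AnnotationLevel.TISSUE,
    label_code="TUM",
    polygon=[(1200.0, 800.0), (1500.0, 820.0), (1480.0, 1100.0), (1210.0, 1050.0)],
    annotator="annotator_01",
)
print(LABEL_DICTIONARY[region.label_code], region.level.value, region.reviewed)
```

Validating each record against a shared vocabulary at creation time is one way to keep annotations consistent across a multidisciplinary team, in the spirit of the data dictionary described in the paper.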

Keywords