Healthcare Informatics Research (Jan 2022)

ANNO: A General Annotation Tool for Bilingual Clinical Note Information Extraction

  • Kye Hwa Lee,
  • Hyunsung Lee,
  • Jin-Hyeok Park,
  • Yi-Jun Kim,
  • Youngho Lee

DOI
https://doi.org/10.4258/hir.2022.28.1.89
Journal volume & issue
Vol. 28, no. 1
pp. 89 – 94

Abstract


Objectives: This study was conducted to develop a generalizable tool for annotating complex bilingual clinical text, which led to the design and development of the clinical text annotation tool ANNO.

Methods: We designed ANNO to support human annotators in annotating information in clinical documents efficiently and accurately. First, annotations for different classes (word or phrase types) can be tagged according to word type using the dictionary function. In addition, differences can be evaluated and reconciled by comparing annotation results across human annotators. Moreover, if the regular expression set for a class is updated during annotation, the update is automatically reflected in new documents. The regular expression set created by human annotators is designed such that a word tagged once is automatically labeled in new documents.

Results: Because ANNO is a Docker-based web application, users can run it freely without dependency issues. Human annotators can share their annotation markups as regular expression sets with a dictionary structure, and they can cross-check their annotated corpora with each other. The main features of ANNO are its dictionary-based regular expression sharing function, its cross-checking function for annotators, and its standardized input (Microsoft Excel) and output (Extensible Markup Language [XML]) formats.

Conclusions: With the growing need for large volumes of annotated clinical data to support the development of machine learning models, we expect ANNO to be helpful to many researchers.
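The abstract does not include implementation details, but the dictionary-based auto-labeling idea it describes (a regular expression set per annotation class, reapplied to new documents) can be sketched roughly as follows. This is a minimal illustrative sketch, not ANNO's actual code; the class names, example patterns, and function name are hypothetical.

    import re

    # Hypothetical dictionary: each annotation class maps to the regular
    # expressions that annotators have tagged so far (assumed examples).
    class_patterns = {
        "Medication": [r"\bmetformin\b", r"\baspirin\b"],
        "Dosage": [r"\b\d+\s?mg\b"],
    }

    def auto_label(text, patterns_by_class):
        """Return (start, end, matched text, class) spans for every stored pattern."""
        spans = []
        for label, patterns in patterns_by_class.items():
            for pattern in patterns:
                for match in re.finditer(pattern, text, flags=re.IGNORECASE):
                    spans.append((match.start(), match.end(), match.group(), label))
        return sorted(spans)

    print(auto_label("Started metformin 500 mg daily.", class_patterns))
    # [(8, 17, 'metformin', 'Medication'), (18, 24, '500 mg', 'Dosage')]

In this sketch, updating the pattern dictionary during annotation would immediately change how subsequent documents are pre-labeled, which mirrors the behavior described in the Methods section.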

Keywords