Computational and Structural Biotechnology Journal (Jan 2021)

Data processing workflow for large-scale immune monitoring studies by mass cytometry

  • Paulina Rybakowska,
  • Sofie Van Gassen,
  • Katrien Quintelier,
  • Yvan Saeys,
  • Marta E. Alarcón-Riquelme,
  • Concepción Marañón

Journal volume & issue
Vol. 19
pp. 3160–3175

Abstract


Mass cytometry is a powerful tool for deep immune monitoring studies. To ensure maximal data quality, careful experimental and analytical design is required. However, even in well-controlled experiments, variability introduced by the operator or the instrument can create artifacts that need to be corrected or removed from the data. Here we present a data processing pipeline that minimizes experimental artifacts and batch effects while improving data quality. Data preprocessing and quality control are carried out in an R pipeline using packages such as CATALYST for bead normalization and debarcoding, flowAI and flowCut for signal anomaly cleaning, AOF for file quality control, flowClean and flowDensity for gating, CytoNorm for batch normalization, and FlowSOM and UMAP for data exploration. As proper experimental design is key to obtaining good-quality events, we also include the sample processing protocol. Both the analytical and experimental pipelines are easy to scale up, so the workflow presented here is particularly suitable for large-scale, multicenter, multi-batch and retrospective studies.

Keywords