Data processing workflow for large-scale immune monitoring studies by mass cytometry
Published in: Computational and Structural Biotechnology Journal, Vol. 19, pp. 3160–3175
Format: Journal Article
Language: English
Published: Elsevier B.V. (Research Network of Computational and Structural Biotechnology), Netherlands, 01.01.2021
Summary:
Mass cytometry is a powerful tool for deep immune monitoring studies. To ensure maximal data quality, careful experimental and analytical design is required. However, even in well-controlled experiments, variability caused by either the operator or the instrument can introduce artifacts that need to be corrected or removed from the data. Here we present a data processing pipeline that minimizes experimental artifacts and batch effects while improving data quality. Data preprocessing and quality control are carried out in an R pipeline using packages such as CATALYST for bead normalization and debarcoding, flowAI and flowCut for signal anomaly cleaning, AOF for file quality control, flowClean and flowDensity for gating, CytoNorm for batch normalization, and FlowSOM and UMAP for data exploration. As proper experimental design is key to obtaining good-quality events, we also include the sample processing protocol. Both the analytical and experimental pipelines are easy to scale up, so the workflow presented here is particularly suitable for large-scale, multicenter, multibatch and retrospective studies.
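As a companion to the summary, the sketch below strings the named packages together in the order the abstract describes: bead normalization and debarcoding with CATALYST, signal anomaly cleaning with flowAI and flowCut, and batch normalization with CytoNorm. This is a minimal illustration under stated assumptions, not the authors' published pipeline: file paths, the barcode key, batch labels, and the channel list are hypothetical placeholders, and parameter values are package defaults or README examples. The AOF file quality control and flowDensity gating steps are omitted.

```r
## Minimal sketch of the preprocessing chain described in the summary.
## Paths, barcode key, batch labels, and channels are hypothetical placeholders.
library(flowCore)
library(CATALYST)
library(flowAI)
library(flowCut)
library(CytoNorm)

## 1) Bead normalization and debarcoding with CATALYST.
fs  <- read.flowSet(path = "raw_fcs", pattern = "\\.fcs$")  # hypothetical folder
sce <- prepData(fs, transform = FALSE)
sce <- normCytof(sce, beads = "dvs", remove_beads = TRUE)$data
data(sample_key, package = "CATALYST")  # example Pd barcoding key shipped with CATALYST
sce <- assignPrelim(sce, bc_key = sample_key)
sce <- applyCutoffs(estCutoffs(sce))
fs_samples <- sce2fcs(sce, split_by = "bc_id")  # one flowFrame per sample

## 2) Signal anomaly cleaning: flowAI over the set, flowCut per frame.
fs_samples <- flow_auto_qc(fs_samples)
fs_samples <- fsApply(fs_samples, function(ff) flowCut(ff)$frame)

## 3) Batch normalization with CytoNorm, trained on reference samples
##    acquired in every batch (file names below are hypothetical anchors).
channels <- c("CD45", "CD3", "CD4", "CD8", "CD19")  # hypothetical panel subset
tf     <- transformList(channels, cytofTransform)
tf_rev <- transformList(channels, cytofTransform.reverse)
model <- CytoNorm.train(
  files          = c("ref_batch1.fcs", "ref_batch2.fcs"),
  labels         = c("batch1", "batch2"),
  channels       = channels,
  transformList  = tf,
  FlowSOM.params = list(nCells = 1e5, xdim = 10, ydim = 10,
                        nClus = 10, scale = FALSE),
  normParams     = list(nQ = 101),
  seed           = 1)
CytoNorm.normalize(model,
  files                 = c("sample_batch1.fcs", "sample_batch2.fcs"),
  labels                = c("batch1", "batch2"),
  transformList         = tf,
  transformList.reverse = tf_rev,
  outputDir             = "Normalized")
```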
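For the exploration step the summary mentions FlowSOM and UMAP. A small follow-on sketch, assuming the normalized files produced above, might look like this; the input file name and channel list are again placeholders.

```r
## Exploration of normalized data: FlowSOM clustering plus a UMAP embedding.
library(flowCore)
library(FlowSOM)
library(uwot)

channels <- c("CD45", "CD3", "CD4", "CD8", "CD19")    # hypothetical, as above
ff <- read.FCS("Normalized/Norm_sample_batch1.fcs")   # assumed CytoNorm output
fsom <- FlowSOM(ff, colsToUse = channels, nClus = 10, seed = 1,
                transform = TRUE, toTransform = channels,
                transformFunction = arcsinhTransform(a = 0, b = 1/5))

## 2D embedding of arcsinh-transformed expression, colored by metacluster.
expr <- asinh(exprs(ff)[, channels] / 5)
emb  <- umap(expr, n_neighbors = 15, min_dist = 0.2)
meta <- GetMetaclusters(fsom)
plot(emb, col = as.integer(meta), pch = ".",
     xlab = "UMAP1", ylab = "UMAP2")
```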
ISSN: 2001-0370
DOI: 10.1016/j.csbj.2021.05.032