Components in the Pipeline

Bibliographic Details
Published in: IEEE Software, Vol. 28, No. 3, pp. 34-40
Main Authors: Gorton, I.; Wynne, A.; Liu, Yan; Yin, Jian
Format: Journal Article
Language: English
Published: Los Alamitos: IEEE Computer Society, 01.05.2011

Summary: State-of-the-art scientific instruments and simulations routinely produce massive datasets requiring intensive processing to disclose key features of the artifact or model under study. Scientists commonly call these data-processing pipelines, which are structured according to the pipe-and-filter architecture pattern [1]. Different stages typically communicate using files; each stage is an executable program that performs the processing needed at that point in the pipeline. The MeDICi (Middleware for Data-Intensive Computing) Integration Framework supports constructing complex software pipelines from distributed, heterogeneous components and controlling qualities of service to meet performance, reliability, and communication requirements.
ISSN: 0740-7459, 1937-4194
DOI: 10.1109/MS.2011.23
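
To make the pipe-and-filter structure described in the summary concrete, the following is a minimal sketch of a pipeline whose stages communicate through files, with each stage handing its output file to the next stage as input. All names here (run_pipeline, the toy filter stages, the file paths) are hypothetical illustrations of the pattern; they are not the MeDICi Integration Framework API, which the abstract does not detail.

```python
# Minimal pipe-and-filter sketch: each stage reads an input file and writes an
# output file; a simple runner chains the stages together. Hypothetical names,
# not the MeDICi API.
from pathlib import Path
from typing import Callable, List

Stage = Callable[[Path, Path], None]  # (input_file, output_file) -> None

def run_pipeline(stages: List[Stage], source: Path, work_dir: Path) -> Path:
    """Run each stage in order; the previous stage's output file becomes
    the next stage's input file."""
    work_dir.mkdir(parents=True, exist_ok=True)
    current = source
    for i, stage in enumerate(stages):
        out = work_dir / f"stage_{i}.out"
        stage(current, out)   # each stage is an independent processing step
        current = out         # file-based handoff to the next stage
    return current

# Two toy filter stages standing in for real processing steps.
def normalize(inp: Path, out: Path) -> None:
    """Collapse runs of whitespace within each line."""
    out.write_text("\n".join(" ".join(line.split())
                             for line in inp.read_text().splitlines()))

def drop_empty(inp: Path, out: Path) -> None:
    """Remove empty lines."""
    out.write_text("\n".join(line for line in inp.read_text().splitlines()
                             if line))

if __name__ == "__main__":
    src = Path("raw_data.txt")
    src.write_text("  a  b \n\n c   d ")
    result = run_pipeline([normalize, drop_empty], src, Path("pipeline_work"))
    print(result.read_text())  # -> "a b\nc d"
```

In a framework such as the one the abstract describes, each stage would typically be an independently deployable executable rather than an in-process function, and the runner would additionally manage distribution, monitoring, and quality-of-service concerns; the sketch only illustrates the file-based stage chaining.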