Components in the Pipeline
Published in | IEEE Software, vol. 28, no. 3, pp. 34-40 |
---|---|
Main Authors | , , , |
Format | Journal Article |
Language | English |
Published | Los Alamitos: IEEE / IEEE Computer Society, 01.05.2011 |
Summary | State-of-the-art scientific instruments and simulations routinely produce massive datasets requiring intensive processing to disclose key features of the artifact or model under study. Scientists commonly call these data-processing pipelines, which are structured according to the pipe-and-filter architecture pattern.1 Different stages typically communicate using files; each stage is an executable program that performs the processing needed at that point in the pipeline. The MeDICi (Middleware for Data-Intensive Computing) Integration Framework supports constructing complex software pipelines from distributed heterogeneous components and controlling qualities of service to meet performance, reliability, and communication requirements. |
ISSN | 0740-7459; 1937-4194 |
DOI | 10.1109/MS.2011.23 |
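
The summary above describes the generic pipe-and-filter style in which pipeline stages are independent programs that communicate through files. Below is a minimal illustrative sketch of that pattern in Python, with hypothetical stage names (acquire, transform, summarize) and synthetic data; it demonstrates the architectural idea only and does not use or represent the MeDICi Integration Framework API.

```python
# A minimal, self-contained sketch of the pipe-and-filter pattern described
# in the abstract: each stage is an independent "filter" that reads an input
# file, processes it, and writes an output file consumed by the next stage.
# Stage names and data are illustrative; this is NOT the MeDICi API.

import json
from pathlib import Path
from typing import Callable, List

# A filter maps an input file path to an output file path.
Filter = Callable[[Path, Path], None]


def acquire(src: Path, dst: Path) -> None:
    """Stage 1: produce raw 'instrument' readings (synthetic numbers here)."""
    dst.write_text(json.dumps([1.0, 4.0, 9.0, 16.0]))


def transform(src: Path, dst: Path) -> None:
    """Stage 2: apply a per-record computation (square root)."""
    data = json.loads(src.read_text())
    dst.write_text(json.dumps([x ** 0.5 for x in data]))


def summarize(src: Path, dst: Path) -> None:
    """Stage 3: reduce the processed records to a single summary value."""
    data = json.loads(src.read_text())
    dst.write_text(json.dumps({"mean": sum(data) / len(data)}))


def run_pipeline(stages: List[Filter], workdir: Path) -> Path:
    """Run the stages in order, wiring each stage's output file into the
    next stage's input file (the 'pipe' between filters)."""
    workdir.mkdir(parents=True, exist_ok=True)
    current = workdir / "stage0.json"   # placeholder input for the first stage
    for i, stage in enumerate(stages, start=1):
        next_file = workdir / f"stage{i}.json"
        stage(current, next_file)
        current = next_file
    return current


if __name__ == "__main__":
    final = run_pipeline([acquire, transform, summarize], Path("pipeline_work"))
    print(final.read_text())   # -> {"mean": 2.5}
```

In a pipeline of the kind the article targets, each stage would typically be a separate executable, possibly running on a different machine, and middleware such as MeDICi would handle the wiring, distribution, and quality-of-service concerns rather than the simple in-process loop shown here.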