Data processing model for the CDF experiment
IEEE Trans.Nucl.Sci. 53:2897-2906, 2006
Main Authors | |
---|---|
Format | Journal Article |
Language | English |
Published | 05.06.2006 |
Summary: | IEEE Trans.Nucl.Sci. 53:2897-2906, 2006. The data processing model for the CDF experiment is described. Data processing reconstructs events from parallel data streams taken with different combinations of physics event triggers and further splits the events into specialized physics datasets. The design of the processing control system faces strict requirements on bookkeeping records, which trace the status of data files and event contents during processing and storage. The computing architecture was updated to handle the mass data flow of Run II data collection, recently upgraded to a maximum rate of 40 MByte/sec. The data processing facility consists of a large cluster of Linux computers, with data movement to a multi-petabyte Enstore tape library managed by the CDF data handling system. The latest processing cycle has achieved a stable speed of 35 MByte/sec (3 TByte/day) and can be readily scaled by increasing CPU and data-handling capacity as required. |
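The throughput figures in the abstract are mutually consistent, which a quick back-of-the-envelope calculation confirms. The sketch below is illustrative only, not from the paper; it assumes decimal SI units (1 TByte = 10^6 MByte), since the abstract does not distinguish MB from MiB.

```python
# Sanity-check the quoted rates: 35 MByte/sec sustained over one day
# should come out near the quoted 3 TByte/day.

SECONDS_PER_DAY = 24 * 60 * 60  # 86400 seconds


def daily_volume_tbyte(rate_mbyte_per_sec: float) -> float:
    """Convert a sustained rate in MByte/sec to TByte/day (SI units)."""
    return rate_mbyte_per_sec * SECONDS_PER_DAY / 1_000_000


# 35 MByte/sec -> ~3.02 TByte/day, matching the quoted 3 TByte/day
print(daily_volume_tbyte(35.0))

# 40 MByte/sec (the upgraded peak collection rate) -> ~3.46 TByte/day,
# indicating the processing farm runs close to the peak input rate
print(daily_volume_tbyte(40.0))
```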
Bibliography: | FERMILAB-PUB-06-169-CD-E |
DOI: | 10.48550/arxiv.physics/0606042 |