Data production models for the CDF experiment

Bibliographic Details
Published in: Eighth International Conference on High-Performance Computing in Asia-Pacific Region (HPCASIA'05), pp. 7 pp. - 145
Main Authors: Antos, J., Babik, M., Benjamin, D., Cabrera, S., Chan, A.W., Chen, Y.C., Coca, M., Cooper, B., Genser, K., Hatakeyama, K., Hou, S., Hsieh, T.L., Jayatilaka, B., Kraan, A.C., Lysak, R., Mandrichenko, I.V., Robson, A., Siket, M., Stelzer, B., Syu, J., Teng, P.K., Timm, S.C., Tomura, T., Vataga, E., Wolbers, S.A., Yeh, P.
Format: Conference Proceeding
Language: English
Published: IEEE, 2005

Summary: The data production for the CDF experiment is conducted on a large Linux PC farm designed to meet the needs of data collection at a maximum rate of 40 MByte/sec. We present two data production models that exploit advances in computing and communication technology. The first production farm is a centralized system that has achieved a stable data processing rate of approximately 2 TByte per day. The recently upgraded farm has been migrated to the SAM (Sequential Access to data via Metadata) data handling system. The software and hardware of the CDF production farms have been successful in providing large computing and data throughput capacity to the experiment.
ISBN: 9780769524863, 0769524869
DOI: 10.1109/HPCASIA.2005.30