Data processing model for the CDF experiment
The data processing model for the CDF experiment is described. Data processing reconstructs events from parallel data streams taken with different combinations of physics event triggers and further splits the events into datasets of specialised physics interests. The design of the processing control system imposes strict requirements on bookkeeping records, which trace the status of data files and event contents during processing and storage. The computing architecture was updated to meet the mass data flow of the Run II data collection, recently upgraded to a maximum rate of 40 MByte/sec. The data processing facility consists of a large cluster of Linux computers, with data movement managed by the CDF data handling system to a multi-petabyte Enstore tape library. The latest processing cycle has achieved a stable speed of 35 MByte/sec (3 TByte/day). It can be readily scaled by increasing CPU and data-handling capacity as required. © 2006 IEEE.
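The abstract's throughput figures are mutually consistent: a sustained 35 MByte/sec accumulates to roughly 3 TByte over a day. A minimal sketch of that conversion (the function name and decimal TByte convention are illustrative assumptions, not from the paper):

```python
# Sanity check of the throughput figures quoted in the abstract:
# a sustained 35 MByte/sec corresponds to roughly 3 TByte/day.
SECONDS_PER_DAY = 24 * 60 * 60  # 86400

def daily_volume_tbyte(rate_mbyte_per_sec: float) -> float:
    """Convert a sustained data rate in MByte/sec to TByte/day (decimal units)."""
    return rate_mbyte_per_sec * SECONDS_PER_DAY / 1_000_000

print(f"{daily_volume_tbyte(35):.2f} TByte/day")  # ≈ 3.02 TByte/day
```

The quoted 40 MByte/sec maximum acquisition rate would similarly correspond to about 3.5 TByte/day, so the 35 MByte/sec processing speed keeps pace with data collection.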
Antos, J; Babik, M; Benjamin, D; Cabrera, S; Chan, AW; Chen, YC; Coca, M; Cooper, B; Farrington, S; Genser, K; Hatakeyama, K; Hou, S; Hsieh, TL; Jayatilaka, B; Jun, SY; Kotwal, AV; Kraan, AC; Lysak, R; Mandrichenko, IV; Murat, P; Robson, A; Savard, P; Siket, M; Stelzer, B; Syu, J; Teng, PK; Timm, SC; Tomura, T; Vataga, E; Wolbers, SA