Data processing model for the CDF experiment

Journal Article

The data processing model for the CDF experiment is described. Data processing reconstructs events from parallel data streams taken with different combinations of physics event triggers and further splits the events into datasets of specialised physics interests. The design of the processing control system imposes strict requirements on bookkeeping records, which trace the status of data files and event contents during processing and storage. The computing architecture was updated to meet the mass data flow of the Run II data collection, recently upgraded to a maximum rate of 40 MByte/sec. The data processing facility consists of a large cluster of Linux computers with data movement managed by the CDF data handling system to a multi-petaByte Enstore tape library. The latest processing cycle has achieved a stable speed of 35 MByte/sec (3 TByte/day). It can be readily scaled by increasing CPU and data-handling capacity as required. © 2006 IEEE.
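The quoted sustained rate and daily volume are consistent with each other; a minimal sketch of the arithmetic (illustrative only, not code from the paper):

```python
# Sanity-check the abstract's quoted throughput: a sustained
# 35 MByte/sec corresponds to roughly 3 TByte/day.
rate_mbyte_per_sec = 35
seconds_per_day = 24 * 60 * 60                      # 86,400 s
tbyte_per_day = rate_mbyte_per_sec * seconds_per_day / 1_000_000
print(f"{tbyte_per_day:.2f} TByte/day")             # ≈ 3.02
```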

Cited Authors

  • Antos, J; Babik, M; Benjamin, D; Cabrera, S; Chan, AW; Chen, YC; Coca, M; Cooper, B; Farrington, S; Genser, K; Hatakeyama, K; Hou, S; Hsieh, TL; Jayatilaka, B; Jun, SY; Kotwal, AV; Kraan, AC; Lysak, R; Mandrichenko, IV; Murat, P; Robson, A; Savard, P; Siket, M; Stelzer, B; Syu, J; Teng, PK; Timm, SC; Tomura, T; Vataga, E; Wolbers, SA

Published Date

  • October 1, 2006

Published In

  • IEEE Transactions on Nuclear Science

Volume / Issue

  • 53 / 5

Start / End Page

  • 2897 - 2906

International Standard Serial Number (ISSN)

  • 0018-9499

Digital Object Identifier (DOI)

  • 10.1109/TNS.2006.881908

Citation Source

  • Scopus