WLCG scale testing during CMS data challenges

Oliver Gutsche, Csaba Hajdu

Research output: Contribution to journal › Article


Abstract

The CMS computing model for processing and analyzing LHC collision data follows a data-location-driven approach and uses the WLCG infrastructure to provide access to Grid resources. In preparation for data taking, CMS tests its computing model during dedicated data challenges. An important part of these challenges is the test of user analysis, which poses a special challenge for the infrastructure because of its randomly distributed access patterns. The CMS Remote Analysis Builder (CRAB) handles all interactions with the WLCG infrastructure transparently for the user. During the 2006 challenge, CMS set itself the goal of testing the infrastructure at a scale of 50,000 user jobs per day using CRAB. Both direct submissions by individual users and automated submissions by robots were used to achieve this goal. A report will be given on the outcome of the user-analysis part of the challenge on both the EGEE and OSG parts of the WLCG. In particular, the difference in submission between the two Grid middlewares (resource broker vs. direct submission) will be discussed. Finally, an outlook on the 2007 data challenge is given.
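
For context, CRAB job submission in this era was driven by a plain-text configuration file. The sketch below is a minimal, illustrative example in the crab.cfg style of CRAB 2; the section and parameter names follow that convention, while the dataset path, parameter-set file, and job counts are placeholders, not values taken from the paper. The scheduler setting is the knob that selects between the two submission paths the abstract contrasts: routing through the EGEE resource broker versus direct submission to OSG sites.

    [CRAB]
    jobtype   = cmssw
    # 'glite' routes jobs through the EGEE resource broker;
    # 'condor_g' submits directly to OSG Computing Elements.
    scheduler = glite

    [CMSSW]
    # Placeholder dataset and CMSSW parameter set for illustration only.
    datasetpath            = /PlaceholderDataset/PlaceholderEra/RECO
    pset                   = analysis.cfg
    total_number_of_events = -1
    number_of_jobs         = 100

    [USER]
    # Ship output to a storage element rather than back in the sandbox.
    return_data = 0
    copy_data   = 1

With such a file in place, a user (or a submission robot) would run something like "crab -create" followed by "crab -submit" to split the task into Grid jobs and send them to sites hosting the dataset, in keeping with the data-location-driven model described above.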

Original language: English
Article number: 062033
Journal: Journal of Physics: Conference Series
Volume: 119
Issue number: 6
DOIs
Publication status: Published - Jul 1 2008

ASJC Scopus subject areas

  • Physics and Astronomy (all)
