1 STAR Year 2001 Data and Analysis
  RHIC performance
  STAR Trigger
  Recorded Event Statistics
  Offline Production
  pp Running
  Summary/Outlook
Thomas Ullrich, Nov 30, 2001

2 RHIC – Performance: 2001 compared to 2000 (STAR, 11/30/2001)
Run 2000: √s = 130 GeV, 6 → 56 bunches, β* ~ 5 m, L ≈ 2·10^25 cm⁻²s⁻¹ (10% of design)
Run 2001: √s = 200 GeV, 56 bunches, β* ~ 5 → 3 m, L ≈ 2·10^26 cm⁻²s⁻¹
STAR 2000: ~8.5·10^6 sec of running; ∫L dt ≈ 19 × year 2000
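To give the luminosity figures above some feel, here is a minimal check of the hadronic Au+Au interaction rate they imply, using the σ ≈ 7.2 b hadronic cross-section quoted later in these slides; the function name is ours, not STAR software:

```python
# Hadronic Au+Au interaction rate implied by the quoted luminosities.
# sigma_hadronic = 7.2 b is the value used later in this talk.

BARN_CM2 = 1e-24  # 1 barn in cm^2


def interaction_rate(lumi_cm2_s, sigma_barn):
    """Rate = L * sigma, in interactions per second."""
    return lumi_cm2_s * sigma_barn * BARN_CM2


rate_2000 = interaction_rate(2e25, 7.2)  # Run 2000 peak luminosity
rate_2001 = interaction_rate(2e26, 7.2)  # Run 2001 peak (design) luminosity

print(round(rate_2000), round(rate_2001))  # 144 1440
```

So the tenfold luminosity gain between the runs corresponds to going from roughly 140 Hz to 1.4 kHz of hadronic interactions at peak.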

3 Event Selection – The STAR Trigger in 2001
Considerably more complex than in 2000:
- New detectors and components
  - TDC in ZDC → estimate of z-vertex [cut |z| < (25) 35 cm]
  - SVT noise → ~2 MB for empty events → low rates at low N_ch
  - L3 trigger (rare probes)
  - ToF, FTPC, EMC: no impact on event selection (almost none for EMC)
- Optimization of event selection
  - Cope with 3½ orders of magnitude in multiplicity: UPC (1) → central (5000)
  - Maximize "useful" events (in terms of physics analysis)
  - Trigger setup depends on rates (e.g. L3 only useful at high intensities)
  - Minimize bias: ZDC z-vertex cut (L0), L3, etc.
  - Rate studies and tests before implementation of triggers
The Trigger Board (chair: Tonko) addresses the technical questions (see http://www.star.bnl.gov/protected/trgboard/index.html).
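The ZDC TDC z-vertex estimate mentioned above comes from the east-west arrival-time difference: a vertex displaced toward one ZDC makes that side's signal arrive earlier. A sketch of the idea, with an illustrative sign convention and timescale rather than STAR's actual TDC calibration:

```python
# Sketch of a timing-based z-vertex estimate (illustrative, not the
# actual STAR TDC scale or sign convention).

C_CM_PER_NS = 29.98  # speed of light in cm/ns


def zdc_vertex_z(t_east_ns, t_west_ns):
    """z-vertex (cm) from the east-west ZDC time difference."""
    return 0.5 * C_CM_PER_NS * (t_east_ns - t_west_ns)


def passes_vertex_cut(t_east_ns, t_west_ns, z_cut_cm=35.0):
    """Apply the |z| < 35 cm cut used by the 2001 trigger."""
    return abs(zdc_vertex_z(t_east_ns, t_west_ns)) < z_cut_cm


print(zdc_vertex_z(101.0, 100.0))       # ~ +15 cm: inside the cut
print(passes_vertex_cut(103.0, 100.0))  # ~ +45 cm: rejected
```

A 1 ns timing difference corresponds to about 15 cm in z, which is why TDC timing is precise enough to enforce a 35 cm vertex window at Level 0.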

4 STAR Trigger (continued)
The setup of the trigger was a learning process! The following trigger sets (aka trigger groups or trigger configurations) were used in Au+Au; a trigger set contains several triggers (a trigger mix):
15PerCTB, 15PerCTBVertex, Central, centralTopo, combinedPedestal, Cosmic, ctbSum, CTBWindow, emcPedestal, EMCtierTEST, hanks, hanks_emc, L3Central, L3Central1200, L3Central1200UPC, L3Central600, L3Central600UPC, pedTest, laser, MediumBias, MinBias, MinBiasPreVertex, MinBiasSvtPulser, MinBiasVertex, MinBiasVertexCTB100SVT, MinBiasVertexSVTTest2, MinBiasVtxCTB75noSVT, MinBiasVtxCTB75noSVT33k, MinBiasVtxMip15noSVT, pedAsPhys, pedestal, pedestal-obs-7-24, physics, physics-obs-7-24, productionCentral, productionCentral600, productionCentral1200, productionCentralKiller, productionCentralNoUPC, ProductionMinBias, pulser, SMDTest, SVTRawPulser, test, testCentral, topology, trgPed, UncalibratedMix

5 STAR Trigger: two examples

Trigger Set: productionCentral
  hi-mult & ZDC      1102
  hi-mult            1101
  Hadronic central   1100
  TOPO               3001
  TOPO & ZDC         3002
  TOPO Efficiency    3011
  pulserSVT          F101
  Laser              F200

Trigger Set: productionMinBias
  hi-mult            1101
  UPC minbias        1001
  Hadronic minbias   1000
  pulserSVT          F101
  Laser              F200

In red (on the original slide): the triggers counted for the "physics" stats.

6 STAR Trigger: Usage in Analysis
- Check and understand the trigger definitions for the type of events you want to study.
  Run log browser: http://online.star.bnl.gov/RunLog2001/
- You might need more than one trigger word to assemble your dataset:
  - In a few cases the trigger word changed its meaning.
  - Do not assume the same trigger word describes the same thing in all trigger sets.
  - In a few cases the vertex cut and other parameters changed within the same trigger set (same configuration name) – this cannot be detected via the offline trigger info.
- Central triggers with L3 are biased.
  - The L3 group has lots of info on this on their web pages; there are also methods in StEvent.
- Dedicated web page: http://www.star.bnl.gov/STAR/html/all_l/trigger2001/
  - Contains description and details on the 2001 trigger.
  - FAQ (describing pitfalls), useful links and more.
  - Please help add useful info.
- And for the really critical stuff (cross-sections, multiplicity distributions, etc.) talk to the experts: Hank, Jeff, Tonko, Zhangbu, Falk.
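The advice about needing more than one trigger word can be sketched as a simple union-of-words selection. The event records here and the word 2000 are purely hypothetical; in a real analysis the words come from the offline trigger info and must be verified per trigger set in the run log:

```python
# Toy illustration: assembling one dataset from several trigger words.
# "Hadronic minbias" was word 1000 in productionMinBias; suppose
# (hypothetically) an earlier trigger set used word 2000 for an
# equivalent condition.

MINBIAS_WORDS = {1000, 2000}  # hypothetical set of acceptable words

events = [
    {"run": 1, "trigger_word": 1000},  # wanted
    {"run": 2, "trigger_word": 1101},  # hi-mult: not wanted here
    {"run": 3, "trigger_word": 2000},  # wanted (older trigger set)
]

dataset = [ev for ev in events if ev["trigger_word"] in MINBIAS_WORDS]
print(len(dataset))  # 2
```

The point is that filtering on a single word silently drops the runs taken under the older configuration; the set of acceptable words has to be built per trigger set, not assumed constant.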

7 Run 2001: "hadronic central" events

Trigger Set             Field A (+1)   Field ½A (+½)   Field ½B (-½)   Field B (-1)
central                       98,569               0               0          1,561
productionCentral          1,315,474               0         267,417      1,447,612
productionCentral600         204,163               0               0        181,921
productionCentral1200         63,597               0               0         28,311
Summary                    1,681,803               0         267,417      1,659,405

Total sum: 3,608,625 events (72% of the initially planned 5·10^6)
Used ZDC cut at 10% centrality (720 mbarn) → 5 μb⁻¹
RHIC: ∫L dt (hadronic) ≈ 34 μb⁻¹ → max ≈ 24·10^6 events
We recorded 15% of all (10% most central) collisions delivered by RHIC (consider that we ran central triggers only half the time and with |z| < 35 cm).
Nov 14 – Nov 24: 1.3·10^6 events. Record day, 11/23/2001: 211k hadronic central events.
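The "max ≈ 24·10^6" and "15%" figures above follow directly from the quoted cross-section and delivered luminosity; a worked check (variable names are ours):

```python
# Maximum number of 10%-central events implied by the delivered
# luminosity, and the recorded fraction quoted on this slide.

SIGMA_CENTRAL_B = 0.720    # 720 mbarn (10% most central), in barns
LUMI_INV_UB = 34.0         # RHIC-delivered hadronic int. luminosity, ub^-1

sigma_ub = SIGMA_CENTRAL_B * 1e6       # barns -> microbarns
n_max = sigma_ub * LUMI_INV_UB         # ~24.5 million possible events

recorded = 3_608_625                   # total from the table above
fraction = recorded / n_max            # the "15%" quoted

print(round(n_max), round(fraction, 2))  # 24480000 0.15
```

The same arithmetic gives the 5 μb⁻¹ sampled luminosity: 3.6·10^6 events / 0.72·10^6 μb ≈ 5 μb⁻¹.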

8 Run 2001: "hadronic minimum bias" events

Trigger Set               Field A (+1)   Field ½A (+½)   Field ½B (-½)   Field B (-1)
MinBias                         29,244          46,088          84,862        556,260
  corrected (0.7 × 0.2)          4,094           6,452          11,880         77,876
MinBiasVertex                        0               0         825,136      1,747,058
  corrected (0.7)                    0               0         577,595      1,237,724
productionMinBias            1,033,580               0               0      1,768,178
Summary (corrected)          1,037,674           6,452         589,475      3,083,778

Total sum: 4,717,379 events (94% of the initially planned 5·10^6)
Note: these are useful events, i.e. |z| < 35 cm and a vertex can be found.
Assume σ_observ ≈ 0.94 × 7.2 b = 6.7 b → 0.7 μb⁻¹
RHIC: ∫L dt (hadronic) ≈ 34 μb⁻¹ → max ≈ 227·10^6 events
We recorded 2% of all 'observable' hadronic interactions delivered by RHIC (observable means: a vertex can be found offline, but no |z| range is imposed).
Plus 0.5·10^6 min-bias events with zero field, and ?? usable events from the 19.6 GeV run.
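The summary row of this table is the per-field sum of the corrected MinBias and MinBiasVertex counts plus productionMinBias; a consistency check of those sums (columns are the four field settings +1, +½, −½, −1):

```python
# Verify the minimum-bias table: the Summary row sums the corrected
# counts per field setting, and the grand total matches 4,717,379.

minbias_corr = [4_094, 6_452, 11_880, 77_876]        # MinBias x 0.7 x 0.2
minbiasvtx_corr = [0, 0, 577_595, 1_237_724]         # MinBiasVertex x ~0.7
prod_minbias = [1_033_580, 0, 0, 1_768_178]          # productionMinBias

summary = [sum(col) for col in zip(minbias_corr, minbiasvtx_corr, prod_minbias)]
total = sum(summary)

print(summary, total)  # [1037674, 6452, 589475, 3083778] 4717379
```

The totals close exactly, which also confirms the column-to-field assignment used in the table reconstruction above.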

9 Run 2001: Things to keep in mind
- Lots of important info is not in the run log (sparse comments).
  - Volunteers needed: log book → run log
- Detectors in the data stream:
  - TPC – always
  - RICH – always
  - FTPC – mostly
  - pToF – mostly
  - SVT – with interruptions
  - EMC – only towards the end
  - SMD – no
  - FPD – rarely
- Detector hiccups:
  - TPC: the number of dead channels varies in time (RDO boards)
  - FTPC: sometimes only FTPC-West; sometimes a missing sector in FTPC-East
  - SVT: see Rene's talk earlier
  - EMC: see Alex's talk earlier

10 Example: Bad Sectors
230 < day < 253: RDO 21-3 bad
254 < day < 266: RDO 9-3 and 21-3 bad
266 < day: RDO 9-3 bad
(Run 2266012 – Art Poskanzer)
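For bookkeeping in an analysis, the day ranges above can be encoded as a small lookup. The boundary days (253/254 and 266) are ambiguous on the slide, so the literal inequalities are kept; the function name is ours:

```python
# Known-bad TPC RDO boards vs. day of year 2001, encoded exactly as
# stated on the slide. Boundary days are ambiguous there and fall
# through to "no documented problem".

def bad_rdos(day):
    """Return the list of known-bad TPC RDO boards for a given day."""
    if 230 < day < 253:
        return ["21-3"]
    if 254 < day < 266:
        return ["9-3", "21-3"]
    if day > 266:
        return ["9-3"]
    return []  # outside the documented ranges


print(bad_rdos(240), bad_rdos(260), bad_rdos(300))
# ['21-3'] ['9-3', '21-3'] ['9-3']
```

Such a map makes it easy to exclude or correct affected TPC sectors run-by-run instead of by hand.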

11 Fast Offline
Purpose: complete reconstruction of a part of the recorded events, processed in a timely fashion (hours) for QA.
- Set up, maintained, petted, and cursed by Jerome.
- Uses 7 nodes (14 CPUs) out of 124 CRS nodes.
- Uses up to 60% of a reserved 1 TB disk (purged frequently).
- Of 12·10^6 recorded events, 11% were processed over the entire period.
- Not always timely (days), depending on RCF.
  - Long term: use the online cluster sitting in the DAQ room.
- Fast Offline runs with the "dev" version of the software:
  - the latest status, but also the most recent bugs.
Please note: results from Fast Offline are NOT publishable (no stable chain, bugs in the chain, calibration pass not reliable).

12 Offline Test Runs
Short (test) productions as requested by detector subgroups via the period coordinators or the SAC (e.g. for ToF, L3, EMC, FTPC, calibration studies, the 19.6 GeV run).
- Managed by Lidia and Jerome.
- Competes with "standard" production on 117 (of 124) CRS nodes.
- So far ~2M events processed.
- Sometimes run in "dev", sometimes in "pro", depending on needs.
(Slide shows parts of Jerome's TODO list.)
Please note: results from offline test runs are NOT publishable (no stable chain, bugs in the chain, calibration pass not reliable).

13 Year 2001 Au+Au Reconstruction
- Current 'official' production version is P01gk.
- Original plan: produce only parts of the data and have people check data quality.
- Reconstruction of TPC, RICH, L3 only (ToF raw data on DST).
- So far: ~10^6 minimum-bias events.
- Up to now no requests from the PWGs for more…
- Next round:
  - might be the one for QM2002 in Nantes
  - aim for a complete production (i.e. all y2001 data)
  - include the FTPC (at least let's try hard)
  - require RICH to provide PID that can be used directly by any user (StRichPidTraits)
  - possibly new corrections for the TPC (see Jamie's talk)
  - might be the last big production before ITTF comes in (see Mike's talk)
- We need to fix TRS (the "loving owner" issue) a.s.a.p.
- Still problems with disks at RCF; need a solution soon.
- The ITTF evaluation will need some resources.
- Near future: need coordination of all slow simulators (simulation leader).

14 Preparation for pp Analysis
So far, reconstruction has been optimized for heavy ions, making use of:
- a very precise vertex
- lots of tracks per event for calibration
The pp spin program focuses on the lower end in N_ch:
- worse vertex resolution
- many small events (I/O vs. CPU)
- need to keep track of polarization
- handling of pile-up events
- new trigger configurations
From the pp software workshop on 11/19:
- Event vertex resolution: ppLMV has a sigma of about 2 cm in x, y, z. We need to do much better (new code, EVR?). How stable is the beam spot in x, y?
- The pp production chain needs to be finalized; currently big memory leaks kill the jobs after ~200 events.
- Trigger info: trigger scaler info in the Db, work in progress.
- Drift velocity calibration: the present algorithm is not useful for pp → rely more on laser runs.
All minutes and some (important) slides are posted on the reco pages: http://www.star.bnl.gov/STARAFS/comp/reco/

15 Summary and Outlook
We have plenty of data on tape:
- min bias: y2001 ≈ 10 × y2000
- central: y2001 ≈ 7 × y2000
Reach new physics:
- High-pt: h spectra and anisotropy out to 12 GeV/c; ratios at high pt (RICH)
- Strangeness: φ (width, mass-shifts); higher reach in pT for (multi-)strange baryons; Λ(1520), K*(1430), Σ(1385)
- HBT: 3D K-K, K-π with greater resolution; K-p, π-p, K0s-K0s; pt dependence of R(π)
- Spectra: light (anti-)nuclei up to He4; particle ratios
- EbyE: fluctuations, K* flow, K/π
… and much more …
Lots of work to do to get the analysis chain fully functional. Need more help on:
- TRS
- embedding production
- code development and maintenance
- the pp chain
First steps for analysis:
- normalized multiplicity distribution
- define common multiplicity classes
Future:
- ITTF
- reshape the chain (tables → StEvent)
- DAQ100

