
DOE/ER/40712

RELATIVISTIC HEAVY ION EXPERIMENTAL PHYSICS

THREE YEAR RENEWAL PROPOSAL

MARCH 2010 – MARCH 2013

Department of Physics and Astronomy

Vanderbilt University

Nashville, Tennessee 37235

Prepared for the United States Department of Energy Under

Grant No. DE-FG05-92ER40712

ABSTRACT (100 WORDS) GOES HERE.


1. INTRODUCTION

1.1. SUMMARY OF ACHIEVEMENTS IN THE LAST 3 YEARS

The Vanderbilt group leaders are longstanding members of the PHENIX collaboration at RHIC and have constructed major components of the baseline hardware and software systems. The senior members of the group were all active in the analysis teams which produced the first PHENIX results and publications indicating that a new state of matter had been seen at RHIC. Over the past three years, the present group at Vanderbilt has achieved several milestones related to their work on PHENIX:

We made major contributions to seven PHENIX peer-reviewed publications, including papers on elliptic flow of inclusive hadrons and on identified hadron production and suppression; several additional publications are in preparation, as described below.

In 2007-2008 we led the successful commissioning of the Time-of-Flight West detector (TOF.W) for PHENIX and the development of the associated calibration, monitoring, simulation and analysis software. The detector achieved 70 ps timing resolution, exceeding its design specification. This superb performance has enabled us to make unique measurements of identified π±, p, p̄, d, d̄, Λ, Λ̄ spectra and of the anisotropic flow of inclusive and identified hadrons, out to high pT. The results were presented at the Quark Matter 2008 conference in three contributed talks and one poster, by Dr. Huang, Dr. Issah, Mr. Valle and Mr. Belmont; at the Quark Matter 2009 conference we had a contributed talk by Mr. Belmont and a poster by Mr. Roach. Results have also been presented at the Nuclear Dynamics Winter Workshop 2008, Hot Quarks 2008, Strange Quark Matter 2008, the 2008 Division of Particles and Fields annual meeting, the European Physical Society HEP 2009 annual meeting, and the Strange Quark Matter 2009 conference.

We continued the successful operation and maintenance of the PHENIX Pad Chambers and made several upgrades to the associated calibration and monitoring software. In 2009 we successfully performed the technically difficult task of replacing one of the PC1 modules.

We produced simulated data in support of data analyses for the Quark Matter 2008 and 2009 conferences using the Vanderbilt ACCRE computing facility.1 In 2007 we reconstructed 30 TB of Au+Au minimum bias data in near real-time, receiving the raw data via GridFTP from the PHENIX counting house, and then transferring 20 TB of output to RACF for use by PHENIX analyzers while Run 7 was still in progress.

1 Advanced Computing Center for Research and Education, http://www.accre.vanderbilt.edu. The ACCRE facility, presently with 1500 CPUs, was funded in 2004 with an $8.7M capital grant from the Vanderbilt University Provost's office. Since then, there has been approximately $3M in additional external funding, and the facility is expected to grow to at least 2700 CPUs in the next 12 months.

Two of our graduate students, Mr. Belmont and Mr. Roach, served as deputy production managers for the PHENIX Run 7 real-data production. Mr. Belmont continued in this role for the Run 8 data production.

Members of the group have served as conveners of the PHENIX hadron physics working group (Prof. Greene: 2007-2009; Dr. Huang: 2009-ongoing) and of the hard-scattering physics working group (Prof. Velkovska: 2009-ongoing).

In 2006, the Vanderbilt group formally joined the CMS heavy ion collaboration, and DOE approved two of our faculty members for MoA status starting in January 2010. This approval recognizes the Vanderbilt group as taking leadership roles in three important areas that will prepare the CMS detector to record, analyze and publish heavy ion collision data in a new regime of ultra-relativistic energies. Our CMS responsibilities include:

The physics of correlations and flow: Prof. Velkovska is the head of the Correlations and Flow Physics Interest Group in CMS-HI, and coordinates the preparation of several data analyses which are expected to produce the first physics publications from the 2010 LHC HI data set.

Offline computing: On behalf of the US-CMS-HI institutions, Prof. Maguire proposed to DOE to locate a computing center at the Vanderbilt ACCRE facility that will perform raw data reconstruction and data analyses. A series of software and network tests were carried out to assure that we are adequately prepared for such a task.

Data Quality Monitoring: Prof. Velkovska spent a year-long sabbatical leave at CERN in the 2008-2009 academic year as part of her responsibility to develop the DQM software needed to ensure data quality during the HI running at the LHC.

1.2. OVERVIEW OF PROPOSED WORK

For the RHIC effort, we are finalizing the results and preparing the text for several papers: v2 and v4 of identified charged hadrons, source evolution from d and d̄ measurements, and spectra and flow of π±, p, p̄, Λ, Λ̄ identified to high pT. We also plan to analyze the identified hadron spectra from 200 GeV d+Au and p+p collisions (Run 8, Run 9) and the Au+Au data from the low energy scan planned for Run 10. We will continue our support for the TOF.W and Pad Chamber detector subsystems.

For CMS, we propose to carry out the responsibilities in DQM (Velkovska) and offline computing (Maguire) that we assumed during the last funding cycle. Using Vanderbilt


University research funds, Professor Greene will supervise the planning and construction of a "CMS Center" mini-control room in the Physics building at Vanderbilt, which will permit live-time connectivity with the main control room at CERN as we perform our CMS-HI missions for DQM and for offline data processing. We will also work to develop data analyses specific to the Pb+Pb HI program, and the simulations needed for these analyses, with an emphasis on the physics of correlations and flow, where we have assumed a leadership role. The first Pb+Pb run is scheduled to occur in November 2010 and is expected to last four weeks. The projected data sample from this first run is ~24 M minimum bias events, recorded with full azimuthal acceptance and over a broad rapidity range. This data sample will allow for a fairly complete characterization of the global properties of the system and will be a significant step towards studies involving hard probes. Beam operations at the LHC, including one month of HI running, will resume in 2012, using the High Level Trigger to enhance the collection of rare signal events.

1.3. LIST OF PERSONNEL

To carry out the proposed research program we expect to operate at our usual strength of three faculty members, three research associates and four graduate students. The tenured faculty members are Maguire (80% CMS, 20% PHENIX), Greene (50%, 50%), and Velkovska (50%, 50%). The current research associates are Huang (100% PHENIX), Issah (50%, 50%), and the newest addition to our group, joining in November 2009, Pelin Kurt (100% CMS). The current RA graduate students are Belmont and Roach (100% PHENIX, 4th year) and Appelt (100% CMS, 2nd year). Belmont and Roach should graduate in the next grant cycle, and we intend to have their replacements continue to work in PHENIX. We also have two graduate students (Snook, 2nd year, and Tuo, 1st year) currently working in the group but not on RA status. We expect to bring one of these two students onto RA status in the first year of the next grant cycle, to work 100% on CMS.

2. PAST ACCOMPLISHMENTS AND WORK IN PROGRESS

2.1. PHENIX

Data from the Relativistic Heavy Ion Collider suggest that heavy ion collisions at √sNN = 200 GeV form an equilibrated non-hadronic system. There is strong evidence that this dense medium is almost opaque to fast partons and is highly interactive, perhaps best described as a quark-gluon liquid with very small viscosity. Presently, the RHIC experimental program focuses on the detailed characterization of the produced medium and its properties. Key questions remain to be answered, related to the mechanisms responsible for the rapid thermalization of the produced system, the equation of state and the early dynamics, dissipative effects and their role in the system evolution, the effective degrees of freedom and the interactions between them, and the hadronization mechanism. Many of these questions can be addressed by the study of identified particle spectra and elliptic flow. The RHI group at Vanderbilt has developed expertise in the analysis of identified π, K, p, d, φ, and Λ. Within PHENIX, we have actively contributed to establishing reliable methods to extract spectral shapes,


integrated yields, elliptic flow and analysis of the data using hydrodynamics parameterizations. Members of the group have played a leading role in all PHENIX publications related to these topics. In Table 1 we summarize the physics analysis topics related to our RHIC work which were part of our previous proposal, together with their present status. The key results from each analysis are discussed in Sections 2.1.1 through 2.1.6, in the order of their appearance in Table 1.

Analysis topic: Elliptic flow of inclusive charged hadrons as a function of centrality and pseudo-rapidity using the cumulant method, in comparison to results from the reaction plane method.
Data set: Run 4: Au+Au √sNN = 200 GeV
Status in September 2009: Talks at QM'08, DPF'09. Published in Phys. Rev. C 80, 024909 (2009).
People: Issah, Velkovska, Greene

Analysis topic: v2 and v4 of identified hadrons using the upgraded TOF.West and reaction plane detectors.
Data set: Run 7: Au+Au √sNN = 200 GeV
Status in September 2009: Talks at QM'08, SQM'08. Analysis complete; paper preparation group formed. Draft expected in collaboration in December 2009.
People: Huang, Velkovska

Analysis topic: High-pT v2 of identified π, K, p using the TOF.West, aerogel and reaction plane detectors. Breaking of the constituent quark scaling.
Data set: Run 7: Au+Au √sNN = 200 GeV
Status in September 2009: Merged with the previous topic into one longer paper.
People: Huang, Velkovska

Analysis topic: High-pT spectra, ratios and nuclear modification factors for deuterons and anti-deuterons using TOF.West.
Data set: Run 7: Au+Au √sNN = 200 GeV
Status in September 2009: Talks at QM'08 and Hot Quarks 08. Paper preparation group formation in progress. First draft expected in collaboration by Feb 2010.
People: Valle, Belmont, Velkovska

Analysis topic: High-pT spectra, ratios and nuclear modification factors for Λ and anti-Λ.
Data set: Run 7: Au+Au √sNN = 200 GeV
Status in September 2009: Poster at QM'09. Simulations using HYDJET performed. Preliminary results to be obtained by March 2010.
People: Roach, Maguire, Greene

Analysis topic: High-pT spectra, ratios and RAA, RdA for π, K, p using TOF.West and aerogel.
Data set: √sNN = 200 GeV. Run 7: Au+Au; Run 8: d+Au; Runs 8 and 9: p+p.
Status in September 2009: Spectra in Au+Au, ratios and RCP presented at QM'09. RAA and RdA to be completed in the next grant cycle.
People: Belmont, Velkovska

Analysis topic: Φ meson RAA and RdAu out to high pT.
Data set: √sNN = 200 GeV. Run 4: Au+Au; Run 3: d+Au; Run 5: pp and Cu+Cu.
Status in September 2009: Paper in collaboration review.
People: Velkovska

Table 1 Physics projects in PHENIX, data sets used for the analysis, present status and group personnel involved.

2.1.1. ELLIPTIC FLOW OF INCLUSIVE CHARGED HADRONS

Elliptic flow at low transverse momentum is a collective effect governed by the pressure gradients in the hydrodynamically expanding system. At high pT, where particle production primarily originates from hard scattering between the incoming partons, the azimuthal asymmetry is generated by jet quenching, i.e., there is stronger absorption along the long axis of the reaction zone than along the short axis. It is important to disentangle these two contributions to elliptic flow, but this is not straightforward experimentally. One possible experimental handle is to measure the azimuthal asymmetry using two different methods that have different sensitivity to the "flow" generated by jets. Comparisons between the azimuthal anisotropy obtained from the reaction plane and cumulant methods were presented at the QM 2008 and DPF 2009 conferences by Dr. Issah. During the last grant period we completed this analysis and published an archival paper on the subject.2 The paper contains a detailed description of the analysis methods, as well as numerous physics results presented in 24 figures. The main results from this paper are shown below in Figure 1. The elliptic flow measured with respect to the reaction plane determined at beam rapidity (ZDC-SMD) or at forward rapidity (BBC) and the flow from the 2-particle cumulant method agree out to pT ~ 3 GeV/c and differ at large pT, where the jet contribution is significant.

2 Phys. Rev. C 80, 024909 (2009)
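As a schematic illustration of the two methods compared above (this is not PHENIX software; the multiplicity, flow value and event count are arbitrary illustrative choices), the following sketch generates toy events with a known elliptic-flow signal and extracts v2 both with respect to the (here exactly known) reaction plane and from two-particle azimuthal correlations:

```python
# Toy illustration (not PHENIX code) of two ways to extract elliptic flow.
import math
import random

def make_event(n, v2, rng):
    """Sample n azimuthal angles from dN/dphi ~ 1 + 2*v2*cos(2(phi - Psi_RP))."""
    psi_rp = rng.uniform(0.0, math.pi)
    phis = []
    while len(phis) < n:  # accept-reject sampling
        phi = rng.uniform(0.0, 2.0 * math.pi)
        if rng.uniform(0.0, 1.0 + 2.0 * v2) < 1.0 + 2.0 * v2 * math.cos(2.0 * (phi - psi_rp)):
            phis.append(phi)
    return phis, psi_rp

def v2_reaction_plane(events):
    """<cos 2(phi - Psi_RP)>.  In real data Psi_RP must be estimated
    (e.g. by the BBC or ZDC-SMD) and corrected for its resolution."""
    s, n = 0.0, 0
    for phis, psi_rp in events:
        for phi in phis:
            s += math.cos(2.0 * (phi - psi_rp))
            n += 1
    return s / n

def v2_two_particle_cumulant(events):
    """v2{2} = sqrt(<<cos 2(phi_i - phi_j)>>) over distinct pairs per event.
    In real data this estimator also picks up non-flow pair correlations
    (e.g. from jets), which is what makes the method comparison useful."""
    s, n = 0.0, 0
    for phis, _ in events:
        for i in range(len(phis)):
            for j in range(i + 1, len(phis)):
                s += math.cos(2.0 * (phis[i] - phis[j]))
                n += 1
    return math.sqrt(max(s / n, 0.0))

rng = random.Random(42)
events = [make_event(60, 0.06, rng) for _ in range(1000)]
# With pure flow (no jets), both estimates agree with the input v2 = 0.06
# within statistics; non-flow would raise the cumulant value at high pT.
print(v2_reaction_plane(events), v2_two_particle_cumulant(events))
```

In the real analysis the two estimators differ at high pT because of the jet ("non-flow") contribution, as shown in Figure 1.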


Figure 1 Charged hadron v2 as a function of pT for 0-60% centrality bins in Au+Au collisions at √sNN = 200 GeV from the two-particle cumulant method (open circles), the BBC event plane (open diamonds) and the ZDC-SMD event plane (solid circles). The differences can be attributed to non-flow effects.

2.1.2. IDENTIFIED CHARGED HADRON V2 AND V4 MEASUREMENTS IN AU+AU COLLISIONS AT √sNN = 200 GEV

In order to connect the thermodynamic properties of QCD matter described by lattice calculations to experiment, one needs to take into account the dynamical effects and the finite size of the nuclear collision system. Since these effects are not amenable to first-principles calculations, the comparison with the data relies on phenomenology. The phenomenological descriptions use microscopic transport models to describe the initial state of the collision and macroscopic (hydrodynamics) transport to model the collective dynamics. The hydrodynamics calculations assume that local thermal equilibrium is achieved and maintained at all stages, and that the system evolves according to an equation of state (EoS), which is an input parameter. The validity of the model calculations depends critically on these inputs. The successful description of the elliptic flow data has led to the conclusions that ideal non-viscous hydrodynamics is applicable to describe the system evolution at RHIC, that thermalization is achieved on a surprisingly short timescale (~0.6 fm/c), and that a QGP EoS is needed to describe the data.

Establishing thermalization is also an important prerequisite for a state of matter to be well defined. Thus, an experimental investigation of the degree of thermalization is essential for our understanding of high energy-density QCD matter. The thermalization mechanism itself cannot be addressed by hydrodynamic calculations; one has to invoke a microscopic description of the system. Experimentally, one can study observables that are sensitive to the degree of thermalization. The ratio of different-order harmonics of the azimuthal distribution of particle emission, v4/(v2)2, carries such information. Ideal hydrodynamics predicts that this ratio reaches a minimum of ~1/2 when the hydrodynamic limit is reached. For Run 7 of RHIC, the PHENIX experiment was upgraded with an improved reaction plane detector, which achieved a resolution of 70%. With the improved PID from the TOF.W and aerogel detectors, and the large data sample (over 5.5 billion minimum bias 200 GeV Au+Au events), we were able to extend previous identified hadron v2 and v4 measurements into a region of transverse momenta which had not yet been explored. Dr. Huang presented these results at the QM 2008, SQM 2008 and SQM 2009 conferences. We find that for π, K, p the ratio v4/(v2)2 has a value of ~0.8 (Figure 2), which is different from the simple hydrodynamics prediction. To reconcile the hydrodynamic model with the measurement, one needs to include the assumption of partonic flow and hadronization by quark recombination. This result supports several important findings at RHIC: the rapid thermalization of the system, the small viscosity, and the partonic nature of the produced liquid. Since our v2 measurements extend to high pT, we are able to test the range of validity of the quark recombination models. We have discovered the breaking of the constituent quark scaling above a transverse kinetic energy per quark of about 1 GeV. These two important results are the topic of a PHENIX paper currently in preparation. In the past year we have completed a detailed evaluation of systematic uncertainties and have confirmed our preliminary results. At present, the analysis is completed and the paper is being drafted, with expected release for collaboration review before the end of the calendar year.
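The observable itself can be illustrated with a toy simulation (not our analysis code; the input harmonics below are arbitrary values chosen so that the ratio equals 0.8, and the event-plane angle is taken as exactly known):

```python
# Toy illustration of the v4/(v2)^2 observable (not PHENIX code).
import math
import random

def sample_phi(v2, v4, psi, rng):
    """Sample phi from dN/dphi ~ 1 + 2 v2 cos2(phi-psi) + 2 v4 cos4(phi-psi)."""
    ceiling = 1.0 + 2.0 * abs(v2) + 2.0 * abs(v4)
    while True:  # accept-reject sampling
        phi = rng.uniform(0.0, 2.0 * math.pi)
        w = (1.0 + 2.0 * v2 * math.cos(2.0 * (phi - psi))
                 + 2.0 * v4 * math.cos(4.0 * (phi - psi)))
        if rng.uniform(0.0, ceiling) < w:
            return phi

rng = random.Random(1)
v2_in, v4_in = 0.10, 0.008      # placeholder inputs with v4/(v2)^2 = 0.8
psi = 0.0                       # event-plane angle, fixed for simplicity
phis = [sample_phi(v2_in, v4_in, psi, rng) for _ in range(400000)]

# Measure the harmonics with respect to the (known) plane and form the ratio.
v2 = sum(math.cos(2.0 * (p - psi)) for p in phis) / len(phis)
v4 = sum(math.cos(4.0 * (p - psi)) for p in phis) / len(phis)
ratio = v4 / v2 ** 2
# Ideal hydrodynamics predicts ratio -> ~1/2; the data discussed above give ~0.8.
print(ratio)
```

The higher harmonic v4 is an order of magnitude smaller than v2, which is why the improved Run 7 reaction plane resolution was essential for this measurement.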

2.1.4. HIGH-PT SPECTRA, RATIOS AND NUCLEAR MODIFICATION FACTORS FOR DEUTERONS AND ANTI-DEUTERONS MEASURED USING TOF.WEST

We have studied deuteron and anti-deuteron production out to high pT using the TOF.W detector. This work formed the Ph.D. thesis of graduate student Hugo Valle. The main results of the thesis are the study of the spectra, the coalescence parameter B2, the nuclear modification factor RCP, the mean pT and integrated yields, and a comparison of the spectra with the hydrodynamics model. We found that the nuclear modification factor (shown in Figure 4) for deuterons exceeds the ratio expected from hard scattering by a significant amount. This behavior can be understood in terms of the recombination of neutrons and protons, which has a much higher probability in central A+A collisions compared to peripheral collisions. These results were presented in an oral presentation and a poster at the Quark Matter 2008 conference.3

3 J. Physics G: Nucl. Part. Phys. 35 (2008) 104048


Figure 2 Hexadecapole moment (v4) for π, K, p measured as a function of pT, compared to squared elliptic flow (v2)2 scaled by a factor justified by expectations for partonic flow. The ratio v4/(v2)2 is shown in the lower panel.

Figure 3 Elliptic flow for identified π, K, p as a function of pT (left) and as a function of transverse kinetic energy per quark (right) measured in 200 GeV Au+Au collisions for centrality 0-60%. The quark number scaling is violated for KET/nq > 1 GeV.


Figure 4 RCP for deuterons and anti-deuterons.

Figure 5 Transverse momentum spectra for π, K, p, d measured in 0-10% Au+Au collisions at √sNN = 200 GeV compared to a hydrodynamics model (lines). The deuteron spectra require a lower flow velocity and higher temperature at freeze-out.

The measurement of the coalescence parameter B2 as a function of centrality and pT allows us to extract the source radius at freeze-out. We find that the radius increases linearly with Npart^(1/3), which indicates that the freeze-out happens at constant particle density. These results are consistent with the radii obtained from HBT measurements involving identified pions.
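The relations underlying this extraction can be sketched as follows. In the coalescence picture the invariant deuteron yield equals B2 times the square of the proton yield at half the deuteron pT, and B2 is inversely proportional to the source volume, so an effective radius R ∝ B2^(-1/3) can be defined; constant freeze-out density then implies R grows linearly with Npart^(1/3). The numbers below are purely hypothetical placeholders, not our measured values:

```python
# Schematic coalescence relations (hypothetical values, not PHENIX numbers).
def b2(deuteron_yield, proton_yield):
    """B2 from E_d dN/d^3p_d = B2 * (E_p dN/d^3p_p)^2, with p_T(p) = p_T(d)/2."""
    return deuteron_yield / proton_yield ** 2

def effective_radius(b2_value, scale=1.0):
    """B2 ~ 1/volume, so R ~ B2^(-1/3); `scale` absorbs the model-dependent
    proportionality constant."""
    return scale * b2_value ** (-1.0 / 3.0)

# Constant freeze-out density means volume ~ Npart, i.e. B2 ~ 1/Npart.
# With hypothetical B2 values scaling that way, R/Npart^(1/3) is constant:
for npart in (50, 100, 200, 350):
    r = effective_radius(1.0 / npart)            # hypothetical B2 ~ 1/Npart
    print(npart, r / npart ** (1.0 / 3.0))       # constant ratio -> linear rise
```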

As part of our comprehensive study of deuteron and anti-deuteron production, we have compared the pT spectra to a hydrodynamics parameterization (blast-wave model). The results of this study are quite surprising (Figure 5). We find that the deuterons and anti-deuterons are well described by a hydrodynamics prediction; however, the spectra require freeze-out parameters which are different from those of the produced particles (π, K, p). The deuterons appear to freeze out earlier than the produced hadrons, with higher temperature and lower transverse flow velocity. The deuteron spectra are very sensitive to the freeze-out dynamics, and a detailed theoretical modeling of the emission process is required in order to reproduce all available hadron data simultaneously. This is one of the outstanding deficiencies in hydrodynamics models which has been previously revealed in the comparison with HBT data. Our result provides an additional constraint on the theoretical description of the emission process, the equation of state, and the order of the phase transition.
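A minimal stand-alone sketch of the standard blast-wave spectrum (the Schnedermann-Sollfrank-Heinz form with a linear velocity profile) illustrates the fit function; this is not our fit code, and the freeze-out temperature and surface velocity below are placeholder values, not the fitted parameters:

```python
# Minimal blast-wave spectrum sketch (standard form; placeholder parameters).
import math

def bessel_i0(x):
    """Modified Bessel I0 via I0(x) = (1/pi) * int_0^pi exp(x cos t) dt."""
    n = 200
    return sum(math.exp(x * math.cos(math.pi * (k + 0.5) / n)) for k in range(n)) / n

def bessel_k1(x):
    """Modified Bessel K1 via K1(x) = int_0^inf exp(-x cosh t) cosh t dt."""
    n, t_max = 2000, 12.0
    dt = t_max / n
    return sum(math.exp(-x * math.cosh((k + 0.5) * dt)) * math.cosh((k + 0.5) * dt)
               for k in range(n)) * dt

def blast_wave(pt, mass, t_fo=0.12, beta_s=0.7, n_r=50):
    """dN/(pT dpT), up to normalization: integral over the freeze-out surface of
    r * mT * I0(pT sinh(rho)/T) * K1(mT cosh(rho)/T), rho = atanh(beta_s * r/R).
    t_fo (GeV) and beta_s are placeholder freeze-out parameters."""
    mt = math.sqrt(mass * mass + pt * pt)
    total = 0.0
    for k in range(n_r):
        r = (k + 0.5) / n_r                      # r/R in (0, 1)
        rho = math.atanh(beta_s * r)             # boost rapidity of the shell
        total += (r * mt * bessel_i0(pt * math.sinh(rho) / t_fo)
                         * bessel_k1(mt * math.cosh(rho) / t_fo))
    return total / n_r
```

With a common temperature and flow velocity, the heavier deuteron spectrum comes out flatter than the pion spectrum, which is the radial-flow effect that makes the deuteron data such a sensitive probe of the freeze-out parameters.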

The final analysis checks that are needed in order to proceed with a publication are related to occupancy effects in the TOF.W detector for the most central Au+Au collisions. To correct for these, we have deployed new TOF.W software that embeds simulated tracks into real events of different centralities and subsequently evaluates the reconstruction efficiency as a function of centrality.
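The bookkeeping of such an embedding study can be sketched generically (this is a schematic, not the actual TOF.W embedding code, and the counts below are hypothetical): flagged simulated tracks are embedded into real events, the events are re-reconstructed, and the per-centrality efficiency is the recovered fraction.

```python
# Schematic embedding-efficiency calculation (illustrative, not TOF.W code).
import math
from collections import defaultdict

def embedding_efficiency(records):
    """records: iterable of (centrality_bin, n_embedded, n_recovered) tuples.
    Returns {centrality_bin: (efficiency, binomial_error)}."""
    emb = defaultdict(int)
    rec = defaultdict(int)
    for cent, n_embedded, n_recovered in records:
        emb[cent] += n_embedded
        rec[cent] += n_recovered
    out = {}
    for cent in sorted(emb):
        n, k = emb[cent], rec[cent]
        eff = k / n
        err = math.sqrt(eff * (1.0 - eff) / n)   # binomial uncertainty
        out[cent] = (eff, err)
    return out

# Hypothetical counts: detector occupancy lowers the efficiency in central events.
toy = [("0-10%", 10000, 8200), ("40-60%", 10000, 9400)]
for cent, (eff, err) in embedding_efficiency(toy).items():
    print(cent, round(eff, 3), round(err, 4))
```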


With this final milestone achieved, we are now focusing on the preparation of a comprehensive paper on deuteron and anti-deuteron production. This paper will be based on the results from the Ph.D. thesis of Hugo Valle and results presented by Mr. Belmont at the Hot Quarks 2008 workshop.4

2.1.5. HIGH-PT SPECTRA, RATIOS AND NUCLEAR MODIFICATION FACTORS FOR π, K, p, Λ AND ANTI-Λ USING TOF.WEST AND AEROGEL

A rich set of data on identified hadron spectra is available from RHIC for different collision systems and center of mass energies. The nuclear modification factors, RAA, have been measured out to pT = 20 GeV/c for π0, but only to pT ≤ 10 GeV/c for other identified hadrons. The suppression of leading hadrons at √sNN = 200, 130, and down to 62 GeV is well established experimentally. The baryon/meson differences in the suppression that were first observed for protons, anti-protons and pions have been confirmed by systematic studies involving a variety of particle species. Heavy mesons (η, φ, K*) were found to follow the π0 suppression, while baryons (Λ, Ξ, Ω) are less suppressed at intermediate pT. These findings rule out models in which the different behavior is expected based on the particle mass (i.e., baryon enhancement due to hydrodynamic flow, or formation-time effects that delay hadronization for lighter particles). There are several reasons why new precise data on identified hadron spectra and nuclear modification factors are needed both at intermediate and high pT. The production mechanism of baryons and mesons in the intermediate-pT region has been attributed to recombination of quarks from the QGP. However, there are strong jet-like correlations observed with identified leading protons and anti-protons originating from 2.5 < pT < 4 GeV/c, where the baryon excess is observed in the single particle measurements.5 Thus, recombination has to include fragmentation partons at some level. To discriminate between different models, it is necessary to consider data for different particle species and to cover both intermediate pT, where fragmentation partons play a role but probably are not the dominant contributor to the yields, and high pT, where fragmentation dominates. In the last grant period we have been working on the study of the nuclear modification factors for protons, anti-protons, charged pions, deuterons, Λ and anti-Λ with the high statistics data sample obtained from Run 7 of RHIC. Such measurements fit naturally with our responsibilities for the TOF.W detector and our previous physics analysis expertise. At the Quark Matter 2009 conference we presented new results on p and p̄ spectra and nuclear modification factors

4 Eur. Phys. J. C (2009) 62: 243-248
5 A. Adare et al., Phys. Lett. B 649: 359-369, 2007


(a parallel talk by Mr. Belmont)6, and first results on Λ and anti-Λ in a poster by Mr. Roach. Below, in Figure 6, we show the preliminary results on (anti)proton-to-pion ratios in the most central collisions in comparison to previously published results.7 The benefit of the improved PID from TOF.W is evident. In Figure 7 we show the central-to-peripheral ratios (RCP) for (anti)protons and charged pions from the same analysis, as well as published π0 results. We observe that the baryon/meson difference becomes smaller at high pT, as expected by recombination models.
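For reference, the central-to-peripheral ratio is the standard binary-collision-scaled yield ratio; a minimal sketch, in which the yields and ⟨Ncoll⟩ values are hypothetical placeholders rather than measured numbers:

```python
# R_CP sketch: binary-collision-scaled central-to-peripheral yield ratio.
def r_cp(yield_central, ncoll_central, yield_peripheral, ncoll_peripheral):
    """R_CP = (yield_c / <Ncoll>_c) / (yield_p / <Ncoll>_p).
    R_CP = 1 means pure hard-scattering (Ncoll) scaling; R_CP < 1 means
    suppression of the central yield relative to that expectation."""
    return (yield_central / ncoll_central) / (yield_peripheral / ncoll_peripheral)

# Placeholder numbers: a central yield suppressed by 5x relative to Ncoll scaling.
print(r_cp(yield_central=200.0, ncoll_central=1000.0,
           yield_peripheral=10.0, ncoll_peripheral=10.0))   # -> 0.2
```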

The analysis of Lambda baryon production is still work in progress. We have obtained clean Λ and anti-Λ mass peaks out to pT = 9 GeV/c in minimum bias and semi-peripheral Au+Au collisions (see e.g. Figure 8), but for the most central collisions we observed secondary peaks in the invariant mass distributions for pT > 6 GeV/c. To understand the nature of these peaks and extend the Λ PID to the highest possible transverse momentum, we have developed code to integrate a new Monte Carlo event generator, HYDJET++, into the PHENIX simulation framework, PISA. HYDJET++, which has also been designed to work at LHC energies (see Section 2.2), combines soft particle production and elliptic flow with a quenched hard-scattering simulation (PYQUEN). These simulation studies have helped us identify the problem that causes the secondary peaks in high-multiplicity Au+Au events: they stem from wrongly associated momentum and time-of-flight information when both products of the Lambda decay impinge on the same TOF.W chamber.8 A correction procedure is presently being developed. This study will allow us to reliably extract the Λ baryon yields up to pT ≥ 9 GeV/c, as demonstrated in Figure 9.
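The event-mixing technique used for the combinatorial background under the Λ peak (Figure 8) can be sketched generically; this is an illustration rather than the PHENIX analysis code, and the track lists below are placeholders:

```python
# Sketch of a mixed-event combinatorial background (generic illustration).
import math

M_P, M_PI = 0.9383, 0.1396   # proton and pion masses in GeV/c^2

def inv_mass(p1, p2, m1, m2):
    """Invariant mass of a two-track pair from Cartesian 3-momenta (GeV/c)."""
    e1 = math.sqrt(m1 * m1 + sum(c * c for c in p1))
    e2 = math.sqrt(m2 * m2 + sum(c * c for c in p2))
    px, py, pz = (a + b for a, b in zip(p1, p2))
    return math.sqrt(max((e1 + e2) ** 2 - (px * px + py * py + pz * pz), 0.0))

def mixed_event_background(events, n_mix=5):
    """Pair protons from one event with pions from n_mix *other* events.
    Such pairs cannot share a parent decay, so their invariant-mass
    distribution models the combinatorial background under the Lambda peak.
    Assumes n_mix < len(events) so an event is never mixed with itself."""
    assert 0 < n_mix < len(events)
    masses = []
    for i, event in enumerate(events):
        for k in range(1, n_mix + 1):
            other = events[(i + k) % len(events)]
            for p in event["protons"]:
                for pi in other["pions"]:
                    masses.append(inv_mass(p, pi, M_P, M_PI))
    return masses
```

In practice the mixed-event spectrum is normalized to the same-event spectrum away from the peak before subtraction.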

Figure 6 Ratios of the yields of protons to positive pions and antiprotons to negative pions as a function of transverse momentum, 0-10% centrality. The previously published results are shown with open symbols.

Figure 7 Ratio of yield in the most central collisions to the yield in the most peripheral collisions for positive and negative pions, protons, and antiprotons, with comparisons to previously published data

6 arXiv: http://arxiv.org/abs/0907.4828
7 Phys. Rev. Lett. 91, 172301 (2003)
8 The TOF.W detector consists of 128 multi-gap resistive plate chambers, each of which is 37 x 12 cm2.


Figure 8 Invariant mass distribution of protons identified in TOF.W and negative tracks (mostly pions). The blue histogram represents the combinatorial background obtained from an event-mixing technique.

Figure 9 Raw Λ yield as a function of Λ pT. These data are not yet corrected for reconstruction efficiency. The presented raw yields provide an estimate for the statistical reach of the final measurement.

2.1.6. NUCLEAR EFFECTS ON Φ-MESON PRODUCTION IN D+AU AND AU+AU COLLISIONS AT √sNN = 200 GEV

Our interests in the particle species dependence of hadron suppression at intermediate and high pT have led one of us (Prof. Velkovska) to participate in a paper preparation group for the nuclear effects on φ-meson production in d+Au, Au+Au and Cu+Cu collisions at √sNN = 200 GeV, revealed by comparison to production in pp collisions. Although the spectra analyses were performed elsewhere, Prof. Velkovska made significant contributions to the analysis of the nuclear modification factors, the interpretation of the results, and the preparation of the paper manuscript. Our group has previously analyzed and published9 results on φ-meson production from Run 2. These results have been extended to much higher transverse momentum (pT ≤ 7 GeV/c) using new analysis techniques and the higher integrated luminosity samples from Runs 3, 4, and 5. The production of φ-mesons is suppressed in central Au+Au and Cu+Cu collisions as compared to p+p results scaled by the number of nucleon-nucleon binary collisions. The behavior is qualitatively similar to that of other mesons (π0 and η), but a quantitative comparison shows that the φ-mesons are less suppressed than pions. The result can be understood in terms of recombination models10 which include both thermal

9 Phys. Rev. C 72, 014903 (2005)
10 Production of strange particles at intermediate pT at RHIC, Rudolph C. Hwa, C. B. Yang, arXiv:nucl-th/0602024


(Figure legend: 45-65% centrality, Au+Au √sNN = 200 GeV, Run 7.)


and shower parton recombination. The particle-species-dependent suppression is the key result of the paper, which is currently in collaboration review.

2.1.7. TIME-OF-FLIGHT WEST AND PAD CHAMBERS DETECTOR SUPPORT IN PHENIX

The physics program of the Vanderbilt group is focused on analyses that require particle identification (PID) and tracking. The group has contributed two detector subsystems to PHENIX: the Pad Chambers (PC), which provide tracking information, and the Time-of-Flight detector for the PHENIX West arm (TOF.W), which significantly enhanced the PHENIX PID capabilities. During the last grant period we have provided continuous hardware and software support for both subsystems. The TOF.W detector was successfully commissioned during Run 7 of RHIC. Expert shift coverage was provided exclusively by the Vanderbilt group (Huang, Valle, Love, Danchev, Belmont, and Velkovska). The detector has been in stable operation during Runs 7, 8, and 9, and it has performed with an intrinsic timing resolution of σ ~ 70 ps, which exceeds our design goal of σ ~ 100 ps. This performance was achieved with careful consideration of the operating conditions and with extensive calibrations. We have published a technical paper11 which describes the read-out electronics for the multi-gap resistive plate chambers. A second paper, focusing on the detector operations and calibrations, is presently under internal review and will be submitted for publication before the end of this grant period. The excellent performance of TOF.W has made possible many new physics analyses, some of which are discussed in this proposal.

The PC subsystem is part of the baseline design of the PHENIX detector. The three layers of Pad Chambers provide three-dimensional space points for charged-particle tracks traversing the central spectrometer arms. They form an essential part of the pattern recognition process and play a crucial role in rejecting background tracks at high transverse momentum. In addition to providing run-time expert shift coverage (Issah, Roach, Greene) in 2008 and 2009, we made several upgrades to the PC calibration and monitoring software to improve the speed and robustness of the code. In 2009 we successfully performed the technically difficult task of replacing one of the PC1 modules.

2.2. ACTIVITIES IN THE CMS EXPERIMENT AT THE LHC

The Vanderbilt group formally joined CMS in 2006. During the last grant period we have been gradually building up our CMS effort. Presently we have six group members working part or full time on CMS (3 faculty, 1 post-doc, and 2 students). A new post-doc, who is joining the group in November 2009, will work full time on CMS. Within the CMS heavy ion group we have

11 Nucl. Instrum. Meth. A 596 (2008) 430-433

assumed leadership roles in three major areas: offline computing, data quality monitoring, and the physics of correlations and flow. On behalf of the US-CMS-HI institutions, Prof. Maguire presented a proposal to the DOE on May 11, 2009 for locating a computing center at the Vanderbilt ACCRE facility that will perform raw data reconstruction and carry out data analyses. He was also in charge of organizing a comprehensive workshop for the benefit of CMS computing management to discuss the CMS-HI data processing plans. This workshop, held on September 12, 2009 at the end of a CMS Week meeting in Bologna, coincided with the release of the review report on the May 11 proposal to the DOE. The DOE review report encourages the US-CMS-HI institutions to submit an updated proposal by the end of 2009, which will put into service the required data reconstruction and analysis facility at Vanderbilt. Following the Bologna workshop, CMS computing management is firmly committed to supporting such a proposal. In preparation for the computing proposal and the Bologna workshop, graduate student Mr. Appelt has been working with ACCRE staff to help test our high-speed network connections as well as the reconstruction process for heavy ion events using the CMS software framework (CMSSW). The specific goal with reconstruction has been to demonstrate the ability of the Vanderbilt facility to generate and reconstruct large numbers of events, as well as to determine the processor time, memory footprint, and file size of the reconstructed events. During the summer of 2009, graduate student Mr. Snook joined this effort, working together with Mr. Appelt. After only several months with the group, Mr. Appelt has become well versed in the CMS software. He takes an active part in the CMS-HI software taskforce and pioneered the porting of the CMS analysis software to reconstructed heavy ion events.
He is currently responsible for the documentation of the heavy ion analysis framework. In the area of data quality monitoring (DQM), Prof. Velkovska spent a year-long sabbatical leave at CERN in the 2008-2009 academic year as part of her responsibility to develop the DQM software needed to ensure data quality during the HI running at the LHC. This sabbatical leave was jointly supported by Vanderbilt and by a research fellowship from the Alfred P. Sloan Foundation. Prof. Velkovska took DQM shifts (both online and offline) during the cosmic ray data taking and the subsequent calibration and full reconstruction in November 2008 and in March 2009. She also organized the HI group for a sustained effort in this area. Prof. Velkovska is also the head of the Correlations and Flow Physics Interest Group in CMS-HI, which is expected to produce the first physics publications from the 2010 LHC HI data set. In May 2009 the physics readiness of the HI group was reviewed by a CMS-appointed committee with both internal and external members. In preparation for the review, the correlations and flow physics interest group12 prepared a plan for the first two physics results to be published by this group:

12 Five physics interest groups were formed within the HI group in February 2009: spectra, correlations and flow, high-pT, di-leptons, and forward physics. Each group is led by one convener.


- Elliptic flow of inclusive charged hadrons as a function of pT and pseudorapidity, measured using the reaction plane, cumulant, and Lee-Yang zeros methods in Pb+Pb collisions
- Di-hadron correlations for inclusive charged hadrons in pp and Pb+Pb collisions

Using HYDJET simulations, the flow group has developed methods for reaction plane determination using several different detector subsystems: the Si tracker, the electromagnetic calorimeter (ECAL), or the hadronic calorimeter (HCAL). The reaction plane method is pursued by the groups at Moscow State University and the University of Kansas. The cumulant flow analysis is being developed by the Vanderbilt group. The di-hadron correlation analyses are also well under way and are being pursued at MIT and the University of Illinois at Chicago. Prof. Velkovska organizes regular bi-weekly meetings to provide a forum for discussion and to guide the progress of the analyses.

3. PROPOSED WORK FOR THE NEXT 3 YEARS

During the next three years the group will continue participating in both the PHENIX experiment at RHIC and the CMS experiment at the LHC, splitting personnel and effort roughly 50-50 between the two experiments. We believe that our work on PHENIX is of continuing importance both to the collaboration and to our physics program, while taking part in the exploration of high-temperature, high-energy-density QCD matter at the LHC presents a logical and exciting future for our research program. Our proposed work at the LHC is fully integrated into the current CMS-HI Research Management Plan (RMP), which is being submitted to the DOE.

3.1. PHENIX

Our physics goals for the next three years with the PHENIX detector are to further our studies of anisotropic flow and to push our measurements of identified-particle spectra and nuclear modification factors out to higher transverse momenta. We will do this for a number of particle species, including π±, K±, p and p̄, taking full advantage of the TOF.W detector's capabilities. We will also use data from the planned energy scan to probe the onset of critical properties and to search for evidence of the critical point. In the sections below, we give detailed descriptions of our PHENIX physics plan.


Furthermore, Professor Velkovska is serving as convenor of the hard scattering physics working group and Dr. Huang is convenor of the hadron physics working group; both terms will run until spring of 2011. We will continue to fully support the Pad Chamber and TOF.W subsystems, maintaining hardware, providing experts to ensure good operation during upcoming runs at RHIC, and updating calibration and analysis software as needed. Professors Velkovska and Greene will continue to serve on the PHENIX Detector Council in their roles as subsystem managers.

3.2.1. IDENTIFIED PARTICLES AT √sNN = 200 GEV: NEXT STEPS, MILESTONES

There are several ongoing analyses involving identified particles produced in √sNN = 200 GeV Au+Au collisions. The anisotropic flow analysis (v2 and v4) described in Section 2.1.3 is already completed and a paper preparation group has been formed. The personnel involved from the Vanderbilt group are Dr. Huang, who is the main person performing the analysis, and Prof. Velkovska, who will help with the paper preparation. The goal is to submit the results for publication before the end of the current grant period. We expect that some revisions may be needed in the course of the next grant period in response to referee reports. The milestone for this analysis is the publication of the paper.

The analysis of high-pT spectra and nuclear modification factors for π±, K±, p and p̄ (Section 2.1.5) is the basis of the Ph.D. thesis of graduate student Ron Belmont. During the last grant period he has completed the spectra analysis for pions and protons from Au+Au collisions at √sNN = 200 GeV using the TOF.W and the aerogel detectors. The absolutely normalized spectra were approved as PHENIX Preliminary up to pT = 5 GeV/c and presented in a parallel talk at Quark Matter 2009. Mr. Belmont is presently completing a study of occupancy effects in the high-multiplicity central events using simulated events embedded in real data. For the preliminary data approval these corrections were estimated from real data alone. The next step in this analysis involves extending the PID to higher pT by including information from the ring-imaging Cherenkov counter (RICH). Above pT ≈ 4.5 GeV/c, pions traversing the RICH gas generate Cherenkov light, and this signal can be used in conjunction with the PID from the aerogel and TOF.W detectors for pion/kaon separation. After the Au+Au analysis is completed, Mr. Belmont will apply his analysis method to pp and d+Au data obtained in Run 8 and Run 9. With these data in hand he will be able to construct the nuclear modification factors RAA and RdA and thus study the hot and cold nuclear matter effects on π±, K±, p and p̄ production, as well as the hadronization mechanisms in the produced system. There are


several milestones to be achieved before the Ph.D. thesis of Mr. Belmont is completed. These are listed in Table 2 along with their target completion dates.

Milestone | Target date | Personnel
Finalizing the occupancy corrections in Au+Au collisions | end of the current grant period | Belmont, Velkovska
Extending PID to pT > 5 GeV/c, including TOF.W, aerogel and RICH | March 2010 | Belmont, Huang
Obtaining raw K± spectra; producing simulation files for efficiency corrections; obtaining fully corrected charged kaon spectra | May 2010 | Belmont
Calibrations needed for Run 8 data analysis | July 2010 | Belmont, Huang
Analysis of Run 8 d+Au and pp data, RAA and RdA | September 2010 | Belmont
Studies to determine systematic uncertainties in Au+Au, d+Au and pp spectra | November 2010 | Belmont, Velkovska
Submission of results for publication | January 2011 | Belmont, Velkovska
Preparation of Ph.D. thesis and defense | May 2011 | Belmont

Table 2: High-pT spectra and RAA, RdA for π±, K±, p and p̄. Milestones en route to publication, target dates and personnel involved.

The analysis of the Λ and Λ̄ spectra and elliptic flow out to high pT is the Ph.D. thesis topic for graduate student Dillon Roach. He has already made significant progress toward obtaining the transverse momentum spectra in Au+Au collisions at √sNN = 200 GeV (see Section 2.1.5). An occupancy effect that was hindering the Λ and Λ̄ identification at high pT (above 6 GeV/c) has been investigated and solved using the HYDJET Monte Carlo event generator and the full simulation of the PHENIX detector. The next steps for this analysis include obtaining efficiency corrections as a function of centrality and fully corrected spectra. In order to construct the nuclear


modification factor RAA out to high pT, we will analyze the pp collision data from Run 8 and Run 9. We will also study elliptic flow in the Au+Au collision system as a function of centrality. The study of the Λ baryons is of particular interest for several reasons:

- Since we reconstruct the Λ baryons through the decay Λ → p + π, the Λ PID extends to higher momentum than the proton PID. Thus we will be able to go beyond the recombination region and test fragmentation in the medium and the jet-flavor dependence of suppression.
- In view of the recent findings of anomalous suppression of φ mesons, it is important to test the RAA of baryons containing strangeness out to high pT.
- In the elliptic flow data for identified pions and protons, we discovered that the quark number scaling of v2 breaks for transverse kinetic energy per quark > 1 GeV. It is likewise important to test the universality of this result using strange baryons.
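For reference, the nuclear modification factor used throughout these analyses compares the per-event Au+Au yield to the ⟨Ncoll⟩-scaled pp yield. A minimal sketch, with toy numbers rather than PHENIX data:

```python
# R_AA(pT) = (per-event Au+Au yield) / (<Ncoll> * p+p yield), bin by bin in pT.
# The yields and <Ncoll> below are illustrative toy numbers.
def r_aa(yield_aa, yield_pp, n_coll):
    """Bin-by-bin nuclear modification factor for two matched pT spectra."""
    return [ya / (n_coll * yp) for ya, yp in zip(yield_aa, yield_pp)]

# R_AA = 1 would mean Au+Au is a simple superposition of <Ncoll> pp collisions;
# values well below 1 at high pT indicate suppression in the medium.
print(r_aa([2.0, 0.2], [0.02, 0.004], 250.0))  # [0.4, 0.2]
```

R_dA is constructed identically from the d+Au yield, which is what allows hot-medium effects (Au+Au) to be separated from cold nuclear matter effects (d+Au).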

The milestones for the Ph.D. thesis of Dillon Roach are listed in Table 3 below.

Milestone | Target date | Personnel
Extending PID to pT > 6 GeV/c for central Au+Au collisions | end of the current grant period | Roach, Huang, Velkovska
Producing simulations and obtaining efficiency corrections | March 2010 | Roach, Maguire
Calibrations needed for Run 8 and Run 9 pp data analysis | May 2010 | Roach, Belmont, Huang
Analysis of Run 8 and Run 9 pp data, RAA of Λ and Λ̄ | July 2010 | Roach
Analysis of v2 of Λ and Λ̄ in Au+Au collisions at √sNN = 200 GeV | September 2010 | Roach, Huang
Studies to determine systematic uncertainties | November 2010 | Roach
Submission of results for publication | January 2011 | Roach, Greene
Preparation of Ph.D. thesis and defense | May 2011 | Roach

Table 3: High-pT spectra, RAA, and v2 for Λ and Λ̄. Milestones en route to publication, target dates and personnel involved.

3.2.2. PERSONNEL


3.2.3. ENERGY SCAN: π, K, p SPECTRA AND FLOW

A major component of the PHENIX physics program in the upcoming Run 10 is the low energy scan of Au+Au collisions, searching for the onset of perfect liquid properties via opacity and flow measurements, and for possible evidence of the QCD critical end point at modest baryochemical potential. Identified particle spectra are useful tools to characterize the hadron gas phase of the collision. Measurements of identified particle ratios, including the K/π and proton-to-antiproton ratios, are necessary to locate the system on the QCD phase diagram. In addition, the measurement of proton-to-antiproton ratios may serve as a direct signal for the presence of the critical point, if the QCD critical point acts as an attractor of hydrodynamic trajectories in the μB-T plane describing the expansion of the hot matter. The TOF.W detector, along with a new start-time detector with acceptance optimized for low-energy collisions, will provide measurements of identified particle spectra and ratios at collision energies below 17 GeV. From the hadron ratios and spectra, the freeze-out temperature and the radial flow of the hadrons, which reflects the expansion velocity, can be extracted. PHENIX proposes to record 350 million, 50 million, and 25 million minimum bias Au+Au collisions at √sNN = 62.4, 39, and 27 GeV, respectively.

Elliptic flow of identified hadrons will be measured to sufficiently high pT to identify the point where quark number scaling breaks at the two highest energies. The pT dependence of inclusive hadron elliptic flow will be measured at all three energies with sufficient precision to constrain the viscosity-to-entropy ratio by comparison to viscous hydrodynamics calculations. Equally importantly, the same data sets will be used to search for possible evidence of the QCD critical point in the lower part of the μB range predicted by various lattice calculations. This search will utilize v2, fluctuations, HBT correlations, and identified hadron yield observables.

We propose to analyze the identified particle spectra and v2 from these energy scan data sets. Dr. Huang will lead this work, possibly joined by a new graduate student; a milestone table for these analyses will be developed once the Run 10 data are in hand.

3.3. CMS

3.4. SERVICE: COMPUTING, DQM

3.4.1. PHYSICS

At RHIC, global characteristics of particle production have been found to exhibit many surprisingly simple scaling behaviors. Extrapolation to LHC energies (√sNN = 4 TeV) suggests that the heavy ion


program at CERN has significant potential for major discoveries. It is expected that in central Pb+Pb collisions the charged particle multiplicity at mid-rapidity will be of the order dNch/dη|y=0 ≈ 2000, with a corresponding initial energy density ε0 ~ 100 GeV/fm³ (estimated at initial time τ0 ≈ 0.3 fm/c). A fully equilibrated QGP at these energy densities will have an initial temperature of T0 ≈ (ε0/12)^(1/4) ≈ 500 MeV ~ 3 Tc, which is a factor of two higher than the initial temperature achieved in Au+Au collisions at √sNN = 200 GeV at RHIC. As a result, the QGP formed at the LHC is expected to be longer lived and to have a bigger volume. The theoretical predictions concerning the coupling in the QGP system vary dramatically, from a system that is very similar to that at RHIC to one which reaches the weakly coupled QGP phase. Clearly, it is up to experiment to discover the character and the properties of QCD matter in this new energy regime. An experimental advantage is that the higher collision energy will result in significantly enhanced cross-sections for hard-scattering processes, giving a much larger yield of hard probes with high mass and/or transverse momentum. This advantage, coupled with a detector that has excellent capabilities for measuring hard-probe signals as well as low-pT global observables with very large acceptance, provides opportunities for a rich experimental program.
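The arithmetic behind the T0 estimate can be checked directly. Note that the factor of 12 is consistent with the ideal-gas relation ε = (gπ²/30)T⁴ for an effective degeneracy g ≈ 37; that identification is our reading of the formula, not something stated in the text.

```python
# Check of the initial-temperature estimate T0 ≈ (ε0/12)^(1/4),
# converting ε in GeV/fm^3 to natural units with (ħc)^3.
HBARC = 0.1973  # GeV·fm

def initial_temperature(eps_gev_fm3: float) -> float:
    """Return T0 in GeV for an energy density eps given in GeV/fm^3."""
    t_fourth = (eps_gev_fm3 / 12.0) * HBARC ** 3  # T^4 in GeV^4
    return t_fourth ** 0.25

t0 = initial_temperature(100.0)  # ε0 ~ 100 GeV/fm^3 quoted for the LHC
print(round(t0 * 1000), "MeV")   # ≈ 503 MeV, matching the quoted ~500 MeV
```

The same one-liner with ε0 ≈ 5 GeV/fm³ reproduces the ~2× lower initial temperature scale quoted for RHIC, since T0 only grows as ε0^(1/4).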

The physics topics at CMS of particular interest to the Vanderbilt group are :

- global observables and, more specifically, azimuthal anisotropy in particle emission
- heavy quark production and flow
- jet quenching as revealed by jet shapes in heavy ion collisions in comparison to pp collisions

The global measurements are of interest in view of the RHIC findings of a rapid approach to thermal equilibrium in the produced QGP matter and of its small viscosity/entropy ratio. Theoretical efforts to understand how equilibration is achieved and to quantify the connection of medium properties like the viscosity to the experimental observables are underway. Measurements at the LHC will provide crucial new information to the existing studies through the measurement of flow at significantly higher initial energy densities. We propose to study azimuthal anisotropy in particle production using several different methods: the reaction plane, two-particle correlation, and multi-particle correlation (cumulant) methods, and to extract the v2 and v4 Fourier components of the particle azimuthal distributions. Scaling properties of v2 and the ratio v4/(v2)² are sensitive to the hydrodynamic equation of state and the properties of the medium. The Vanderbilt group has considerable prior expertise in anisotropic flow measurements using RHIC data, as has been extensively described in the previous sections of this proposal.

Software implementation of several methods of elliptic flow measurement has already been developed in PHENIX and applied to RHIC data by several of our group members. This software can be ported to CMS with minimal effort. Since these are the first measurements that will be available from the LHC, we view the task to prepare for these analyses as one of our early contributions to


CMS. One or both of our second-year graduate students, whom we expect to work in CMS, could do a dissertation based on the global measurements accessible in the 2010 data.

Tagged jets, jet quenching, and quarkonium production are topics in which CMS has unique strength. Due to the required luminosity these measurements will become possible later than the global measurements, at best in the third year of the next grant cycle. Dr. Pelin Kurt, who has just accepted our offer to be our third research associate, is primarily interested in the subject of jets and jet quenching. Based on extensive existing simulations and reviews by both the DOE and internal CMS physics oversight committees, this physics topic is expected to become a signature measurement of the CMS-HI research program.

3.4.1.1. FLOW: INCLUSIVE CHARGED HADRONS (VELKOVSKA, DR. ISSAH, AND A NEW STUDENT)

Elliptic flow measurements have been critical in the RHIC program. They have served to demonstrate the hydrodynamic nature of the matter formed in Au+Au collisions as well as to uncover its surprisingly low viscosity. At the LHC, non-flow contributions, i.e. correlations not due to the collective motion of the matter, are expected to be important. As a result, reliable flow measurements will require methods that can disentangle the flow correlations from those arising from direct sources such as jets and resonance decays. Such methods, like the cumulant and Lee-Yang zeros methods, have been developed and applied at RHIC. The cumulant method is based on multi-particle correlations and extracts v2 to different orders, whereas the Lee-Yang zeros method measures v2 from a large number of particle correlations and is the least sensitive to non-flow effects. Dr. Issah has been working on applying the cumulant method in CMS and has also begun preliminary work on the Lee-Yang zeros method. The cumulant method was tested on events from the HYDJET event generator within the CMS computing framework. A sample of results is shown below.

Figure (a) below shows the variation of v2 with pT for an impact parameter range of 4-6 fm, obtained from the standard reaction plane method and the cumulant method.

Fig. (b) shows the pT dependence of v2 for peripheral collisions (b = 8-10 fm).

The figure below shows the impact parameter (or centrality) dependence of v2 for the second (v2{2}), fourth (v2{4}), and sixth (v2{6}) order cumulants.

We have also studied the pseudorapidity dependence of v2 for two different cumulant orders, as shown in the figure below. There is a difference between the two orders that could be due to non-flow.
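The lowest-order cumulant described above can be illustrated with a self-contained sketch. It uses the Q-vector form of the two-particle correlator, ⟨2⟩ = (|Q2|² - M)/(M(M-1)), which removes self-pairs; the events below are toy events with a built-in v2, not CMS or HYDJET output.

```python
# Minimal sketch of the two-particle cumulant flow estimate v2{2}.
import cmath
import math
import random

def v2_two_particle_cumulant(events):
    """events: list of per-event lists of azimuthal angles phi.
    Returns v2{2} = sqrt(<<2>>), the event-averaged 2-particle correlator."""
    num, den = 0.0, 0.0
    for phis in events:
        m = len(phis)
        if m < 2:
            continue
        q2 = sum(cmath.exp(2j * phi) for phi in phis)  # flow vector Q2
        num += abs(q2) ** 2 - m        # |Q2|^2 counts m self-pairs; subtract them
        den += m * (m - 1)             # number of ordered particle pairs
    if den == 0:
        return 0.0
    c2 = num / den
    return math.sqrt(c2) if c2 > 0 else 0.0

# Toy events: dN/dphi ∝ 1 + 2 v2 cos(2(phi - psi)) with a random event plane psi
random.seed(1)
def toy_event(mult=300, v2=0.05):
    psi = random.uniform(0.0, 2.0 * math.pi)
    phis = []
    while len(phis) < mult:  # accept-reject sampling of the azimuthal distribution
        phi = random.uniform(0.0, 2.0 * math.pi)
        if random.uniform(0.0, 1.0 + 2.0 * v2) < 1.0 + 2.0 * v2 * math.cos(2.0 * (phi - psi)):
            phis.append(phi)
    return phis

events = [toy_event() for _ in range(200)]
print(round(v2_two_particle_cumulant(events), 3))  # close to the input v2 = 0.05
```

Higher-order cumulants (v2{4}, v2{6}) follow the same Q-vector bookkeeping with more pair-removal terms, which is what progressively suppresses the few-particle non-flow correlations.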


3.4.1.4. π0 SPECTRA, RAA (MAGUIRE, DR. KURT, AND MR. APPELT)

In January 2009, Mr. Appelt began work with the heavy ion physics analysis group of the CMS collaboration. His primary responsibility has been the proposed computing center at Vanderbilt: working with the staff at Vanderbilt's Advanced Computing Center for Research and Education (ACCRE) to test the high-speed network connections and the reconstruction process for heavy ion events in CMSSW, and demonstrating the ability of the Vanderbilt facility to generate and reconstruct large numbers of events while determining the processor time, memory footprint, and file size of the reconstructed events. In the process of completing this work, he has worked closely with the heavy ion software task force in debugging the tracker reconstruction algorithms applied to central heavy ion events. He has also been involved in porting the CMS analysis software to reconstructed heavy ion events, and is currently responsible for the documentation of the analysis framework.

We are exploring two different options for a physics analysis to be done by Mr. Appelt during the scope of this proposal. The final choice will depend on the amount of data collected in the first two runs (2010 and 2012) and the trigger paths used for the data.

One analysis option is the determination of neutral pion spectra in Pb+Pb collisions at LHC energies. In proton-proton collisions, pion spectra may be determined using the preshower detector of the high-pseudorapidity endcap electromagnetic calorimeter (ECAL) by generating a mass peak from


paired photons. In heavy ion collisions, the particle multiplicity in the pseudorapidity range of the endcap ECAL is expected to be too high to accurately pair photons from neutral pion decay. However, it may be possible to use the lower-pseudorapidity barrel ECAL to detect photon pairs from neutral pion decays in these events. Using his already developed expertise with CMSSW analysis tools, Mr. Appelt has simulated neutral pions from a particle gun with transverse momenta of up to 11 GeV/c and was able to detect a mass peak using reconstructed hits from the barrel ECAL (Figure 1).
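The photon-pairing step behind the mass peak can be illustrated with a short kinematics sketch, m² = (E1+E2)² - |p1+p2|², using toy four-vectors rather than reconstructed CMSSW objects:

```python
# Sketch of the diphoton invariant mass used to locate the pi0 peak;
# for massless photons, m^2 = 2 E1 E2 (1 - cos(theta12)).
import math

PI0_MASS = 0.1349768  # GeV/c^2

def inv_mass(g1, g2):
    """Invariant mass of two photons given as (E, px, py, pz) in GeV."""
    e = g1[0] + g2[0]
    px, py, pz = (g1[i] + g2[i] for i in (1, 2, 3))
    return math.sqrt(max(e * e - px * px - py * py - pz * pz, 0.0))

def boost_z(p, beta):
    """Boost a four-vector along z with velocity beta (in units of c)."""
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    e, px, py, pz = p
    return (gamma * (e + beta * pz), px, py, gamma * (pz + beta * e))

# A pi0 at rest decays to two back-to-back photons, then boost the pair:
theta = 1.0
e_gamma = PI0_MASS / 2.0
g1 = (e_gamma, e_gamma * math.sin(theta), 0.0, e_gamma * math.cos(theta))
g2 = (e_gamma, -e_gamma * math.sin(theta), 0.0, -e_gamma * math.cos(theta))
g1, g2 = boost_z(g1, 0.99), boost_z(g2, 0.99)
print(abs(inv_mass(g1, g2) - PI0_MASS) < 1e-9)  # True: mass is boost-invariant
```

In a real event, all photon-candidate pairs are combined and the combinatorial background under the peak grows with multiplicity, which is exactly why the pairing becomes difficult in the high-occupancy endcap region.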

The other analysis option will be the determination of flow effects for open charm mesons, by observing semileptonic decays to single muons and hadrons. Detection and identification of the D + meson should be possible by using the precision vertexing capability of the CMS pixel detector, along with the excellent muon coverage. This can be accomplished by triggering events on a single muon, and then making a cut on displaced vertices.



3.4.2. PHYSICS INTERESTS OF THE VANDERBILT GROUP AT CMS

The physics topics at CMS of interest to the Vanderbilt group are:

- global observables and, more specifically, azimuthal anisotropy in particle emission
- heavy quark production and flow
- tagged jets and jet quenching
- quarkonium production
- jet quenching as revealed by jet shapes in heavy ion collisions in comparison to pp collisions


Vanderbilt strength and unique contribution to CMS:

The Vanderbilt group has been a member of the PHENIX collaboration since its inception in 1992. It has diverse expertise and has been a major contributor to the RHIC program in hardware and software development, computing, physics analyses and publications. The physics topics in which the group has considerable expertise are:

- Global observables: measurements of multiplicity, particle yields and ratios, radial flow (using identified particle spectra), and elliptic flow using reaction plane data, two-particle correlations, and cumulant methods

- Nuclear modifications to particle production at intermediate and high pT. These studies are performed using identified baryons and mesons including π, K, p, φ, and Λ.

- Studies of in-medium modifications of vector mesons

The group has been particularly strong in physics with global/hadron observables. Since 2002, one of the four conveners leading the PHENIX global/hadron physics working group has been a member of the Vanderbilt group (Velkovska, Chujo, Greene). In this role, we have supervised the analysis and publications of all PHENIX efforts in this area, in addition to the direct contribution of analyses performed within the group.

In addition to physics expertise, the Vanderbilt group has unique strength and resources in computing and simulations, which are vital for the success of the CMS HI program. As part of the RMP's strategy for computing resources in CMS-HI, the Vanderbilt group proposes to use the ACCRE computer farm for doing both real data reconstruction and simulation data production. This effort is described in greater detail in the sections that follow.

3.4.3. SERVICE TASKS FOR CMS

In addition to the physics goals described above using the CMS detector, we propose to undertake two service tasks for the CMS collaboration. First, we will prepare the monitoring software to be used to ensure data quality in real time. Second, we will develop the primary data production and analysis facility for CMS-HI, to be located at and supported in part by Vanderbilt University. Both of these projects are described below.

In 2009, two members of the Vanderbilt group (Maguire and Velkovska) were granted MoA status by the DOE, starting in January 2010. Such status is required for Ph.D.-level personnel in order to be granted authorship on the collaboration's research publications. This status accords with the prominent roles taken by the Vanderbilt group for CMS-HI in data quality monitoring and in offline computing. In addition, two of our graduate students (Appelt and Snook) have achieved recognition in CMS-HI for their contributions to the software development and to a better understanding of the overall computing requirements.

3.4.4. DATA QUALITY MONITORING

Data Quality Monitoring (DQM) appears in various places across the experiment. The goals are to monitor the data in real time, to verify that data reconstruction has been processed correctly, and finally to prepare tagged data sets for specific physics analyses. The DQM group is responsible for constructing the end-to-end chain throughout the experiment where DQM is done and for coordinating the technical implementation, including software architecture, service deployment, and procedures. The group is led by two co-convenors: I. Segoni and A. Meyer. There is one DQM expert from each of the sub-detector groups, and there are also representatives from the High-Level Trigger (HLT) group, the offline group, and various physics groups. Since September 2008, Prof. Velkovska has taken responsibility for DQM on behalf of the Heavy Ion group. This important task had not yet been addressed in the HI group, and Prof. Velkovska's timely involvement fills a gap in the HI preparation for data taking.

A brief description of the DQM architecture, the status of each component for pp data taking, and the areas where testing and new development are needed for heavy ion data is given below. The DQM software framework provides tools for the creation, transport, and manipulation of monitoring elements (ME), i.e., histograms. The framework is based on CMS software (CMSSW) and ROOT. “Source” modules are used to create and fill monitoring elements with event-level quantities. The monitoring elements are periodically fed into “client” modules for further processing, e.g. fitting, determination of means, etc. A central function of the client modules is to perform quality tests on the incoming monitoring elements. The quality test algorithms, as well as facilities to attach the test results to the ME, are provided by the DQM framework. The DQM framework is required to be operational both in online and offline environments, and to work both on input from files and on live online data. The DQM software provides a framework for monitoring detectors, triggers, and physics objects. It also communicates with various databases and tags data suitable for specific analyses. A schematic view of how this is accomplished is shown in Figure 10.
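The source/client pattern described above can be sketched in a framework-free way (the class and test names below are hypothetical, not the actual CMSSW DQM interfaces): a monitoring element is modeled as a histogram that a source fills, and a client runs a quality test and attaches the result to the element.

```python
# Hypothetical, framework-free sketch of the DQM pattern: a "source"
# fills a monitoring element (histogram); a "client" runs a quality
# test and attaches the pass/fail flag to the element.

class MonitoringElement:
    def __init__(self, name, nbins, lo, hi):
        self.name = name
        self.lo, self.hi = lo, hi
        self.width = (hi - lo) / nbins
        self.bins = [0] * nbins
        self.quality = {}          # test name -> pass/fail flag

    def fill(self, x):
        """Source side: histogram one event-level quantity."""
        if self.lo <= x < self.hi:
            self.bins[int((x - self.lo) / self.width)] += 1

    def mean(self):
        total = sum(self.bins)
        centers = (self.lo + (i + 0.5) * self.width
                   for i in range(len(self.bins)))
        return sum(c * n for c, n in zip(centers, self.bins)) / total

def mean_within(me, expected, tolerance):
    """Client-side quality test: flag the ME if its mean drifts."""
    ok = abs(me.mean() - expected) <= tolerance
    me.quality["MeanWithinExpected"] = ok
    return ok

me = MonitoringElement("adc_pedestal", nbins=100, lo=0.0, hi=100.0)
for x in [49.0, 50.0, 51.0, 50.5, 49.5]:   # toy pedestal readings
    me.fill(x)
print(me.mean(), mean_within(me, expected=50.0, tolerance=2.0))
```

In the real framework the alarm levels and flag conditions mentioned below play the role of `expected` and `tolerance` here, and would need HI-specific settings.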

Figure 10 Schematic view of the CMS Data Quality Monitoring architecture, for online and offline data monitoring and certification. The development status of each of the components is also shown.

A small subset of the data (about a 5-10 Hz rate) is sampled in the high-level filter farm and sent to DQM. Detector-level monitoring is done at this stage. The “sources” create and fill monitoring elements (tree-like directories with histograms, profiles, and scalars) and present them on demand to the client modules subscribed to receive the information via a Monitoring Daemon. These are then examined by the client modules, and visual information several levels deep is presented to the online DQM shift person. As part of the online monitoring, the DQM system also collects information from the sub-detector data-acquisition systems (DAQ) and detector control systems (DCS). Various alarm levels and flags can be programmed and used in the online data certification step. The ultimate responsibility of the online DQM shift is to monitor the integrity of the data, to provide information on the status of the particular run to the Run Registry database, and to communicate with detector experts. At present, the architecture of the online DQM system is mostly complete. Prof. Velkovska's experience in 2008 at the LHC covering shifts during cosmic ray data taking was that the infrastructure is easy to use; however, the information about the detector status is still incomplete. There were numerous false alarms, as well as situations where the detector was shown as 100% operational in the highest-level histograms but appeared partially nonfunctional at the next level. This situation is normal at the start-up of such a complex detector. However, it is clear that even if the online DQM system is completely debugged and functional by the first HI run, we still need to prepare the HI-specific software now, using simulations, to ensure successful HI data taking from the beginning. Prof. Velkovska has started work in this direction with the help of three students: one from MIT and two from Turkey. However, this is only possible because the students are at CERN and Prof. Velkovska is spending her sabbatical leave there. We would like to get one of our own students and a post-doc involved in these activities upon Prof. Velkovska's return to Vanderbilt in April 2009.

The next step of monitoring happens at the T0 center, after the data have been taken and full reconstruction is started. At this stage the full statistics will be available and we can do more detailed monitoring by performing averaging, fitting, and making comparisons to sample data. This is done for every trigger path. Presently, the software for pp analysis is being deployed. Monitoring and certification of the outputs of the HI high-level trigger are presently being designed. Prof. Velkovska has expressed interest in working on this project together with Dr. Christoph Roland who is responsible for the HLT software.

Finally, the data are monitored offline just after full reconstruction. The monitoring here is performed using specific “physics objects”. The tools for DQM of physics objects are provided by the physics groups. In Figure 10, the offline step in DQM is shown to be performed at the CERN computing facility (CAF) for the pp data. For the HI data, this step is likely to be performed at the ACCRE facility. Naturally, we expect that the Vanderbilt group will take a major responsibility in offline DQM as well, as described in the following section of this proposal. The physics object monitoring software is the least developed part of DQM. It is presently being designed for the different physics objects used for pp data. Although there is some overlap in the tasks for HI and pp, here we face a significant development task.

The timeline envisioned for developing the DQM for HI is as follows:


During 2009 we must first develop the monitoring elements needed for HI data taking. Most of the detector MEs already exist. To first order, this may be a matter of simply testing them with simulated HI data and adjusting some histograms. However, the alarm levels and flag conditions are obviously different, and care must be taken that we can successfully take the HI data. This is our highest priority. This task includes the HLT validation at the Tier 0, as it is an integral part of the data taking. Monitoring of physics objects is important, but it can be done at a later time and will be started after the online and T0 monitoring are largely completed. This may be as early as the summer of 2009 or as late as the end of 2009. We envision that Prof. Velkovska will remain in charge of the HI DQM system after the initial software development efforts.

3.4.4.1. CMS COMPUTE CENTER

Vanderbilt University has been selected by the U.S. CMS-HI institutions to be the site of the primary data production and analysis compute center for the CMS-HI research program. As mentioned previously, a proposal from the set of CMS-HI institutions in the US for funding this center has been submitted to and reviewed by the Department of Energy. An updated proposal responding to the reviewer comments is due to the DOE by the end of calendar 2009. As a measure of its strong support of this proposal, Vanderbilt University spent $1M at the end of 2008 to upgrade its wide area network connectivity to its Internet hub in Atlanta to the 10 Gigabits/second specification required for the operation of the CMS-HI compute center. In further support of the computing proposal to the DOE, the University has also committed $1.1M in staffing support dedicated to the CMS-HI compute center at the ACCRE facility. A companion compute center for simulation data generation and production in support of CMS-HI data analyses is proposed to be located at the existing MIT CMS Tier2 center. That MIT facility has been meeting the simulation needs associated with the start-up of the CMS-HI research program.

The Vanderbilt site is proposed to contain the tape archive which will serve as a repository for three major data components: 1) all CMS-HI raw data transported from the LHC, 2) the data production and analysis output generated at the Vanderbilt center, and 3) the simulation data produced at the MIT center. The Vanderbilt and MIT centers will serve as compute hosts for analysis jobs submitted remotely from any of the CMS-HI institutions. This processing will be done within the framework of the CMS software system, which makes extensive use of Grid middleware products to facilitate remote site computing. Both the Vanderbilt and the MIT compute centers will be integrated into the network of CMS T0, T1, and T2 centers which has been developed to provide computing for the CMS-HEP research program worldwide.

The proposed primary compute center for CMS-HI will be housed in the present ACCRE computing facility at Vanderbilt. The ACCRE facility currently has some 1500 CPUs (which will grow to 2700 CPUs by mid-2010), 50 TBytes of disk space, and several hundred TBytes of tape archived under active management. The CMS-HI compute center implementation at Vanderbilt is expected to take place over five years from 2010 to 2014. At the end of the implementation phase there will be approximately 2500 CPU cores, 600 TBytes of disk space, and 2.0 PBytes of tape storage dedicated for the CMS-HI computing needs at ACCRE.

The RHI group at Vanderbilt already possesses extensive experience in data production and analysis for the PHENIX experiment. In the 2006 pp run at RHIC, data acquired by the PHENIX detector were transported over the Grid network from BNL to the ACCRE facility and then processed in near real time. The results were shipped back for analysis at the RHIC Computing Facility. The same work was repeated on an even larger scale in 2007 for the Au + Au run at RHIC. We expect that the first data run at the LHC, now scheduled for November 2010 (originally fall 2009), will produce a data set comparable in size to that processed by our group for PHENIX in 2007. Hence we are confident that we will be able to manage this task for the CMS-HI collaboration.

In addition to our previous related hardware and data processing experience with PHENIX, our group members have rapidly become proficient in the applications of CMS software. In January of 2009 graduate student Appelt joined the group and was assigned to our expanding effort in the CMS heavy ion research program. His initial primary responsibility was to assist with the proposal to develop a CMS-HI computing center at Vanderbilt. He began working directly with the technical staff at Vanderbilt's Advanced Computing Center for Research and Education (ACCRE) to help test our high-speed network connections, as well as the reconstruction process for heavy ion events using the CMS software framework (CMSSW). For the purposes of the Bologna computing workshop he had the specific urgent goal of demonstrating the capability of the ACCRE facility to generate and reconstruct large numbers of simulated HI collision events. In addition, he was asked to determine the processor time, memory footprint, and file size of the reconstructed events using the latest software production release (CMSSW 3_1_2). These crucial results became available in time to present to the CMS computing management at the 2009 CMS Week Bologna workshop, and at a subsequent CMS Computing Resources Board meeting, with the effect of assuring the computing management that sufficient resources would be available to process the anticipated amounts of CMS-HI data in the next several years. In the process of completing this work, Mr. Appelt has worked closely with the members of the heavy ion software task force in debugging the tracker reconstruction algorithms applied to central heavy ion events. Mr. Appelt has also been involved in the porting of the CMS analysis software to reconstructed heavy ion events, and is currently responsible for the documentation of the analysis framework. He has also managed the training of our other group personnel, Dr. Issah and Mr. Snook in particular, in the use of the complex CMSSW framework.

Our RHI group's service work in CMS also benefits from a good association with our Department's local HEP group, who joined CMS one year before we did. The HEP group is responsible for the support of the CMSSW software at the ACCRE center. Following Professor Maguire's request, the HEP group and the ACCRE technical staff have worked with Mr. Appelt in demonstrating the capability of the Vanderbilt computing center to handle the very demanding file transfer rates which will eventually be needed for the CMS-HI computing center. Below in Figure 11 we show recent examples of the high throughput capacity at ACCRE which has been made possible by Vanderbilt University's 2008 investment in a 10 Gbps external network link. In the course of a two-day period we were able to ingest 40 TB of data from four different source points in CMS. Since the ultimate goal in three years' time is to be able to ingest 15 TB/day, this accomplishment in 2009 shows that we are well on our way to meeting that milestone.
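The quoted figures can be checked with simple arithmetic: 40 TB in two days corresponds to 20 TB/day, comfortably above the 15 TB/day target and well within a 10 Gb/s link.

```python
# Back-of-the-envelope check of the transfer rates quoted above.
tb_ingested = 40.0          # TB received in the two-day test
days = 2.0
goal_tb_per_day = 15.0      # stated requirement in three years' time

rate_tb_per_day = tb_ingested / days                    # sustained TB/day
avg_gbps = tb_ingested * 8e12 / (days * 86400) / 1e9    # average line rate

print(f"{rate_tb_per_day:.0f} TB/day sustained, "
      f"{avg_gbps:.2f} Gb/s average on the 10 Gb/s link")
```

The average line rate works out to under 2 Gb/s, so the 10 Gb/s link has ample headroom for bursts and concurrent traffic.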


Figure 11: Plots showing the receipt of 40 TB of data using the standard CMS PhEDEx transfer software over the course of two days from four different T1 or T2 sources. The lower right plots show the persistently high quality of the transfer success during these two days.

3.5. OTHER PHYSICS ACTIVITIES

During the last year Prof. Velkovska has also taken part in the following physics projects in PHENIX:

- Chaired the internal review committee for the paper “Onset of pi-zero suppression studied in Cu+Cu collisions at sqrt(s_NN) = 22.4, 62.4, and 200 GeV”, published in Phys. Rev. Lett. 101, 162301 (2008)

- Chaired a task force on identified hadron measurements in PHENIX. The task force prepared a year-by-year comparison of all identified hadron analyses (published and preliminary), with the goal of identifying possible discrepancies in the vast amount of data analyzed by the collaboration.

Professor Greene has participated in the following physics projects in PHENIX:


- Served as convenor for the global and hadron physics working group for PHENIX

- Served on the paper preparation group for a paper on two-particle interferometry for charged kaons in Au+Au collisions at 200 GeV


4. PROPOSED WORK FOR THE NEXT 3 YEARS

4.1. PHENIX (7 PAGES)

4.1.1. IDENTIFIED PARTICLES RAA, RDA: PI, K, P, LAMBDA

Future Work (R. Belmont)

For my future work, I intend to continue along my current path. I will work towards more results on the Au+Au spectra and ratios, extending the transverse momentum reach, including more centrality classes, and including kaons, which were excluded from the preliminary study due to technical challenges and time constraints. I will analyze p+p data from Run 8 and Run 9, extracting the same spectra and ratio measurements for this system to facilitate a comparison to ascertain the nuclear modification, especially RAA. The nuclear modification factor RAA is the benchmark measurement for jet modification in the medium, and has also proved useful in studying hadronization mechanisms, such as parton recombination, in the intermediate transverse momentum regime. In addition, I will analyze the d+Au data set from Run 8 to study the cold nuclear matter effects on hadron production, thus helping to separate initial state effects from medium-induced effects. For example, the measurement of RdA gives important information about the cold nuclear matter contribution to RAA and is thus an important supplementary measurement.
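For reference, the nuclear modification factor compares the scaled heavy ion yield to the p+p baseline, RAA(pT) = (dN_AA/dpT) / (⟨Ncoll⟩ · dN_pp/dpT). The sketch below evaluates it on made-up toy spectra (the ⟨Ncoll⟩ value and all yields are purely illustrative, not measured data); values below unity at high pT signal suppression.

```python
# Illustrative computation of the nuclear modification factor
#   R_AA(pT) = (dN^AA/dpT) / (<N_coll> * dN^pp/dpT)
# on toy yields; R_AA < 1 at high pT signals suppression.

n_coll = 955.0   # hypothetical <N_coll> for a central Au+Au class

pt_bins  = [2.0, 4.0, 6.0, 8.0]                 # GeV/c
yield_AA = [1.2e-1, 4.0e-3, 3.0e-4, 4.0e-5]     # toy dN/dpT, Au+Au
yield_pp = [4.0e-4, 2.0e-5, 2.0e-6, 3.0e-7]     # toy dN/dpT, p+p

r_aa = [aa / (n_coll * pp) for aa, pp in zip(yield_AA, yield_pp)]
for pt, r in zip(pt_bins, r_aa):
    print(f"pT = {pt:.0f} GeV/c  R_AA = {r:.2f}")
```

RdA is formed the same way, with d+Au yields and the corresponding ⟨Ncoll⟩ in the numerator.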

Future Work (Dillon)

I plan to continue to support the pad chambers in whatever manner I am able. The challenges that arise are often similar to those encountered in the past, so I am well prepared to handle them; for those challenges that are not familiar, I will continue to learn on the go and adapt to whatever is needed. For example, an entire section of the pad chamber is going to be taken down and replaced in the coming month, and I will be there to learn what I can from the process and to help out where I am able.

I also plan to take the lambda baryon analysis much further. After diagnosing the extra features that are distorting our highest ranges in transverse momentum, I will be well situated to make final corrections to the lambda spectra for Run 7, and I expect to have results out to at least 8 GeV/c in pT.

Following this spectra work for the Run 7 data, I will have the tools at hand for processing the Run 8 and Run 9 proton-proton data for comparison to the Run 7 Au+Au collisions. This will allow me to produce RAA for the lambda and anti-lambda over much greater ranges in pT than were previously possible. I also plan to further augment the current analysis to include v2 flow studies for the lambda and anti-lambda in Runs 7, 8, and 9.

Taken together these results will be significant tools in the study of strangeness enhancement, hadronization, and the particulars of flow. With greater statistics and better timing resolution I expect to show results to not only higher ranges in pT, but with finer pT binning and with smaller errors than were previously possible. Taking the example of hadronization models, in many cases the differences between models fall between available data bins or are smaller than the available errors can resolve. This work should be able to address both of these deficiencies, as well as provide a longer range in pT for comparison to models.

4.1.2. ENERGY SCAN: PI, K, P SPECTRA AND FLOW

4.2. CMS (10 PAGES)

4.2.1. SERVICE: COMPUTING, DQM

PHYSICS

4.2.1.1. FLOW: INCLUSIVE CHARGED HADRONS

Future Work (Michael): Flow analysis in CMS

Elliptic flow measurements have been critical in the RHIC program. They have served to indicate the hydrodynamic nature of the matter formed in Au+Au collisions as well as its low viscosity. At the LHC, non-flow contributions, i.e., correlations not due to the collective motion of the matter, are expected to be important; as a result, reliable flow measurements will require methods that can help disentangle the flow correlations from those arising from direct sources like jets and resonance decays. Such methods, like the cumulant and Lee-Yang zeros methods, have been developed and applied at RHIC. The cumulant method is based on multi-particle correlations and extracts v2 to different orders, whereas the Lee-Yang zeros method measures v2 from a large number of particle correlations and is the least sensitive to non-flow effects. I have been working on applying the cumulant method in CMS and have also started preliminary work on the Lee-Yang zeros method. The cumulant method was tested on events from the HYDJET event generator within the CMS computing framework. A sample of results is shown below.
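A minimal sketch of the lowest (second) cumulant order, computed per event from the flow vector Q2 = Σ exp(i·2φ) and averaged over events, is given below; it is applied to toy events with a known input v2. All numbers are illustrative and this is not the CMS implementation, which additionally handles higher orders, acceptance effects, and weights.

```python
import cmath
import math
import random

def v2_from_two_particle_cumulant(events):
    """Second-order cumulant estimate v2{2} = sqrt(<<cos 2(phi_i - phi_j)>>),
    computed per event from the flow vector Q2 = sum_k exp(i*2*phi_k) using
    <2> = (|Q2|^2 - M) / (M*(M-1)), then pair-weight averaged over events."""
    num, wsum = 0.0, 0.0
    for phis in events:
        m = len(phis)
        q2 = sum(cmath.exp(2j * p) for p in phis)
        num += abs(q2) ** 2 - m      # = M*(M-1) * <2>_event
        wsum += m * (m - 1)          # number of pairs (weight)
    c2 = num / wsum
    return math.sqrt(c2) if c2 > 0 else 0.0

# Toy events with v2 = 0.06 and a random event plane per event,
# generated by accept-reject sampling.
random.seed(1)
def toy_event(mult=400, v2=0.06):
    psi = random.uniform(0, math.pi)
    out = []
    while len(out) < mult:
        phi = random.uniform(-math.pi, math.pi)
        if random.uniform(0, 1 + 2 * v2) < 1 + 2 * v2 * math.cos(2 * (phi - psi)):
            out.append(phi)
    return out

events = [toy_event() for _ in range(200)]
v2_est = v2_from_two_particle_cumulant(events)
print(f"v2{{2}} = {v2_est:.3f} (input 0.060)")
```

Because these toy events contain no jets or resonance decays, v2{2} recovers the input value; with real data, the higher-order cumulants v2{4} and v2{6} are used to suppress exactly such non-flow contributions.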

Figure (a) below shows the variation of v2 with pT for an impact parameter range of 4-6 fm, obtained from the standard reaction plane method and from the cumulant methods. Figure (b) shows the pT dependence of v2 for peripheral collisions (b = 8-10 fm).


The figure below shows the impact parameter (or centrality) dependence of v2 for the second (v2{2}), fourth (v2{4}), and sixth (v2{6}) order cumulants.

We have also studied the pseudorapidity dependence of v2 for two different cumulant orders, as shown in the figure below. There is a difference between the two orders that could be due to non-flow.


4.2.1.2. PI0 SPECTRA, RAA

Status Report for Grant Proposal (E. Appelt)

In January of 2009 I began work with the heavy ion physics analysis group of the CMS collaboration. My primary responsibility has been work associated with the proposed computing center at Vanderbilt. I have been working with the staff at Vanderbilt's Advanced Computing Center for Research and Education (ACCRE) to help test our high-speed network connections, as well as the reconstruction process for heavy ion events using the CMS software framework (CMSSW). My specific goal with reconstruction has been to demonstrate the ability of the Vanderbilt facility to generate and reconstruct large numbers of events, as well as to determine the processor time, memory footprint, and file size of the reconstructed events. In the process of completing this work, I have worked closely with the heavy ion software task force in debugging the tracker reconstruction algorithms applied to central heavy ion events.

I have also been involved in the porting of the CMS analysis software to reconstructed heavy ion events, and am currently responsible for the documentation of the analysis framework.

I plan on exploring two different types of analysis, the final choice of which will depend on the amount of data collected in the first two runs and the trigger paths used for the data.

One analysis is the determination of neutral pion spectra in lead-lead collisions at LHC energies. In proton-proton collisions, pion spectra may be determined using the preshower detector of the high-pseudorapidity endcap electromagnetic calorimeter (ECAL) by generating a mass peak from paired photons. In heavy ion collisions, the particle multiplicity at the pseudorapidity range of the endcap ECAL is expected to be too high to accurately pair photons from neutral pion decay. However, it may be possible to use the lower-pseudorapidity barrel ECAL to detect photon pairs from neutral pion decays in these events. I have simulated neutral pions from a particle gun with transverse momenta of up to 11 GeV/c and detected a mass peak using reconstructed hits from the barrel ECAL (Figure 1).
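The kinematics behind this mass peak can be sketched directly: for two massless photons, m² = 2·E1·E2·(1 − cos θ12), and for a symmetric π0 decay the minimum opening angle satisfies sin(θ/2) = m/E. The short check below (illustrative numbers, not detector simulation) shows why an 11 GeV π0 is challenging: its photons are separated by only a few tens of milliradians, which must still be resolved by distinct ECAL clusters.

```python
import math

def diphoton_mass(e1, e2, theta12):
    """Invariant mass of two massless photons separated by opening
    angle theta12: m^2 = 2*E1*E2*(1 - cos(theta12))."""
    return math.sqrt(2.0 * e1 * e2 * (1.0 - math.cos(theta12)))

# Symmetric decay of a pi0 (m = 0.1350 GeV/c^2) with energy E = 11 GeV:
# each photon carries E/2, and the minimum opening angle satisfies
# sin(theta/2) = m / E.
m_pi0, e_pi0 = 0.1350, 11.0
theta_min = 2.0 * math.asin(m_pi0 / e_pi0)
m_rec = diphoton_mass(e_pi0 / 2, e_pi0 / 2, theta_min)
print(f"opening angle = {theta_min*1000:.1f} mrad, "
      f"reconstructed mass = {m_rec:.4f} GeV/c^2")
```

Recomputing the input mass from the energies and opening angle confirms the formula; in the real analysis the photon energies and angles come from the 5x5 ECAL crystal clusters described in the figure caption.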

The other analysis will be the determination of flow effects for open charm mesons, by observing semileptonic decays to single muons and hadrons. Detection and identification of the D+ meson should be possible by using the precision vertexing capability of the CMS pixel detector, along with the excellent muon coverage. This can be accomplished by triggering events on a single muon, and then making a cut on displaced vertices.


Figure 1: Mass spectrum produced from simulated neutral pions with a transverse momentum of 11 GeV/c by combining 5x5 “clusters” of energy deposits in ECAL crystals.

4.2.1.3. D, B FLOW , RAA

5. MILESTONES AND PERSONNEL RESPONSIBILITIES (2.5 PAGES)

5.1. MILESTONES

5.2. IMPACT (NSAC LONG TERM PLAN, ETC)

5.3. PERSONNEL RESPONSIBILITIES

5.4. TABLE OF STUDENTS

6. BUDGET JUSTIFICATION (0.5 PAGE)


7. RECENT PUBLICATIONS AND CONFERENCE ORGANIZATION

7.1. CONFERENCE ORGANIZATION

QM09, HQ 2008

7.2. PAPERS PUBLISHED IN THE PERIOD 2007-2009
