timeline
timeline 1.1
----------
A Timeline
Finish-By Date | Task | Computing Task |
May 23 | Assemble this timeline. | Generate single-particle MC. This includes 500k charged pions and 50k each of neutral pions and single photons. These are almost done at this point, and it looks like the total dataset will be ~25GB. |
May 30 | Adjust existing cuts using the new, larger single-particle samples. This should be straightforward. Also, finish determining how large a pythia sample I need, etc. | Be halfway through generating a pythia sample to further work on the cuts. I have no idea how to generate this, nor do I know how much I will need. I know that I can select only prompt photon processes, and I know that I can check the pythia record later to see the initial momentum of the prompt photon in the partonic interaction (see the sketch after this table). I have no idea how much space this will need. |
Jun 6 | Continue adjusting, make sure the code is ready for more crowded events. This likely means moving back to the GammaCandidate version so that I can use existing access to the pythia record. | Generate a pythia dataset of reasonable size. (see above) |
Jun 13 | Get code working on Pythia and have progress toward updated cuts. | |
Jun 20 | Finalize cuts and have some measure of efficiency for single photons and the various backgrounds. | |
Jun 27 | Test variations of the clustering algorithm to see how stable it is with respect to changes of various thresholds inside it. Compare data and MC on this and other axes. | Run the algorithm on some set of 2006 data. I don't know where to find this and don't know what selection criteria to use to choose runs. |
Jul 4 | Finalize adjustments on the clustering algorithm for now. Continue data-MC comparisons to validate cuts, changing if something horrible comes up. | Continue running on the 2006 dataset. |
Jul 11 | Have a rough number for the cross section. | Complete running on the 2006 dataset. |
Jul 18 | Attempt to deal with errors. Energy uncertainty? Uncertainty in efficiencies? Something else? | |
Jul 25 | Continue dealing with errors. | |
Aug 1 | Stop dealing with errors. | |
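(The May 30 and Jun 6 rows talk about selecting only prompt-photon processes and then reading the prompt photon's partonic momentum back out of the pythia record. Purely as an illustration, a standalone Pythia 8 job could look something like the sketch below; the actual STAR production chain, pythia version, beam energy, and pT-hat cut are all assumptions here, not what will actually be used.)

#include "Pythia8/Pythia.h"
#include <cstdio>

// Rough standalone-Pythia-8 sketch: enable only the prompt-photon hard
// processes, then read the hard-process record to get the initial momentum
// of the prompt photon in the partonic interaction.
int main() {
  Pythia8::Pythia pythia;
  pythia.readString("PromptPhoton:all = on");    // q g -> q gamma, q qbar -> g gamma, etc.
  pythia.readString("Beams:eCM = 200.");         // assumed RHIC p+p at 200 GeV
  pythia.readString("PhaseSpace:pTHatMin = 5."); // assumed minimum partonic pT, in GeV

  if (!pythia.init()) return 1;

  for (int iEvent = 0; iEvent < 10000; ++iEvent) {
    if (!pythia.next()) continue;

    // pythia.process holds just the hard scattering; status 23 marks its
    // outgoing particles, so this picks out the prompt photon.
    for (int i = 0; i < pythia.process.size(); ++i) {
      const Pythia8::Particle& p = pythia.process[i];
      if (p.id() == 22 && p.status() == 23) {
        std::printf("event %d: prompt photon pT = %.2f GeV, eta = %.2f\n",
                    iEvent, p.pT(), p.eta());
      }
    }
  }
  pythia.stat();
  return 0;
}

The status-code check is just one way to recover the photon's momentum from the partonic interaction, as described in the May 30 row; whatever momentum is recorded there can later be used for binning the efficiencies.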
B Presentation milestones
It has been suggested that I should present the results of this attempt to the Spin PWG at various times through the next few months, highlighting:
* having a working gamma algo based on Pythia MC (this would occur no sooner than Jun 20)
* determination of the signal and background efficiencies of the finalized algorithm (this would occur no sooner than Jun 20)
* vetting through an MC vs. data comparison (this would occur no sooner than Jul 4)
* status of error analysis (this would occur no sooner than Jul 18, likely Jul 25)
* final experimental results (this would occur no sooner than Aug 1)
C List of items needed for cross-section
datasets
- single-particle MC
- pythia MC
-- efficiency for prompt photons
-- efficiency for neutral pions
-- efficiency for electrons
-- efficiency for hadrons
-- various plots of the variables used in cuts
- actual data
-- list of runs to use
-- luminosity numbers for these runs
-- various plots of the variables used in cuts
theory
- pion cross-section in the eta range used (1.0-1.5?)
- expected photon cross-section in same range
- non-pizero hadron cross-section in same range?
D Things to get from other people
theory cross sections as above
gamma energy scale - why should this fail for single photons if it works for pizeros?
efficiencies other than the algorithm's
- trigger
- other
all information about real data
known errors
timeline 1.0
----------
I've been asked to construct a timeline for getting at a photon cross section. Rough time estimates for the pieces:
- generate single-particle MC: ~1 wk
- generate pythia sample: ~2 wks
- find actual data: ?
- tune cuts on the bigger single-particle dataset: ~2 wks (concurrent with generating pythia)
- tune cuts on the pythia sample: ~2 wks
- more thoroughly vet my SMD clustering algorithm (at some point in this): ~2 wks
- more generally vet the various cuts with the other photon people: ~2 wks
Once the cuts (and I shudder to use them instead of something that can make better use of the correlations) are frozen, it's easy to compute the efficiency for photons and pions, and I can use those numbers along with the luminosity of the data sample, the number of survivors from the data sample, and the number of pions expected in NLO calculations to get at the prompt photon cross section. This doesn't include any dealing with errors, which will probably take ~3 wks to get even close to understanding.
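To make that last step concrete, here is a minimal sketch of the counting arithmetic described above. Every number in it is an invented placeholder: the efficiencies would come out of the single-particle and pythia MC, the luminosity from the run list, and the pizero cross section from the NLO input listed in section C.

#include <cstdio>

// Placeholder counting-experiment arithmetic for the prompt photon cross
// section. All numbers below are made up for illustration only.
int main() {
  double lumi          = 1.0e3; // integrated luminosity of the runs used [nb^-1] (assumed)
  double n_candidates  = 5000.; // photon candidates in data surviving all cuts (assumed)
  double eff_gamma     = 0.30;  // single-photon efficiency of the finalized cuts, from MC (assumed)
  double fake_pi0      = 0.02;  // probability for a pizero to survive as a photon candidate, from MC (assumed)
  double sigma_pi0_nlo = 100.;  // expected pizero cross section in the eta window, from NLO [nb] (assumed)

  // pizeros expected to contaminate the candidate sample
  double n_pi0_bkg = fake_pi0 * sigma_pi0_nlo * lumi;

  // prompt photon cross section in the same eta window, before other
  // backgrounds (electrons, hadrons) and before any error analysis
  double sigma_gamma = (n_candidates - n_pi0_bkg) / (eff_gamma * lumi);

  std::printf("sigma_gamma ~ %.2f nb\n", sigma_gamma);
  return 0;
}

The electron and non-pizero hadron contaminations listed in section C would presumably be subtracted the same way as the pizero term, and the whole calculation repeated per pT bin.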
may 23: finish generating single-particle MC (0.5 million piminus, 50k photons, 50k pizeros)
may 30: finish first half of pythia sample.
jun 6: finish pythia sample. Finish tuning cuts for single-particle MC.
jun 13:
jun 20: finish tuning cuts on the pythia sample.
jun 27: finish running on a subset of 2006 data. Be halfway through adjustments of the clustering algorithm.
jul 4: have clustering algorithm sufficiently vetted
jul 11: have cuts sufficiently vetted
jul 18: finish running on data. Have computed rough cross section. errors
jul 25: errors
aug 1: errors