Constructing a webpage

Bernd has asked that I put the details of my analysis into a static webpage (not a blog post like this one), but to get things in order in my head, this is where I'm starting.

Sources

There are several sources of data that I'm using in this analysis.  I will endeavor to keep them carefully labeled:

"single particle": this set comes from a custom kumac.  One particle of a particular variety is thrown in the direction of the endcap while four muons are thrown to intersect the far end of the barrel.  This allows the vertex to be properly reconstructed.

"old MC": this refers to an old prompt photon sample as well as an old jet sample (I will have to check the precise timestamps and geometry files used, though certainly both are 2006)

"new MC": this refers to the not-yet-finished MC that Mike is generating on tier 2.  

"current data":this refers to the small set of runs that I currently have available on the MIT disk (these are, I believe, transverse)

"all data": this refers to all transverse and longitudinal runs (since I'm not doing an asymmetry, the relative orientations of the protons isn't critical, as far as I know).

 

Cuts

In creating my trees, I apply a set of four cuts: the candidate must have more than 1.0 GeV of pT in its most energetic tower and a total of at least 4.0 GeV of pT in four adjacent towers.

Because of limited tracking, I also require the tower eta to be less than 1.5 (I need to check that this cut is indeed made on tower eta and not particle eta).

To take advantage of the SMD, I prefer the shower to be well away from the edge of the SMD plane, so I require that the candidate's center not lie in the A or E subsectors.
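As a concrete sketch, the four tree-level cuts might look like this in Python; the candidate fields (high_tower_pt, cluster_pt, tower_eta, subsector) are names I've made up for illustration, not the actual branches in my trees.

def passes_tree_cuts(candidate):
    # 1. Most energetic tower: more than 1.0 GeV of pT.
    if candidate.high_tower_pt <= 1.0:
        return False
    # 2. Four adjacent towers: at least 4.0 GeV of pT in total.
    if candidate.cluster_pt < 4.0:
        return False
    # 3. Limited tracking: tower eta must be below 1.5.
    if candidate.tower_eta >= 1.5:
        return False
    # 4. Keep the shower away from the SMD plane edge.
    if candidate.subsector in ("A", "E"):
        return False
    return True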

All other cuts are performed after I have made the trees. The complete list of cuts has been shown multiple times and will be included when I write the actual page.

 

Algorithm

For purposes of testing and development, I divided the old MC into two halves (and will repeat this for the new MC when it becomes available): one half represents simulated data, the other represents actual data (with the various GEANT variables off limits until I check results at the end).
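Schematically, the split might be as simple as even/odd event indices (the actual division could differ):

def split_halves(events):
    # Even-indexed events play the role of 'simulated data'; odd-indexed
    # events stand in for 'actual data' during the closure test.
    return events[0::2], events[1::2]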

The first step is to correct the final distribution of measured pT for all candidates back to the final distribution of measured pT for the photons alone. Schematically, we do this by reading the purity of each bin off of the simulated-data half. These purities turn out to be fairly low, which means the estimated number of photons in any bin (content * #simphotons/(#simphotons + #simjets)) is very sensitive to the size of the background. Done this way directly, we would always find roughly the distribution we 'put in' on the simulated half. To avoid this, we replace #simjets with #simfinaljets/#simoffjets * #dataoff, where #dataoff is the number of events we measure after a set of cuts specifically designed to suppress prompt photons. Under those cuts, #simjets/(#simphotons + #simjets) is very nearly one, so it is insensitive to the number of photons. This allows us to constrain the number of jets independently.
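In code, the per-bin arithmetic might look like the following sketch; the argument names are illustrative stand-ins for the per-bin histogram contents, not the real histogram names.

def estimate_photons(n_data_final, n_sim_photons,
                     n_sim_final_jets, n_sim_off_jets, n_data_off):
    # Data-constrained jet count: events passing the photon-suppressing
    # ('off') cuts in data, scaled by the simulated final/off jet ratio.
    n_jets = n_sim_final_jets / n_sim_off_jets * n_data_off
    # Purity with the data-driven jet estimate in the denominator.
    purity = n_sim_photons / (n_sim_photons + n_jets)
    return n_data_final * purity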

The number of photon events in a particular bin after the cuts, then, is #datafinal * #simphotons/(#simphotons + #simfinaljets/#simoffjets * #dataoff). We can see the improvement here by altering the 'actual data' half of the MC to have larger or smaller fractions of photons in various bins, and verifying that the procedure always reconstructs the correct numbers. (Three plots)
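The verification might then be structured like this, reusing estimate_photons from above; the dictionary keys are placeholders, and the truth counts come from the GEANT variables that stay off limits until this final check.

def closure_test(data_half, sim_half, truth):
    # data_half and sim_half map placeholder names to per-bin count
    # lists; truth holds the true photon counts in the 'data' half.
    for b, n_true in enumerate(truth):
        est = estimate_photons(data_half["final"][b],
                               sim_half["photons"][b],
                               sim_half["final_jets"][b],
                               sim_half["off_jets"][b],
                               data_half["off"][b])
        yield b, est, n_true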

The cuts are applied to the simulated data half and