Select a trigger and charge combination to view QA summary plots. Each point on a plot is the mean value of that quantity for the given run, and the error bar is the statistical uncertainty on that mean (RMS/sqrt(nentries)).
get_file_list.pl -distinct -keys 'orda(runnumber)' -cond 'production=P06ie,trgsetupname=ppProductionLong,sanity=1,tpc=1' -limit 0
This query yields 406 runs (3 with emc=0), which I process with star-submit-template. I'm using Murad's production of StJetSkimEvents to get all the event-level info, so my chain is brutally simple: just StMuDSTMaker and a simple set of track quality cuts, viz.
/home/kocolosk/analysis/run6/may03
Once I've got these trees back at MIT I merge them with the jetSkimEvent trees using an index on (run,event) (see Common Analysis Trees; a sketch of the merge follows the cut list below). I also apply some more stringent cuts:
BJP1 (hardware & software & geometric) requirement; only use pions with ΔR > 1.5
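Here's a minimal PyROOT sketch of that (run,event) merge; the file, tree, and branch names are hypothetical stand-ins for my actual trees:

import ROOT

# Hypothetical file/tree names standing in for my pion track trees
# and Murad's jetSkimEvent trees.
pion_file = ROOT.TFile.Open("pionTracks.root")
skim_file = ROOT.TFile.Open("jetSkimEvents.root")
pions = pion_file.Get("pionTree")
skims = skim_file.Get("skimTree")

# Build an index on (run, event) so entries in the skim tree can be
# looked up from the track tree, then attach it as a friend.
skims.BuildIndex("run", "event")
pions.AddFriend(skims)

# Branches from both trees are now visible together, matched through
# the (run, event) index (branch names again hypothetical).
pions.Draw("pt", "skimTree.bjp1 && nHitsFit > 25")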
fill yellow blue like unlike
7847 0.7681 0.4152 0.8136 0.2513
7850 0.0743 1.4901 1.1202 -0.9843
7851 -0.5521 -1.5695 -1.5313 0.7059
7852 -0.5546 -1.1228 -1.1148 0.4334
7853 -0.8047 2.7533 1.3496 -2.5720
7855 -0.1095 -1.0144 -0.7683 0.6419
7856 0.6243 -0.6925 -0.0455 0.9280
7858 1.3109 1.9636 2.3779 -0.4507
7863 0.6637 1.3537 1.3883 -0.4979
7864 0.2973 -0.8424 -0.3922 0.7972
7871 -1.7681 0.6963 -0.7669 -1.7202
7872 -1.4830 -0.4183 -1.2917 -0.7713
7883 0.7329 -0.4249 0.2105 0.8539
7886 0.4131 0.7406 0.8332 -0.2265
7887 0.8142 1.0339 1.3192 -0.1441
7896 -0.3938 -0.2628 -0.4542 -0.0927
7901 0.3990 0.0513 0.3428 0.2471
7908 -0.1043 -0.2753 -0.2721 0.1245
7909 0.2401 -1.4131 -0.8252 1.2060
7911 -1.8391 1.9288 0.0727 -2.6732
7913 -0.0822 0.7613 0.4930 -0.5827
7916 -0.3658 -1.9129 -1.5979 1.0991
7918 -1.1823 -0.9377 -1.5543 -0.1821
7921 -0.7642 1.2557 0.3598 -1.4019
7922 1.8057 0.0540 1.2658 1.2471
7926 -0.5481 -2.1827 -1.8897 1.1854
7944 0.0950 0.4734 0.4049 -0.2722
7949 -0.1770 0.6205 0.3156 -0.5489
7951 -0.2755 -0.0891 -0.2622 -0.1306
7952 2.2435 1.6046 2.8193 0.4314
7954 -1.2534 -0.5687 -1.2512 -0.4838
7957 -0.6056 -2.1501 -1.8931 1.1112
fill yellow blue like unlike
7847 -0.6434 2.3161 1.1484 -2.1363
7850 1.8693 -0.0728 1.2844 1.3589
7851 -1.2476 -1.4218 -1.9278 0.1198
7852 0.4635 -0.7461 -0.1245 0.9018
7853 0.2474 3.4829 2.6019 -2.3402
7855 0.6099 -0.3582 0.1694 0.7191
7856 -0.3472 -0.0663 -0.3147 -0.2412
7858 0.5772 0.0588 0.4406 0.3577
7863 1.2914 0.4583 1.2065 0.6182
7864 0.9606 -0.4055 0.3951 0.9624
7871 -1.6101 -0.2726 -1.3459 -0.9187
7872 0.5035 0.3827 0.6093 0.0966
7883 -0.7030 0.1324 -0.3950 -0.6150
7886 1.9632 1.2030 2.2793 0.5283
7887 1.1916 1.1695 1.7115 0.0042
7896 1.0944 -1.5376 -0.3071 1.8997
7901 -1.1326 -0.1455 -0.9498 -0.6828
7908 -0.1445 0.6122 0.3287 -0.5444
7909 -1.5394 -0.4124 -1.3598 -0.8080
7911 -2.4637 0.5309 -1.3758 -2.1097
7913 1.4852 -1.7004 -0.1439 2.2214
7916 1.3170 -1.4964 -0.1252 2.0055
7918 -0.0600 -2.4808 -1.8464 1.6679
7921 -0.6402 1.6446 0.6993 -1.5745
7922 -1.7100 -1.8694 -2.4555 0.1048
7926 -0.8854 -1.8649 -1.8962 0.7077
7944 0.7047 -0.3786 0.2404 0.7810
7949 -1.1675 0.2958 -0.6320 -0.9979
7951 -1.6275 0.5100 -0.7990 -1.4975
7952 1.3279 1.7884 2.2557 -0.3194
7954 -0.0209 -0.7787 -0.5861 0.5011
7957 2.2026 -1.9116 0.2045 2.9721
fill yellow blue like unlike
6988 2.8954 3.3548 4.2810 -0.3045
6990 -1.1356 0.0822 -0.7347 -0.8748
6992 2.2104 0.3517 1.9057 1.2493
6994 -0.3457 0.3976 0.0759 -0.6364
6995 0.0236 0.1136 0.1058 -0.0989
6997 0.6616 2.4959 2.2008 -1.2739
6998 -0.0607 0.5594 0.3556 -0.4868
7001 -0.2676 0.5874 0.2397 -0.6092
7002 -1.0157 -1.8627 -2.0789 0.4709
7032 -10.7190 9.6297 -0.9199 -13.3990
7034 2.6986 -1.2420 1.0235 2.7046
7035 -5.9795 5.8657 -0.0851 -8.3401
7048 -0.0177 -1.8298 -1.3617 1.2361
7049 -0.6182 -0.6846 -0.9274 0.0772
7051 1.2298 1.6795 2.1185 -0.3414
7055 0.7022 2.1289 2.0639 -1.0710
7064 -0.4975 1.9746 1.0530 -1.8834
7067 -2.7525 1.7490 -0.6884 -3.2186
7068 0.5999 -2.3405 -1.2244 2.1034
7069 1.6100 1.1541 1.9830 0.3217
7070 -0.0836 0.6656 0.4473 -0.4897
7072 0.1926 -0.7871 -0.3990 0.6684
7075 0.8745 -2.6544 -1.3421 2.4330
7079 0.1552 -1.5263 -1.0723 1.2324
7085 -0.0897 -0.2026 -0.2088 0.0906
7087 1.9237 0.4959 1.6501 1.0719
7088 0.3847 -0.4083 -0.0174 0.5569
7092 2.0581 -0.5933 1.0121 1.9479
7102 0.8406 0.3069 0.7570 0.5024
7103 -0.3106 -1.2057 -1.0763 0.6761
7110 2.6473 -0.2804 1.6718 2.0768
7112 1.5189 0.6019 1.4554 0.7426
7114 1.6817 0.9240 1.7917 0.5391
7118 -1.7624 0.7710 -0.5612 -1.6619
7120 -0.7654 0.2761 -0.2967 -0.7026
7122 0.0848 -0.6810 -0.3646 0.5913
7123 -1.6676 1.3976 -0.0291 -2.0208
7124 1.3759 -0.2691 0.6933 1.2215
7125 1.8810 -0.1630 1.1017 1.5199
7127 0.1055 1.6688 1.1933 -1.1824
7131 -0.1858 -0.0352 -0.2407 -0.1477
7133 0.8733 0.4679 1.0256 0.3223
7134 0.6636 -0.6857 -0.0360 0.9213
7151 -0.3299 -0.6315 -0.7004 0.2166
7153 -2.5741 -1.5110 -3.2579 -0.8339
7161 -0.9482 -0.8973 -1.2234 -0.1557
7162 -1.4261 -0.0548 -1.0450 -0.9047
7164 -0.2787 0.3688 0.1438 -0.5959
7165 0.1723 -1.3462 -0.7982 1.0477
7166 0.3072 2.2635 1.8098 -1.2695
7172 -0.6414 1.0035 0.2382 -1.1475
7237 0.8140 -1.9210 -0.7821 1.9362
7238 0.9147 0.1583 0.7514 0.5303
7249 0.8511 0.0570 0.6455 0.5322
7250 1.6782 -1.6771 0.1037 2.3099
7253 1.2592 0.8287 1.5859 0.2875
7255 -0.1376 0.1269 0.0300 -0.1948
7265 -2.7099 -1.0899 -2.5877 -1.2855
7266 -0.1283 0.7634 0.4338 -0.6211
7269 0.4862 -1.4204 -0.6823 1.3197
7270 -0.3694 -1.4968 -1.3117 0.7702
7271 2.1860 0.7501 2.0633 1.0176
7272 2.2742 -2.2593 0.0511 3.1922
7274 0.3692 1.9398 1.7348 -1.1304
7276 -1.9901 0.5043 -1.0389 -1.7776
7278 0.7444 -0.9791 -0.1687 1.2300
7279 -2.3442 0.9652 -0.9159 -2.3637
7300 -0.8765 1.6118 0.6363 -1.8350
7301 0.8553 -0.2668 0.3570 0.7616
7302 1.0808 -0.0070 0.7584 0.7798
7303 -0.7486 0.1757 -0.3854 -0.6549
7304 -0.7711 0.2501 -0.3825 -0.6815
7305 0.8851 0.7527 1.1938 0.0802
7308 -0.8466 0.7225 -0.0718 -1.1143
7311 0.2896 1.3780 1.1925 -0.7556
7317 -0.8703 -0.0998 -0.6921 -0.5361
7320 -1.3928 -2.0561 -2.4636 0.4663
7325 -0.9948 1.1141 0.0747 -1.5080
7327 -1.6448 0.1399 -1.0593 -1.2770
fill yellow blue like unlike
6988 1.5237 0.3259 1.3166 0.8438
6990 1.1388 -0.4585 0.5317 1.0481
6992 2.7809 1.8496 3.2777 0.6367
6994 0.4036 -0.0709 0.2152 0.3816
6995 0.0284 0.7506 0.5669 -0.5518
6997 -1.2708 1.5377 0.1390 -1.9290
6998 -0.2966 0.4639 0.1302 -0.4701
7001 0.8019 -0.9775 -0.1193 1.2528
7002 1.0862 -0.8451 0.1595 1.3065
7032 -7.2733 5.8628 -1.0782 -8.7338
7034 2.4731 -0.8650 1.1370 2.2900
7035 -5.7211 3.9134 -1.2690 -6.7932
7048 -0.9821 0.5199 -0.2484 -0.9325
7049 -2.4413 -1.6901 -2.8984 -0.5287
7051 0.1898 0.2886 0.3473 -0.0752
7055 2.4241 0.6951 2.2625 1.1398
7064 2.2608 -0.4958 1.2645 1.7928
7067 -0.9410 -0.7594 -1.1653 -0.2053
7068 0.2002 -0.1126 0.0601 0.2288
7069 0.6342 0.4304 0.7885 0.1599
7070 -1.2299 2.9297 1.2261 -2.8784
7072 -0.7843 1.6628 0.5990 -1.7002
7075 2.3895 -0.4305 1.3059 1.9201
7079 2.1971 -1.5796 0.3723 2.6978
7085 1.3459 -0.2317 0.7799 1.1367
7087 0.5258 0.2137 0.5551 0.1279
7088 -0.1856 -0.1438 -0.2435 -0.1064
7092 1.2742 -0.2971 0.6764 1.1774
7102 -0.1788 0.7376 0.3437 -0.5362
7103 -0.8089 0.0366 -0.5355 -0.6171
7110 2.6260 0.2961 2.0858 1.6233
7112 1.9199 -0.0349 1.3241 1.3657
7114 1.8546 0.6690 1.6611 0.8189
7118 -1.4017 0.8334 -0.4454 -1.6526
7120 -0.4001 -0.8129 -0.8129 0.3250
7122 0.5141 -0.2102 0.3512 0.6298
7123 -1.3042 1.8959 0.5395 -2.1551
7124 3.9571 0.3816 2.8666 2.6861
7125 -1.3947 0.3251 -0.8597 -1.3188
7127 2.2501 -0.6104 0.9831 2.1725
7131 -0.9200 -0.6472 -1.1191 -0.2102
7133 -1.1714 1.5062 0.0760 -1.9908
7134 0.6319 -1.4622 -0.6012 1.5632
7151 0.5372 -1.7916 -0.9108 1.6296
7153 -1.2329 0.7181 -0.3432 -1.3650
7161 1.3848 0.0689 1.0066 0.8943
7162 -2.3956 0.6038 -1.3699 -2.1493
7164 0.3508 1.6100 1.3783 -0.8174
7165 1.4134 -0.7712 0.4739 1.5248
7166 -0.3523 2.1152 1.2432 -1.6452
7172 -0.1923 3.1633 2.1209 -2.3379
7237 1.5778 -1.1650 0.2675 1.9640
7238 1.4628 3.2685 3.4046 -1.2492
7249 0.4261 0.1171 0.3816 0.2054
7250 1.2336 -1.0578 0.2506 1.5672
7253 0.6686 -1.3123 -0.3461 1.3663
7255 0.4010 0.1360 0.3899 0.1899
7265 0.1921 -1.1813 -0.6969 0.9601
7266 -0.6051 -0.5478 -0.8301 -0.0250
7269 0.1734 -0.5571 -0.2799 0.5063
7270 0.8825 -0.1983 0.4846 0.7612
7271 0.4453 -0.5304 -0.0492 0.6870
7272 -0.0131 -0.0176 -0.1003 0.0666
7274 2.5228 -0.2154 1.7141 1.8915
7276 -1.2212 2.1062 0.6034 -2.3542
7278 0.3318 0.0408 0.2745 0.1836
7279 -0.5121 -1.0690 -1.0773 0.3939
7300 -1.6321 0.7300 -0.6380 -1.6637
7301 -0.3916 -1.4341 -1.3355 0.7032
7302 1.4687 -1.3287 0.0978 1.9877
7303 0.4977 0.8000 0.8904 -0.2186
7304 -1.3292 -1.5037 -2.0497 0.1612
7305 0.2956 1.9077 1.6151 -1.1684
7308 -0.9212 -1.6537 -1.8057 0.5161
7311 0.0845 -1.0962 -0.7150 0.8359
7317 -2.3657 -0.6376 -2.1734 -1.1486
7320 -2.8559 1.1114 -1.2439 -2.7725
7325 0.3738 0.4026 0.5462 -0.0209
7327 -0.0750 -1.3466 -1.0211 0.8721
Update 2008-10-03: the total point-to-point systematics now include the effect on A_{LL} from the uncertainty on the jet pT shift.
Comparison to models obtained by sampling a_{LL} and parton distribution functions at the kinematics specified by the PYTHIA event:
Asymmetries are plotted versus z, the ratio of the pion p_{T} to the p_{T} of the trigger jet.
Error bars on each histogram take multi-particle correlations into account when multiple pions from an event fall into the same bin. Here is the Δϕ distribution obtained from the data and compared to Monte Carlo:
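A sketch of that error computation, assuming independent events (the array names are placeholders): when event i contributes m_i pions to a bin, the bin variance is the sum of m_i^2 over events rather than the raw entry count.

import numpy as np

# Per-bin errors when several pions from one event can land in the
# same bin: treating events as independent, the variance estimate is
# sum(m_i^2) over events instead of the total number of entries.
def bin_errors(event_ids, bin_index, nbins):
    errors = np.zeros(nbins)
    for b in range(nbins):
        ids = event_ids[bin_index == b]
        _, m = np.unique(ids, return_counts=True)  # pions per event in bin b
        errors[b] = np.sqrt(np.sum(m.astype(float) ** 2))
    return errors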
Systematic uncertainties are dominated by the bias in the subprocess mixture introduced by the application of the jetpatch trigger. Uncertainty in the asymmetry of the PID background also contributes in the two highest z bins. The full bin-by-bin systematic uncertainties are

before including the jet pT shift uncertainty:

π- = {9.1, 8.1, 6.1, 11.1} E-3
π+ = {14.8, 11.0, 6.6, 14.8} E-3

after including it (2008-10-03 update):

π- = {9.6, 9.5, 17.1, 14.9} E-3
π+ = {15.3, 13.0, 17.3, 21.8} E-3
I initially tried to estimate the bias from the JP trigger by applying the Method of Asymmetry Weights to PYTHIA. The next three plots show the Monte Carlo asymmetries for a) the minbias trigger and b) the jetpatch trigger, along with c) the difference between a) and b):
a)
b)
c)
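(As an aside, here's a sketch of the weighting machinery itself; aLL_hat, polratio1, and polratio2 are hypothetical stand-ins for the partonic asymmetry and the polarized-to-unpolarized PDF ratios:)

import numpy as np

# Method of Asymmetry Weights, sketched: each PYTHIA event gets a weight
#   w = aLL_hat(subprocess, cos_theta*) * polratio1(x1, Q2) * polratio2(x2, Q2)
# and the model A_LL in an analysis bin is simply the mean weight there.
def model_all(w, zbin, nbins=4):
    return np.array([w[zbin == b].mean() for b in range(nbins)])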
As you can see, the bias from this naïve approach is huge. It turns out that a significant source of the asymmetry differences is the fact that each of these bins integrates over a wide range in jet pT, and the mean jet pT in each bin is very different for MB and JP triggers:
We decided to factor out this difference in mean pT by reweighting the minbias Monte Carlo. This reweighting allows the trigger bias systematic to focus on the changes in subprocess mixture introduced by the application of the trigger. Here’s the polynomial used to do the reweighting:
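A sketch of how a reweighting function like that can be derived; the spectrum inputs and the polynomial order here are hypothetical:

import numpy as np

# Fit the ratio of the normalized jetpatch and minbias jet pT spectra
# with a low-order polynomial; each minbias event then enters the
# asymmetry sums with weight poly(jet_pt), which matches the mean jet
# pT of the two samples.
def reweighting_poly(pt_centers, mb_counts, jp_counts, order=3):
    mb = mb_counts / mb_counts.sum()
    jp = jp_counts / jp_counts.sum()
    ok = mb > 0
    return np.poly1d(np.polyfit(pt_centers[ok], jp[ok] / mb[ok], order))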
Here are the reweighted minbias asymmetries and the difference between them and the jetpatch asymmetries:
The final bias numbers assigned using GRSV-STD are 6-15 E-3.
I calculate the background in my PID window using separate triple-Gaussian fits for π- (8.6%) and π+ (9.2%), but I assume a 10% background in the final systematic to account for errors in this fit:
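A sketch of such a fit using scipy; the toy spectrum, seed parameters, and PID window are illustrative only, not my actual inputs:

import numpy as np
from scipy.optimize import curve_fit

def gauss(x, a, m, s):
    return a * np.exp(-0.5 * ((x - m) / s) ** 2)

def triple_gauss(x, *p):
    return gauss(x, *p[0:3]) + gauss(x, *p[3:6]) + gauss(x, *p[6:9])

# toy n-sigma(pion) spectrum standing in for the measured histogram
x = np.linspace(-6.0, 6.0, 121)
y = triple_gauss(x, 100, 0, 1, 15, -3, 1.2, 10, 3, 1.5)

p0 = [90, 0, 1, 10, -3, 1, 10, 3, 1]  # pion + two background seeds
popt, _ = curve_fit(triple_gauss, x, y, p0=p0)

# background fraction inside the (illustrative) PID window
win = (x > -1.0) & (x < 2.0)
frac = 1.0 - gauss(x, *popt[0:3])[win].sum() / triple_gauss(x, *popt)[win].sum()
print("background fraction in window: %.3f" % frac)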
Then I shift the PID window to a sideband [-∞, -2] and calculate an A_{LL} there:

The relation between the measured A_{LL} and the "true" background-free charged pion A_{LL} is

A_{LL}^{meas} = (1-f) * A_{LL}^{π} + f * A_{LL}^{bkgd}

with f the background fraction, so the systematic uncertainty we assign is given by

ΔA_{LL} = (f / (1-f)) * |A_{LL}^{meas} - A_{LL}^{bkgd}|

and is ~9 E-3 in the highest bin, 1.5-4 E-3 elsewhere.
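A two-line numeric sketch of that assignment (the asymmetry values passed in are placeholders, not my measured numbers):

f = 0.10  # assumed background fraction

def pid_systematic(a_meas, a_sideband):
    # dilution relation above: delta = f/(1-f) * |A_meas - A_bkgd|
    return f / (1.0 - f) * abs(a_meas - a_sideband)

print(pid_systematic(0.02, -0.06))  # placeholder inputs -> ~9 E-3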
I used the corrections to measured jet pT that Dave Staszak determined by comparing PYTHIA and GEANT jets (link) to correct my measured jet pTs before calculating z. The specific equation is

p_{T,true} = 1.538 + 0.8439*p_{T,meas} - 0.001691*p_{T,meas}^2
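In code, with a hypothetical 'scale' knob for varying the size of the shift by its uncertainty (the real 1σ curves come from the table linked below):

def corrected_jet_pt(pt_meas, scale=1.0):
    # the quoted PYTHIA/GEANT correction; scale=1 is nominal, and
    # scale above/below 1 stretches/shrinks the size of the shift
    nominal = 1.538 + 0.8439 * pt_meas - 0.001691 * pt_meas ** 2
    return pt_meas + scale * (nominal - pt_meas)

def z_value(pion_pt, jet_pt_meas, scale=1.0):
    return pion_pt / corrected_jet_pt(jet_pt_meas, scale)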
There is some uncertainty on the size of these shifts from a variety of sources; I took combined uncertainties from the 2006 preliminary jet result (table at http://cyclotron.tamu.edu/star/2005n06Jets/PRDweb/ currently lists the preliminary uncertainties). The dotted lines plot the 1σ uncertainties on the size of the jet pT shift:
Next I used those 1σ pT shift curves to recalculate A_{LL} versus z. The filled markers use the nominal pT shifts. The open markers to the left plot the case when the size of the shift is large (that is, the 1σ corrected jet pT is lower than in the nominal case, which causes some migration from nominally higher z into the given bin). The open markers to the right plot the case where the shift is small (corrected jet pT closer to measured).

In short: low markers represent migration from lower z, high markers represent migration from higher z.
Originally no systematic was assigned here. As of the 2008-10-03 update I assign one based on the average difference between the nominal and low/high results for each bin; this ends up being 3-16 E-3.
Murad’s detailed documentation
A pT-independent systematic uncertainty of 9.4 E-4 is assigned.
Analysis of beam polarization vectors leads to tan(θB)tan(θY)cos(ΦB-ΦY) = 0.0102. I calculated an Aσ from transverse running:

The small size of the non-longitudinal beam components means that even the Aσ in the case of π- leads to a negligible systematic on A_{LL}. A pT- and charge-dependent systematic of 1.4-7.3 E-4 is assigned.
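If one writes the measured asymmetry with the standard non-longitudinal term included (my assumption about the bookkeeping here), the assigned systematic is just the geometric factor times Aσ:

ΔA_{LL} ≈ tan(θB)tan(θY)cos(ΦB-ΦY) × Aσ = 0.0102 × Aσ

so an Aσ at the few-percent level translates into the quoted few E-4.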
The following are summary results (val ± err and χ2) from straight-line fits to single-spin asymmetries versus fill:
π- | val ± err | χ2 (37 d.o.f.)
Y (yellow) | -4.8 ± 3.0 | 63.74
B (blue) | 0.8 ± 3.1 | 34.46
L (like) | 6.7 ± 7.4 | 46.21
U (unlike) | 9.9 ± 7.5 | 52.51

π+ | val ± err | χ2 (37 d.o.f.)
Y (yellow) | -1.2 ± 2.9 | 53.65
B (blue) | 0.5 ± 3.0 | 43.45
L (like) | 3.2 ± 7.2 | 55.03
U (unlike) | 2.0 ± 7.3 | 41.72
There’s a hint of an excess of uu and/or ud counts for π-, but no systematic is assigned.