Offline Software

Embedding

Welcome to the STAR Embedding Pages!

Embedding data are used in STAR primarily to study detector acceptance and reconstruction efficiency. In general, the efficiency depends on the running conditions, the particle type and kinematics, and the offline software version. In principle, each physics analysis therefore needs to formulate its own embedding request, providing all of the relevant information above. Within the STAR Collaboration, the embedding team is assigned to process these requests and provide the embedding data; you can find out how the embedding team works on the Embedding structure page.

Over the past years, many embedding requests from different PWGs have been processed by the embedding team; the full list is available in the STAR Simulations Requests interface. If you want to look at some of these data but do not know where they are stored, please go to this page. If you cannot find a similar embedding request, you need to formulate your own request for your particular study; please go to this page for more information on how to formulate an embedding request.

Please subscribe to the embedding mailing list if you are interested in embedding discussions: Starembd-l@lists.bnl.gov
And please join our weekly embedding meeting: https://drupal.star.bnl.gov/STAR/node/65992

Finding existing embedding data

Embedding data are produced for each embedding request listed in the STAR Simulations Request page.
Normally they are stored on RCF NFS disks for a while so that end users can run their analyses.
However, NFS disk space is very limited and new requests arrive constantly, so the data are eventually moved
from disk to HPSS for permanent storage on tape; they can be restaged to disk for later analysis.

To find existing embedding data, either on disk or in HPSS, please follow the procedure below:

1) Find the request ID of the particular request that you are interested in on the STAR Simulations Request page.
    You can use the "Search" box at the top right of that page. Once you find the entry, look at the 'Request History' tab for more information; the original NFS data directory (where the data were first produced) can usually be found there.

2) Currently, the RCF NFS disks for embedding data are /star/data105, /star/embed and /star/data18.
    To find the data directories of a given request, log on to RCF and check whether the following directories exist:

/star/data105/embedding/${Trigger set name}/${Embedded Particle}_${fSet}_${RequestID}/
/star/embed/embedding/${Trigger set name}/${Embedded Particle}_${fSet}_${RequestID}/
/star/data18/embedding/${Trigger set name}/${Embedded Particle}_${fSet}_${RequestID}/

3) If they exist, further check whether all the ${fSet} values from 'fSet min' to 'fSet max' are there (see the sketch below). If all exist,
    you can start to use the data. If none of them are there, or only a fraction of the 'fSet' values are there, write to the embedding list
    and ask the Embedding Coordinator to restage the missing data from HPSS.
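
A minimal shell sketch of this check, assuming hypothetical values for the trigger set name, particle, request ID and fSet range (take the real values from the request page):

#!/bin/bash
# Hypothetical request parameters -- replace with the values from the request page
trg="AuAu_200_production_mid_2014"   # trigger set name
part="JPsi"                          # embedded particle
req="20100601"                       # request ID
fmin=100; fmax=110                   # 'fSet min' and 'fSet max'

# Look for each fSet directory on the three embedding disks
for disk in /star/data105 /star/embed /star/data18; do
  for fset in $(seq $fmin $fmax); do
    dir=$disk/embedding/$trg/${part}_${fset}_${req}
    [ -d "$dir" ] && echo "found: $dir"
  done
done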

4) If you cannot find the embedding data in these directories, the data must be in HPSS. Unfortunately, there is no full list of the data stored in HPSS yet. (For data produced at RCF, Lidia maintains the list of embedding samples produced at RCF.) Please write to the embedding list, provide the request ID, particle name and file type (minimc.root, MuDst.root or event/geant.root), and ask the Embedding Coordinator to restage the data to NFS disk for you.


Operations (OBSOLETE)

The full list of STAR embedding requests (since August 2010):
http://drupal.star.bnl.gov/STAR/starsimrequest
The operation status of each request can be viewed in its page and history.

The information below (and in the sub-pages) is only valid for OLD STAR embedding requests.

Request status

- Currently pending requests

- Old Requests summary (circa Nov 2006)


Current Requests

Current Requests were either submitted using the cgi web interface before it had to be removed (September 2007) or via email to the embedding-hn hypernews list.

As such, there is no single source of information. The Excel spreadsheet here is a summary of the known requests as of August 2008 (there is also a pdf printout for those without access to Excel). The co-ordinator should keep this updated.

The heavy flavour group has also kept an up-to-date page with its extensive list of requests. See here.

[The spreadsheet was part of a presentation (ppt|pdf) in August 2008 on the state of embedding at the end of my tenure as EC - Lee Barnby]

New Requests

In the near future we hope to have a dedicated drupal module available for entering embedding (and simulation) requests. This will have fields for all the required information and a workflow which will indicate the progress of the request. At the time of writing (August 2008) this is not available. A workable interim solution is to make requests using this page by selecting the 'Add Child Page' link. One should then write a page with the information for the request. Follow-up by members of the embedding team and requestors can then use the 'Comment' facility on that page.

The following information is required:

  • pt range
  • pt distribution (flat, exponential)
  • rapidity
  • ...
  • (to be completed)

 

Old Requests

Overview



This page is intended to provide details of embedding jobs currently in production.

ID Date Description Status Events Notes
1121704015 Mon Jul 18 12:26:55 2005 J/Psi Au+Au Open   pwg Heavy
1127679226 Fri Sep 16 20:34:17 2005 Photon embedding for 62 GeV AuAu data for conversion analysis Open   pwg HighPt
1126917257 Sun Sep 25 16:13:46 2005 AMPT full chain Open   pwg EbyE
1130984157 Wed Nov 2 21:15:57 2005 pizero embedding in d+Au for EMCAL Open   pwg HighPt
1138743134 Tue Jan 31 16:32:14 2006 dE/dx RUN4 @ high pT Done   pwg Spectra
1139866250 Mon Feb 13 16:30:50 2006 Muon embedding RUN4 Test Sample   pwg Spectra
1139868572 Mon Feb 13 17:09:32 2006 pi+/-, kaon+/-, proton/pbar embedding to RUN5 Cu+Cu 62 GeV data Open   pwg Spectra
1144865002 Wed Apr 12 14:03:22 2006 Lambda embedding for spectra (proton feeddown) Open   pwg Strangeness
1146151888 Thu Apr 27 11:31:28 2006 pi,K,p 200 GeV Cu+Cu Open   pwg Spectra
1146152319 Thu Apr 27 11:38:39 2006 K* for 62 GeV Cu+Cu Open   pwg Spectra
1146673520 Wed May 3 12:25:20 2006 K* for 200 GeV Cu+Cu Done   pwg Spectra
1148574792 Thu May 25 12:33:12 2006 Anti-alpha in 200 GeV AuAu Closed   pwg Spectra
1148586109 Thu May 25 15:41:49 2006 He3 in 200GeV AuAu Test Sample   pwg Spectra
1148586313 Thu May 25 15:45:13 2006 Deuteron in 200GeV AuAu Done   pwg Spectra
1154003633 Thu Jul 27 08:33:53 2006 J/Psi embedding for pp2006 Open   pwg Heavy
1154003721 Thu Jul 27 08:35:21 2006 Upsilon embedding for pp2006 Open   pwg Heavy
1154003879 Thu Jul 27 08:37:59 2006 electron embedding for Cu+Cu 2005 Test Sample   pwg Heavy
1154003931 Thu Jul 27 08:38:51 2006 pi0 embedding for Cu+Cu 2005 for heavy flavor group Open   pwg Heavy
1154003958 Thu Jul 27 08:39:18 2006 gamma embedding for Cu+Cu 2005 for heavy flavor group Open   pwg Heavy
1154004033 Thu Jul 27 08:40:33 2006 electron embedding for p+p 2005 for heavy flavor group (e-h correlations) Test Sample   pwg Heavy
1154004074 Thu Jul 27 08:41:14 2006 pi0 embedding for p+p 2005 for heavy flavor group (e-h correlations) Test Sample   pwg Heavy
1154626301 Thu Aug 3 13:31:41 2006 AntiXi Cu+Cu (P06ib) Done   pwg Strangeness
1154626418 Thu Aug 3 13:33:38 2006 Xi Au+Au (P05ic) Done   pwg Strangeness
1154626430 Thu Aug 3 13:33:50 2006 Omega Au+Au (P05ic) Done   pwg Strangeness
1156254135 Tue Aug 22 09:42:15 2006 Phi in pp for spin-alignment Open   pwg Spectra
1163565625 Tue Nov 14 23:40:25 2006 muon CuCu 200 GeV Open   pwg Spectra
1163627909 Wed Nov 15 16:58:29 2006 muon CuCu 62 GeV Open   pwg Spectra
1163628205 Wed Nov 15 17:03:25 2006 phi CuCu 200 GeV Test Sample   pwg Spectra
1163628539 Wed Nov 15 17:08:59 2006 K* pp 200 GeV (year 2005) Open   pwg Spectra
1163628764 Wed Nov 15 17:12:44 2006 phi pp 200 GeV (year 2005) Open   pwg Spectra

 

runs for request 1154003879

min-bias run list from Anders.


6031103 6031104 6031105 6031106 6031113 6032001 6032003 6032004 6032005 6032011 6034006 6034007 6034008 6034009 6034014 6034015 6034016 6034017 6034108 6035005 6035006 6035007 6035009 6035010 6035011 6035012 6035013 6035014 6035015 6035016 6035026 6035027 6035028 6035030 6035032 6035036 6035108 6035111 6036012 6036014 6036016 6036017 6036019 6036020 6036021 6036022 6036024 6036025 6036028 6036036 6036039 6036043 6036044 6036045 6036098 6036102 6036103 6036104 6036105 6037009 6037010 6037013 6037014 6037015 6037016 6037017 6037018 6037019 6037025 6037026 6037028 6037029 6037030 6037031 6037033 6037039 6037040 6037046 6037047 6037048 6037049 6037050 6037053 6037054 6037071 6037073 6037075 6037077 6038085 6038088 6039033 6039034 6039040 6039132 6039134 6039135 6039136 6039138 6039139 6039141 6039142 6039143 6039144 6039145 6039154 6040001 6040003 6040004 6040007 6040008 6040009 6040026 6040043 6040044 6040045 6040048 6040050 6040051 6040052 6040054 6040055 6040056 6040057 6040058 6041014 6041015 6041020 6041021 6041022 6041023 6041025 6041027 6041028 6041031 6041034 6041035 6041036 6041062 6041063 6041064 6041065 6041066 6041091 6041092 6041093 6041097 6041099 6041102 6041103 6041105 6041118 6041120 6042003 6042006 6042008 6042009 6042010 6042012 6042014 6042015 6042016 6042018 6042024 6042046 6042047 6042048 6042049 6042052 6042054 6042055 6042057 6042059 6042060 6042101 6042105 6042109 6042110 6042111 6042112 6042113 6043010 6043011 6043014 6043017 6043019 6043020 6043021 6043022 6043026 6043027 6043039 6043043 6043045 6043054 6043057 6043058 6043060 6043063 6043080 6043082 6043085 6043090 6043094 6043095 6043097 6043098 6043100 6043101 6043111 6043112 6043113 6043114 6043115 6043117 6043118 6043119 6043120 6043121 6043122 6043123 6044001 6044010 6044011 6044012 6044015 6044016 6044020 6044024 6044025 6044026 6044027 6044028 6044044 6044045 6044046 6044047 6044049 6044050 6044053 6044054 6044055 6044058 6044059 6044075 6044079 6044081 6044083 6044084 6044085 6044086 6044088 6044089 6044090 6045008 6045026 6045027 6045029 6045033 6045042 6045070 6045072 6045075 6045076 6045077 6045078 6045079 6045080 6045081 6045082 6046004 6046005 6046014 6046016 6046017 6046018 6046022 6046028 6046035 6046037 6046038 6047017 6047020 6047022 6047025 6047026 6047037 6047039 6047040 6047044
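
If one needs to locate the daq files for these runs, below is a hedged sketch using the STAR file catalog utility get_file_list.pl; the exact -cond fields for this dataset are an assumption, so adjust the filetype/production conditions to the actual request:

#!/bin/bash
# Hypothetical catalog query: list daq files for one run from the list above.
# Verify the -cond fields (filetype, runnumber, ...) against the actual request.
run=6031103
get_file_list.pl -keys 'path,filename' -cond "filetype=online_daq,runnumber=$run" -limit 0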

Submit a new embedding request

Before submitting a new request, a double-check of the simulation request page is recommended, to see whether existing requests/data can be used. If none exist, one needs to submit a new request. Please first read 'Appendix A' on the embedding structure page for the rules to follow.
Once the details of the embedding request have been thoroughly discussed within the PWG and approved by the PWG conveners, the PWG convener should add a new request in the simulation request page and enter all of the details there.

There is some key information that must be provided with each embedding request. (If you cannot find some of the following items in the form, simply enter them in the 'Simulation request description' box.)

Detailed information about the real data sample (to be embedded into):

  • Trigger set name, for example "AuAu_200_production_mid_2014"; a full list of STAR real-data trigger sets can be found in the data summary page.
  • File type, for example "st_physics", "st_ht", "st_hlt", "st_mtd", "st_ssdmb". Please note that ONLY "st_*_adc" files can be used for embedding production. Due to the limited disk space, we usually sample 100K events (~200-500 daq files, depending on trigger and other event cuts), and this set of daq files is embedded multiple times to reach the desired full statistics (say 1M, i.e. ten passes over a 100K-event sample).
  • Production tag, for example "P15ic".
  • Run range with a list of bad runs, or a list of good runs, for example "15076101-15167014".
  • Trigger IDs of the events to be embedded, for example HT1 "450201, 450211".
  • Vertex cut and vertex selection method, for example "|Vertex_z|<30cm", "Vr<2cm", "vertex is constrained by VPD vertex, |Vz-Vz_{VPD}|<3cm", "PicoVtxMode:PicoVtxVpdOrDefault, TpcVpdVzDiffCut:6" or "default highest ranked TPC vertex".

  • Other possible event cuts, for example a cut on refmult or grefmult: "refmult>250".
  • Each request can only have ONE type of dataset described above.

Details for simulation and reconstruction:

  • Particle type and decay mode (for unstable particles), for example "Jpsi to di-muon".
  • pT range and distribution of the input particles, for example "0-20GeV/c, flat" or "0-20GeV/c, exponential".
  • Pseudo-rapidity or rapidity range (the distribution is flat), for example "pseudo-rapidity eta, -1 to 1" or "rapidity, y, -1 to 1". Please do make clear whether it is PSEUDO-rapidity or rapidity, as they are quite different quantities.
  • Number of MC particles per event. Usually "5% of refmult or grefmult" is recommended; one can also assign a fixed number, for example "5 particles per event" for pp collisions.
  • For an event-generator embedding request, like Pythia/HIJING/StarLight in zero-bias events, please provide at least the generator version, for example "Pythia 8.1.62". The PWG embedding helper or PAs are fully responsible for tuning the generator parameters in such requests.
  • One can add special requirements for the production chain in the 'BFC tags' box, for example turning off IFT tracking in Sti.
  • Please indicate whether the EMC or BTOF simulator is needed.

Finally, please think carefully about the number of events! The computing resources (i.e. CPU cores and storage) are limited!


It is acceptable to modify the details of a request afterwards, although it is of great help if all of the above information is provided when the request is first submitted, to avoid time wasted in communication. If a modification is inevitable, please notify the Embedding Coordinator immediately when the details of a request change, especially when the request is already open.


Weekly embedding meeting (Tuesday 9am BNL time)

 
We hold a weekly embedding meeting on Tuesdays at 9 am US Eastern time to discuss all embedding-related topics.
The Zoom link can be found below:

Topic: STAR weekly embedding meeting
Time: Tuesday 09:00 AM Eastern Time (US and Canada)

Join ZoomGov Meeting
https://bnl.zoomgov.com/j/1606384145?pwd=cFZrSGtqVXZ2a3ZNQkd1WTQvU1o0UT09

 Meeting ID: 160 638 4145
 Passcode: 597036

Meetings in 2024:
   

  • Meeting August 6, 2024
    Recording: https://bnl.zoomgov.com/rec/share/Tio6ajxy1IaSYTw4l6-H5RiqYAASXg0OBHcpClAgyH-MwqTcTAmg17MP4cMfr296.8s9Sjr89YoNlZUuD
    Passcode: C*K#.$2U
    Agenda: 1) Status and planning of embedding production
            2) Planning of parallel production with deputies

Meetings in 2023:

Agenda: 1) Status of embedding (11.5 GeV full sample produced; 17.3 GeV testing starts after RCF is back)
        2) Embedding production planning

Agenda: 1) Embedding status and feedback - Xianglei (9.2 GeV sample produced, 11.5 GeV tuning starting)
        2) Reproduction request of Run 12 pp 200 GeV embedding - Youqi

Agenda: 1) General introduction - Qinghua
        2) Embedding status & planning - Xianglei
        3) Any other topics
 

Work Planned (OBSOLETE)

Based on:
  • An initial meeting held during QM06 (Chair: J. Lauret; L. Barnby, O. Baranikova, A. Rose)
  • Email exchange and notes from the meeting (From: J. Lauret, Date: 11/20/2006 22:36, Subject: Summary of our embedding meeting)
  • Further EMails from L. Barnby in embedding-hn (Date: 1/18/2007 15:12, Thread ID 84)
the following task list has been defined.

ID | Task | % Complete | Duration | Start | Finish | Assigned people
1 | General QA consolidation | 28% | 109 days? | Thu 11/16/06 | Tue 4/17/07 | -
3 | Documentation | 25% | 37 days | Mon 12/18/06 | Tue 2/6/07 | -
4 | Port old QA documentation to Drupal, define hierarchy | 50% | 4 wks | Mon 12/18/06 | Fri 1/12/07 | Cristina Suarez[10%]
5 | Add general documentation descriptive of the embedding purpose | 0% | 2 days | Mon 1/15/07 | Tue 1/16/07 | Lee Barnby[10%], Andrew Rose[10%]
6 | Add documentation as per the embedding procedure, diverse embedding | 0% | 2 days | Mon 1/15/07 | Tue 1/16/07 | Lee Barnby[10%], Andrew Rose[10%]
7 | Import PDSF documentation into Drupal | 0% | 1 wk | Wed 1/17/07 | Tue 1/23/07 | Andrew Rose[10%]
8 | Review and adjust documentation | 0% | 1 wk | Wed 1/24/07 | Tue 1/30/07 | Olga Barranikova[10%], Andrew Rose[5%], Lee Barnby[5%]
9 | Deliver documentation to collaboration for comments | 0% | 1 wk | Wed 1/31/07 | Tue 2/6/07 | STAR Collaboration[10%]
10 | Drop all old documentation, adjust link (redirect) | 0% | 1 day | Wed 1/31/07 | Wed 1/31/07 | Andrew Rose[15%], Jerome Lauret[15%]
12 | Line of authority, base conventions | 84% | 52 days | Thu 11/16/06 | Fri 1/26/07 | -
13 | Meeting with key personnel | 100% | 1 day | Thu 11/16/06 | Thu 11/16/06 | Jerome Lauret[10%], Olga Barranikova[10%], Andrew Rose[10%], Lee Barnby[10%]
14 | Define responsibilities and scope of the diverse individuals in the embedding team | 100% | 1 mon | Mon 12/4/06 | Fri 12/29/06 | Jerome Lauret[15%], Olga Barranikova[6%]
15 | Define file name convention, document final proposal | 50% | 2 wks | Mon 1/15/07 | Fri 1/26/07 | Jerome Lauret[6%], Lee Barnby[6%], Lidia Didenko[6%], Andrew Rose[6%]
17 | Collaborative work | 45% | 60 days | Mon 1/22/07 | Fri 4/13/07 | -
18 | General cataloguing issues | 0% | 9 days | Mon 1/29/07 | Thu 2/8/07 | -
19 | Test Catalog registration, adjust as necessary | 0% | 4 days | Mon 1/29/07 | Thu 2/1/07 | Lidia Didenko[20%], Jerome Lauret[20%]
20 | Extend Spider/Indexer to include embedding registration | 0% | 1 wk | Fri 2/2/07 | Thu 2/8/07 | Jerome Lauret[10%]
21 | Bug tracking, mailing lists and other tools | 69% | 60 days | Mon 1/22/07 | Fri 4/13/07 | -
22 | Re-enable embedding list, establish focused communication at PWG level and user level | 75% | 3 mons | Mon 1/22/07 | Fri 4/13/07 | Jerome Lauret[10%]
23 | Establish embedding RT system queue | 0% | 1 day | Tue 1/23/07 | Tue 1/23/07 | Jerome Lauret[5%]
24 | Exercise embedding RT queue, adjust requirement | 0% | 4 days | Wed 1/24/07 | Mon 1/29/07 | Andrew Rose[10%]
25 | Establish data transfer scheme to a BNL disk pool | 0% | 22 days | Mon 1/22/07 | Tue 2/20/07 | -
26 | Define requirements, general problems and issues | 0% | 1 wk | Mon 1/22/07 | Fri 1/26/07 | -
27 | Add data pool mechanism at BNL, transfer with any method | 0% | 1 wk | Mon 1/29/07 | Fri 2/2/07 | -
28 | Establish security scheme, HPSS auto-synching | 0% | 1 wk | Mon 2/5/07 | Fri 2/9/07 | -
29 | Test on one or more sites (non-PDSF) | 0% | 1 wk | Mon 2/12/07 | Fri 2/16/07 | -
30 | Integrate to all participating sites | 0% | 1 wk | Mon 2/12/07 | Fri 2/16/07 | -
31 | Document data transfer scheme and procedure | 0% | 2 days | Mon 2/19/07 | Tue 2/20/07 | -
33 | CVS check-in and cleanup | 4% | 17 days? | Mon 1/22/07 | Tue 2/13/07 | -
34 | Initial setup, existing framework | 0% | 17 days | Mon 1/22/07 | Tue 2/13/07 | -
35 | Define proper CVS location for perl, libs, macros | 0% | 1 day | Mon 1/22/07 | Mon 1/22/07 | Jerome Lauret[10%], Andrew Rose[10%], Lee Barnby[10%]
36 | Add existing QA macros to CVS | 0% | 1 day | Tue 1/23/07 | Tue 1/23/07 | Andrew Rose[20%]
37 | Checkout and test on +1 site (non-PDSF), adjust as necessary | 0% | 1 wk | Wed 1/24/07 | Tue 1/30/07 | Lee Barnby[10%]
38 | Bootstrap on +1 site / remove ALL site-specific references | 0% | 1 wk | Wed 1/31/07 | Tue 2/6/07 | Cristina Suarez[10%]
39 | Commit to CVS, verify new scripts on all sites, final adjustments | 0% | 1 wk | Wed 2/7/07 | Tue 2/13/07 | Cristina Suarez[10%], Andrew Rose[10%], Lee Barnby[10%]
40 | QA and nightly tests | 17% | 7 days? | Mon 1/22/07 | Tue 1/30/07 | -
41 | Establish a QA area in CVS | 100% | 1 day? | Mon 1/22/07 | Mon 1/22/07 | -
42 | Check existing QA suite | 0% | 1 wk | Wed 1/24/07 | Tue 1/30/07 | -
44 | Development | 0% | 62 days | Mon 1/22/07 | Tue 4/17/07 | -
45 | General QA consolidation | 0% | 10 days | Wed 1/31/07 | Tue 2/13/07 | -
46 | Gather feedback from PWGs, add QA tests relevant to physics topics | 0% | 2 wks | Wed 1/31/07 | Tue 2/13/07 | -
47 | Establish nightly test framework at BNL for embedding | 0% | 1 wk | Wed 1/31/07 | Tue 2/6/07 | -
48 | General improvements | 0% | 35 days | Mon 1/22/07 | Fri 3/9/07 | -
49 | Requirements study for an embedding request interface | 0% | 2 wks | Mon 1/29/07 | Fri 2/9/07 | Andrew Rose[10%], Jerome Lauret[10%]
50 | Develop new embedding request form compatible with Drupal module | 0% | 4 wks | Mon 2/12/07 | Fri 3/9/07 | Andrew Rose[10%]
51 | Test new interface, import old tasks (historical purposes) | 0% | 5 days | Mon 1/22/07 | Fri 1/26/07 | Andrew Rose[10%], Cristina Suarez[10%]
52 | Distributed computing | 0% | 20 days | Wed 2/14/07 | Tue 3/13/07 | -
53 | Use SUMS framework to submit embedding, establish first XML | 0% | 1 wk | Wed 2/14/07 | Tue 2/20/07 | Lee Barnby[10%]
54 | Test on one site | 0% | 1 wk | Wed 2/21/07 | Tue 2/27/07 | Lee Barnby[10%]
55 | Test on all sites, adjust as necessary | 0% | 2 wks | Wed 2/28/07 | Tue 3/13/07 | Cristina Suarez[10%], Andrew Rose[10%], Lee Barnby[10%]
56 | Gridfication | 0% | 25 days | Wed 3/14/07 | Tue 4/17/07 | -
57 | Test XML using Grid policy (one site) | 0% | 1 wk | Wed 3/14/07 | Tue 3/20/07 | -
58 | Establish test of data transfer method, GSI-enabled HPSS access possible | 0% | 1 wk | Wed 3/21/07 | Tue 3/27/07 | -
59 | Regression and stress test on one site | 0% | 1 wk | Wed 3/28/07 | Tue 4/3/07 | -
60 | Test on +1 site, infrastructure consolidation | 0% | 2 wks | Wed 4/4/07 | Tue 4/17/07 | -
62 | Embedding operation | 25% | 261 days? | Mon 1/1/07 | Mon 12/31/07 | -
63 | PDSF support | 50% | 261 days? | Mon 1/1/07 | Mon 12/31/07 | Andrew Rose[10%]
64 | BHAM support | 10% | 261 days? | Mon 1/1/07 | Mon 12/31/07 | Lee Barnby[10%]
65 | UIC support including QA | 15% | 261 days? | Mon 1/1/07 | Mon 12/31/07 | Olga Barranikova[5%], Cristina Suarez[10%]

Chain check, momentum issues

The plots below were the basis for the second of two follow-up discussions; the originally linked pages are access-restricted.

Olga versus Victor plots - Are they consistent?

Plots: Eloss proximity from Olga; Eloss idtruth from Olga; Victor's P diff (proximity); Victor's P diff (idtruth).

idtruth versus proximity

Plots: Victor's proximity plot; Victor's idtruth plot.

bfcMixer.C Reshape

Date: Wednesday, 18 April 2007
Time: 12:21:45
Topic: Embedding Reshape

18 April 2007 12:21:56

Talked to Yuri on Monday (16th)

He would like 3 things worked on.

1. Integration of MC generation part into bfcMixer.C

Basically, all the kumac commands can be done in the macro using gstar.

These would become part of "chain one"

Also need to read in a tag or MuDst file to find vertex to use for generating particles.

Can probably see how this works from bfc.C itself as bfc.C(1) creates particles and runs them through reconstruction.

- actually I could not; it is not visible inside bfc.C or StBFChain itself, because it is part of St_geant_Maker

2. Change Hit Mover so that it does not move hits derived from MC info (based on ID truth %age)

3. [I forgot what 3 was!]

Rough sketch of chain modifications for #1

Current bfcMixer:

(StChain) Chain
    (StBFChain) daqChain   <-- daq file
    (StBFChain) simChain   <-- fz file
                           <-- .dat file with vertex positions
    MixerMaker
    (StBFChain) recoChain

New bfcMixer:

(StChain) Chain
    (StBFChain) daqChain   <-- daq file
    (StBFChain) simChain
        |
        Geant -?- SetGeantMaker   <-- tags file
    MixerMaker
    (StBFChain) recoChain

Break down into sub-tasks:

a) Run bfcMixer.C on a daq file with an associated fz and dat file (to check that it works!)

b) Ignore the fz file and generate an MC particle (any!) on the fly

c) Reading from the tags file, generate MC particles at the desired vertex and with the desired multiplicity

d) Tidy up the parameter-specification interface (p distribution, geant ID, etc.)

Embedding Procedures

Overview




This document lists the procedures used for requesting embedding, running embedding jobs, and retrieving data. Please note that some steps require privileged accounts.


Embedding Documentation

Overview



The purpose of embedding is to provide physicists with a known reference: Monte Carlo tracks, whose kinematics and particle type are known exactly, are introduced into the realistic environment seen in the analysis - a real data event. How reconstruction performs on these "standard candle" tracks provides a baseline which can be used to correct for acceptance and efficiency effects in various analyses.

In STAR, embedding is achieved with a set of software tools - Monte Carlo simulation software, as well as the tools for real data event reconstruction - accessed through a suite of scripts. These documents attempt to provide an introduction to the scripts and their use, but the STAR collaborator is referred to the separate web pages maintained for Simulations and Reconstruction for deeper questions of the processes involved in those respective tasks.

Please note, the general procedure for embedding is:
  • The Physics Working Group Convenor submits an embedding request. The request will be assigned a request number and priority by the Analysis Coordinator; see this page for more information.
  • The Embedding Coordinator will coordinate the distribution of embedding requests. Users who wish to run their own embedding are encouraged to do so - but it must be verified with the Embedding Coordinator.
  • The Embedding Coordinator is responsible for the QA of all embedding requests. When the Embedding Coordinator has verified that the test sample produced is acceptable, full production is authorized.


For EC and ED(s) (OBSOLETE)

Embedding instructions for EC and ED(s)

    Last updated on Sept/06/2011 by Christopher Powell
 

    Current Embedding Coordinator (EC): Xianglei Zhu (zhux@rcf.rhic.bnl.gov)

    Current Embedding Deputy (ED): 

    Current PDSF Point Of Contact (POC): Jeff Porter (RJPorter@lbl.gov) 

    Revision history
  • Sept/06/2011: Update EC and ED
  • Nov/17/2010: Added some tickets for the records
  • Aug/02/2010: Update directory structure for LOG and Generator files
  • Jun/11/2010: Added approved chains by Lidia
  • May/29/2010: Modify procedures of 'Production chain options'
  • May/27/2010: Several minor fixes
  • May/25/2010: Added 'Production chain options', 'Locations of outputs etc at HPSS'
  • May/24/2010: Update trigger id options
  • May/21/2010: Update instructions for xml file, useful discussions 
 
    Contents
 
    Please send me (CBPowell@lbl.gov) an e-mail if you have any questions/suggestions.

 
A typical xml file for embedding job submission looks like this:
 
<!-- Generated by StRoot/macros/embedding/get_embedding_xml.pl on Mon Aug  2 15:26:13 PDT 2010 -->
<?xml version="1.0" encoding="utf-8"?>
<job maxFilesPerProcess="1" fileListSyntax="paths">

<command>
<!-- Load library -->
starver SL07e

<!-- Set tags file directory -->
setenv EMBEDTAGDIR /eliza3/starprod/tags/ppProductionJPsi/P06id

<!-- Set year and day from filename -->
setenv EMYEAR `StRoot/macros/embedding/getYearDayFromFile.pl -y ${FILEBASENAME}`
setenv EMDAY `StRoot/macros/embedding/getYearDayFromFile.pl -d ${FILEBASENAME}`

<!-- Set log files area -->
setenv EMLOGS /project/projectdirs/star/embedding

<!-- Set HPSS outputs/LOG path -->
setenv EMHPSS /nersc/projects/starofl/embedding/ppProductionJPsi/JPsi_&FSET;_20100601/P06id.SL07e/${EMYEAR}/${EMDAY}

<!-- Print out EMYEAR and EMDAY and EMLOGS -->
echo EMYEAR : $EMYEAR
echo EMDAY  : $EMDAY
echo EMLOGS : $EMLOGS
echo EMHPSS : $EMHPSS

<!-- Start job -->
echo 'Executing bfcMixer_TpcSvtSsd(1000, 1, 1, "$INPUTFILE0", "$EMBEDTAGDIR/${FILEBASENAME}.tags.root", 0, 6.0, -1.5, 1.5, -200, 200, 160, 1, triggers, "P08ic", "FlatPt"); ...'

root4star -b &lt;&lt;EOF
  std::vector&lt;Int_t&gt; triggers;
  triggers.push_back(117705);
  triggers.push_back(137705);
  triggers.push_back(117701);
  .L StRoot/macros/embedding/bfcMixer_TpcSvtSsd.C
  bfcMixer_TpcSvtSsd(1000, 1, 1, "$INPUTFILE0", "$EMBEDTAGDIR/${FILEBASENAME}.tags.root", 0, 6.0, -1.5, 1.5, -200, 200, 160, 1, triggers, "P08ic", "FlatPt");
  .q
EOF

ls -la .
cp $EMLOGS/P06id/LOG/$JOBID.log ${FILEBASENAME}.$JOBID.log
cp $EMLOGS/P06id/LOG/$JOBID.elog ${FILEBASENAME}.$JOBID.elog


<!-- New command to organize log files -->
mkdir -p $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;
mv $EMLOGS/P06id/LOG/$JOBID.* $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;/

<!-- Archive in HPSS -->
hsi "mkdir -p $EMHPSS; prompt; cd $EMHPSS; mput *.root; mput ${FILEBASENAME}.$JOBID.log; mput ${FILEBASENAME}.$JOBID.elog"

</command>


<!-- Define locations of log/elog files -->
<stdout URL="file:/project/projectdirs/star/embedding/P06id/LOG/$JOBID.log"/>
<stderr URL="file:/project/projectdirs/star/embedding/P06id/LOG/$JOBID.elog"/>


<!-- Input daq files -->
<input URL="file:/eliza3/starprod/daq/2006/st*"/>

<!-- csh/list files -->
<Generator>
  <Location>/project/projectdirs/star/embedding/P06id/JPsi_20100601/LIST</Location>
</Generator>

<!-- Put any locally-compiled stuffs into a sand-box -->
<SandBox installer="ZIP">
  <Package name="Localmakerlibs">
    <File>file:./.sl44_gcc346/</File>
    <File>file:./StRoot/</File>
    <File>file:./pams/</File>
  </Package>
</SandBox>

</job>
 
Below are step-by-step instructions on how to set it up for each request.
You can grab an xml file from another request and modify it manually,
or create one yourself with "StRoot/macros/embedding/get_embedding_xml.pl".
The option "-h" or "--help" shows all available options of "get_embedding_xml.pl".

 

1. Set up daq/tags files

Please contact the POC at PDSF to locate the daq/tags files if you don't find any
on the eliza disks at PDSF. The relevant lines in the xml file are
<!-- Input daq files --> 
<input URL="file:/eliza3/starprod/daq/2006/st*"/> 
for daq files and
<!-- Set tags file directory -->
setenv EMBEDTAGDIR /eliza3/starprod/tags/ppProductionJPsi/P06id
for tags files. 

You can change the locations of the daq/tags files in the xml with the "-daq [daq file path]" and "-tag [tags file path]" options:
> StRoot/macros/embedding/get_embedding_xml.pl -daq /eliza3/starprod/daq/2006  -tag /eliza3/starprod/tags/ppProductionJPsi/P06id
Note 1-1: You can put the options in any order, so the command below gives the same result as the one above:
> StRoot/macros/embedding/get_embedding_xml.pl -tag /eliza3/starprod/tags/ppProductionJPsi/P06id -daq /eliza3/starprod/daq/2006
Note 1-2: If you have already created an xml file in your current directory,
"get_embedding_xml.pl" won't overwrite it. If you want to overwrite it, add the "-f" option.
 
 

2. Running job, archive outputs into HPSS

Below is the part of the job description that runs the job (bfcMixer), saves the log files, and puts the outputs/logs into HPSS.

<!-- Start job -->
echo 'Executing bfcMixer_TpcSvtSsd(1000, 1, 1, "$INPUTFILE0", "$EMBEDTAGDIR/${FILEBASENAME}.tags.root", 0, 6.0, -1.5, 1.5, -200, 200, 160, 1, triggers, "P08ic", "FlatPt"); ...'

root4star -b &lt;&lt;EOF
  std::vector&lt;Int_t&gt; triggers;
  triggers.push_back(117705);
  triggers.push_back(137705);
  triggers.push_back(117701);
  .L StRoot/macros/embedding/bfcMixer_TpcSvtSsd.C
  bfcMixer_TpcSvtSsd(1000, 1, 1, "$INPUTFILE0", "$EMBEDTAGDIR/${FILEBASENAME}.tags.root", 0, 6.0, -1.5, 1.5, -200, 200, 160, 1, triggers, "P08ic", "FlatPt");
  .q
EOF

ls -la .
cp $EMLOGS/P06id/LOG/$JOBID.log ${FILEBASENAME}.$JOBID.log
cp $EMLOGS/P06id/LOG/$JOBID.elog ${FILEBASENAME}.$JOBID.elog


<!-- New command to organize log files -->
mkdir -p $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;
mv $EMLOGS/P06id/LOG/$JOBID.* $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;/

<!-- Archive in HPSS -->
hsi "mkdir -p $EMHPSS; prompt; cd $EMHPSS; mput *.root; mput ${FILEBASENAME}.$JOBID.log; mput ${FILEBASENAME}.$JOBID.elog"

 

 

The default bfcMixer is "StRoot/macros/embedding/bfcMixer_TpcSvtSsd.C".
The bfcMixer can be set with "-mixer [bfcMixer file]":
> StRoot/macros/embedding/get_embedding_xml.pl -mixer StRoot/macros/embedding/bfcMixer_Tpx.C 
NOTE2-1: There are three different bfcMixer macros: bfcMixer_Tpx.C, bfcMixer_TpcSvtSsd.C and bfcMixer_TpcOnly.C.
You need to choose the proper bfcMixer macro depending on the Run:

     <= Run4          : bfcMixer_TpcOnly.C

     Run5 - Run7  : bfcMixer_TpcSvtSsd.C

     >= Run8          : bfcMixer_Tpx.C
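
As a quick reference, here is a hedged helper sketch (a hypothetical script, not part of the embedding toolkit) that maps a data-taking year to the mixer, assuming the usual RHIC Run-to-year correspondence (Run 4 ending in 2004, Run 8 starting in 2008):

#!/bin/bash
# pick_mixer.sh (hypothetical): echo the bfcMixer macro for a given data-taking year
year=$1
if [ "$year" -le 2004 ]; then
  echo "bfcMixer_TpcOnly.C"      # <= Run4
elif [ "$year" -le 2007 ]; then
  echo "bfcMixer_TpcSvtSsd.C"    # Run5 - Run7
else
  echo "bfcMixer_Tpx.C"          # >= Run8
fi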

 
The library, production tag, trigger setup name and request number can be changed with the following options:
"-production [production tag]", "-lib [library]", "-r [request number]" and "-trg [trigger setup name]"
> StRoot/macros/embedding/get_embedding_xml.pl -production P06id -lib SL07e -r 20100601 -trg ppProductionJPsi 
The default library, production tag and trigger setup name are SL08c, P08ic and 2007ProductionMinBias, respectively,
unless otherwise specified. These values are used for the locations of the log files and scripts as well as for the path in HPSS, like
 
<!-- Load library -->
starver SL07e

...
...

ls -la .
cp $EMLOGS/P06id/LOG/$JOBID.log ${FILEBASENAME}.$JOBID.log
cp $EMLOGS/P06id/LOG/$JOBID.elog ${FILEBASENAME}.$JOBID.elog


<!-- New command to organize log files -->
mkdir -p $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;
mv $EMLOGS/P06id/LOG/$JOBID.* $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;/

<!-- Archive in HPSS -->
hsi "mkdir -p $EMHPSS; prompt; cd $EMHPSS; mput *.root; mput ${FILEBASENAME}.$JOBID.log; mput ${FILEBASENAME}.$JOBID.elog"

...
...

<!-- csh/list files -->
<Generator>
  <Location>/project/projectdirs/star/embedding/P06id/JPsi_20100601/LIST</Location>
</Generator>
 
 NOTE2-2: If the directory "/project/projectdirs/star/embedding/P06id/JPsi_20100601/LIST"
doesn't exist, "get_embedding_xml.pl" will complain and will not generate the xml:
    Error: No /project/projectdirs/star/embedding/P06id/JPsi_20100601/LIST exists. Stop. 
    Make sure you've put the correct path for the generator file.

Automatic creation of the "LOG" and "LIST" directories is currently not implemented
in get_embedding_xml.pl, so you have to make those directories manually (see the sketch below).
Please don't forget to make those directories group readable/writable. Please contact me (Hiroshi)
if you are not sure what to do.
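
A minimal sketch of that manual setup, using the LOG/LIST directories from the example above:

# create the LOG and LIST areas for the P06id / JPsi_20100601 example by hand
mkdir -p /project/projectdirs/star/embedding/P06id/LOG
mkdir -p /project/projectdirs/star/embedding/P06id/JPsi_20100601/LIST
# make them group readable/writable, as requested above
chmod -R g+rw /project/projectdirs/star/embedding/P06id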

 

 NOTE2-3: The LOG directory structure has been changed to make searching easier;
 the final log files are moved via
 mv $EMLOGS/P06id/LOG/$JOBID.* $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;/
where three additional directory levels have been introduced: "P06id" (production), "JPsi_20100601" (particle name and request id)
and "&FSET;" (FSET number, determined during job submission). These directories are created dynamically
during job submission, so you don't need to create them.

 

3. Arguments in the bfcMixer

You can find the relevant information for these numbers in the
STAR simulation request page, http://drupal.star.bnl.gov/STAR/starsimrequest.
 
Below is an example for the 2006 J/Psi in p + p 200 GeV request.
 
 
Below are the details of how to set each argument of the bfcMixer according to
the information on the simrequest page.
 

3-1. Particle id and particle name

 
The particle geantid can be found in "pams/sim/gstar/gstar_part.g".
There you can see the J/Psi -> e+e- decay (100% B.R.):
Particle Jpsi code=160 TrkTyp=4 mass=3.096 charge=0 tlife=7.48e-21,
pdg=443 bratio= { 1, } mode= { 203, }
"code=160" is the geantid for the J/Psi, so you can set the geantid and particle name with
"-geantid [GEANT3 id]" and "-particle [particle name]":
> StRoot/macros/embedding/get_embedding_xml.pl -geantid 160 -particle JPsi 
Please don't put "/" in the particle name: the particle name is also used for the output directory
in HPSS as well as on the eliza disks, so a "/" would be interpreted as a
directory separator.
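
If you are unsure of a particle's id, a simple way to look it up (a sketch; the grep pattern is just an example) is:

# search the particle definitions shipped with the STAR software
grep -i "jpsi" pams/sim/gstar/gstar_part.g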
 
 

3-2. Particle settings (how to simulate pt distribution)

 
The particle setting is usually "Flat in pt", generating a flat pt distribution of input MC particles,
which corresponds to the default option "FlatPt" in the last argument of the bfcMixer.
There are two other options, depending on how the input MC particles should be simulated:

"Strange" : Smear the primary vertices in all (x,y,z) directions with the vertex errors stored in the tags files
"Spectrum" : Generate the transverse momentum with a pt*exp(-pt/T) shape, where T is the inverse slope parameter
                       (default T is 0.3 GeV)
 
These options can be changed by "-mode [option]"
> StRoot/macros/embedding/get_embedding_xml.pl -mode Strange
NOTE3-1: The mode option string is case insensitive, so "Strange", "strange",  "STRANGE"
or any combination of letters can be accepted as long as it matches with "strange" string.
 
NOTE3-2: "Strange" option cannot be run in "VFMCE" chain option.
You should disable VFMCE by "-VFMCE" and put the appropriate vertex finder option back to the chain options.
 
NOTE3-3: "Spectrum" option can only be run after SL08f. We don't have any solutions at this moment except for 
doing single MC simulation and embedding, separately.
 
 

3-3. Multiplicity

 
The default multiplicity is 1, i.e. 1 input MC particle is thrown per event. You can change it with "-mult [multiplicity]":
 > StRoot/macros/embedding/get_embedding_xml.pl -mult 0.05 
If the multiplicity is less than 1, the number of input MC particles is generated
as that fraction of the real-data multiplicity.
A value of 0.05 gives 5% of the multiplicity per event, so the number of particles
varies event by event (e.g. an event with multiplicity 400 would get about 20 MC particles).
In the current example, we don't need to modify the multiplicity.
 
 

3-4. Primary z-vertex range

 
The minimum and maximum z-vertex cuts can be set with
"-zmin [minimum z-vertex value]" and "-zmax [maximum z-vertex value]"
> StRoot/macros/embedding/get_embedding_xml.pl -zmin -30.0 -zmax 30.0
if the PA requests a specific z-vertex cut. The default z-vertex cut is |vz| < 200 cm.
In the current J/Psi case, we don't need to apply any z-vertex cut.
 
 

3-5. Rapidity and transverse momentum range

 
The rapidity and transverse momentum cuts can be set with
"-ymin [minimum y value]", "-ymax [maximum y value]",
"-ptmin [minimum pt value]" and "-ptmax [maximum pt value]"
> StRoot/macros/embedding/get_embedding_xml.pl -ymin -1.0 -ymax 1.0  -ptmin 0.0  -ptmax 6.0 
NOTE3-4: The simulation request shows an eta range, but we throw the particles flat in rapidity instead of eta.
 
The default rapidity and pt cuts are |y| < 1.5 and 0 < pt < 10 GeV/c, respectively.
For the current example, we only need to change the maximum pt cutoff to 6 GeV/c.
 
 

3-6. Trigger id's

 
If the PA requests trigger id(s), you can add them using "-trigger [trigger id]":
> StRoot/macros/embedding/get_embedding_xml.pl -trigger 117705
You can add multiple triggers by repeating the "-trigger [trigger id]" option:
> StRoot/macros/embedding/get_embedding_xml.pl -trigger 117705 -trigger 137705 -trigger 117001
 
Minor NOTE3-5: (This doesn't work, please don't try it.) I tried to implement the "-trigger" argument as
> StRoot/macros/embedding/get_embedding_xml.pl -trigger 117705 137705 ...
to accept multiple numbers with a single "-trigger" option, but couldn't get it to work with the current
perl version at PDSF/RCF, though it worked with a newer perl on my MacBook (Hiroshi).
 
 

3-7. Production chain name

 
In order to use the correct chain options in the bfcMixer, you should set the production chain name:
> StRoot/macros/embedding/get_embedding_xml.pl -prodname P06idpp 
NOTE3-6: It might not be obvious which chain options are available in the bfcMixer's.
Please take a look at the bfcMixer to make sure that the relevant chain option has been implemented.
 
Including all the relevant options of get_embedding_xml.pl, the command to produce the J/Psi xml file is
> StRoot/macros/embedding/get_embedding_xml.pl -f -daq /eliza3/starprod/daq/2006 -tag /eliza3/starprod/tags/ppProductionJPsi/P06id \
-production P06id -lib SL07e -r 20100601 -trg ppProductionJPsi -geantid 160 -particle JPsi -ptmax 6.0 -trigger 117705 -trigger 137705 -trigger 117701 \
-prodname P06idpp
 
There are a few additional things we need to modify manually in the xml file to get the final form shown at the beginning:
 
 
- First, we need to set the 2nd/3rd arguments of bfcMixer_TpcSvtSsd.C to 0.
These are flags that turn on the SVT (2nd) and SSD (3rd) when set to 1. Since this example is
a request for Run6 p+p, we don't need the SVT/SSD and hence must turn them off, i.e. set them to 0 (see the sketch after this list).
 
- Second, the 100% J/Psi -> e+e- decay is already implemented in "pams/sim/gstar/gstar_part.g" and
is available in the default STAR library, so we actually don't need to check out pams and recompile.
In that case, you also need to remove "pams" from your "Localmakerlibs".
 
- Third, the PA requests 2 different triggers in the "st_jpsi" stream and one trigger in the "st_physics" stream,
so xml files should be prepared for those two different streams, with their different triggers/inputs.
 
There may be a couple of other things to do manually, depending on the request.
Please let us know your feedback in case you had to make some special modification to your xml file.
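
For the first point, a minimal sketch of the adjusted call: it is the same root4star snippet as in the job description above, only with the SVT/SSD flags (2nd/3rd arguments) set to 0.

root4star -b <<EOF
  std::vector<Int_t> triggers;
  triggers.push_back(117705);
  triggers.push_back(137705);
  triggers.push_back(117701);
  .L StRoot/macros/embedding/bfcMixer_TpcSvtSsd.C
  // 2nd/3rd arguments = 0: SVT and SSD turned off for this Run6 p+p request
  bfcMixer_TpcSvtSsd(1000, 0, 0, "$INPUTFILE0", "$EMBEDTAGDIR/${FILEBASENAME}.tags.root", 0, 6.0, -1.5, 1.5, -200, 200, 160, 1, triggers, "P08ic", "FlatPt");
  .q
EOF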
 

Production chain options

  The production chains can be found at http://www.star.bnl.gov/devcgi/dbProdOptionRetrv.pl
  Since the current embedding must use the "VFMCE" chain option, other vertex-finder chain options have to
  be excluded. The relevant information about vertex-finder chain options can be found
  
  Below is the current procedure for setting up the production chain in the bfcMixer's:
  1. ED will set up the proper bfcMixer macro with the correct chain options
  2. EC and POC at PDSF will take a look and give feedback
  3. Ask Lidia for her inputs, and verify the new chain with Lidia and Yuri
  4. Enter all chains into Drupal embedding page for documentation
  5. Commit bfcMixer into CVS
 
  Below are the approved chains implemented in the bfcMixer's at this moment (Jun/11/2010)
 
 Chains approved by Lidia

 

--------------------------------

P07ic CuCu production:    TString prodP07icAuAu("P2005b DbV20070518 MakeEvent ITTF ToF ssddat spt SsdIt SvtIt pmdRaw OGridLeak OShortR OSpaceZ2 KeepSvtHit skip1row VFMCE -VFMinuit -hitfilt");

P08ic AuAu production:    DbV20080418 B2007g ITTF adcOnly IAna KeepSvtHit VFMCE -hitfilt l3onl emcDY2 fpd ftpc trgd ZDCvtx svtIT ssdIT Corr5 -dstout

                                         If the spacecharge and gridleak corrections are averaged instead of event-by-event, then Corr5 -> Corr4, OGridLeak3D, OSpaceZ2.

P08ie dAu production :     DbV20090213 P2008 ITTF OSpaceZ2 OGridLeak3D beamLine, VFMCE TpcClu -VFMinuit -hitfilt

                                        TString chain20pt("NoInput,PrepEmbed,gen_T,geomT,sim_T,trs,-ittf,-tpc_daq,nodefault");

P06id pp production :       TString prodP06idpp("DbV20060729 pp2006b ITTF OSpaceZ2 OGridLeak3D VFMCE -VFPPVnoCTB -hitfilt");

P06ie pp production :       TString prodP06iepp("DbV20060915 pp2006b ITTF OSpaceZ2 OGridLeak3D VFMCE -VFPPVnoCTB -hitfilt"); run# 7096005-7156040

                                        TString prodP06iepp("DbV20061021 pp2006b ITTF OSpaceZ2 OGridLeak3D VFMCE -VFPPVnoCTB -hitfilt"); run# 7071001-709402

 

 
 
 
 
 
 
 
 

Locations of outputs, log files and backup of relevant code at HPSS

  The current output and log file locations in HPSS are determined by the following scheme:
/nersc/projects/starofl/embedding/${TRGSETUPNAME}/${PARTICLE}_&FSET;_${REQUESTID}/${PRODUCTION}.${LIBRARY}/${EMYEAR}/${EMDAY}
where "&FSET;" is a 3-digit number determined at job submission (e.g. 100),
"${TRGSETUPNAME}" is the trigger setup name (e.g. 2007ProductionMinBias),
"${PARTICLE}" is the input MC particle name (e.g. JPsi),
"${REQUESTID}" is the request id assigned to each embedding request,
"${PRODUCTION}" is the production tag of the real data (e.g. P07id),
"${LIBRARY}" is the library used for the embedding simulation, which is not
always the same as that of the real data production (e.g. SL08f),
"${EMYEAR}" and "${EMDAY}" are the year and day number
extracted from the input file name.
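
As an illustration, for the J/Psi example used throughout this page (trigger setup ppProductionJPsi, request 20100601, production P06id, library SL07e) and assuming hypothetical values FSET=100, year 2006, day 150, the scheme resolves to:

/nersc/projects/starofl/embedding/ppProductionJPsi/JPsi_100_20100601/P06id.SL07e/2006/150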
 
  The backup locations of the relevant source code, macros, scripts and xml files are
 
 
 (starofl home) /home/starofl/embedding/CODE/${TRGSETUPNAME}/${PARTICLE}_${REQUESTID}/${PRODUCTION}.${LIBRARY}/${EMYEAR}

(HPSS) /nersc/projects/starofl/embedding/CODE/${TRGSETUPNAME}/${PARTICLE}_${REQUESTID}/${PRODUCTION}.${LIBRARY}/${EMYEAR}
 
 
 

 

1. VFMCE chain option

Relevant discussions about VFMCE chain option can be found in the following links
 
 
  

2. Eta dip around eta ~ 0

 
Relevant discussions about eta dip problem can be found in

the bottom line from Yuri was
 
The problem was in StTrsMaker for  two bfc options "TrsPileUp" and "TrsToF"
accounting particle time of flight in producing hits.
The above options activated TPC fiducial volume cut which removed a few cm near
membrane.
The cut has been removed and committed.
   Hiroshi has confirmed that with this fix there is no dip at eta ~ 0.
               Yuri
 
 

 3. EmbeddingShortCut chain option

 
The details can be found in the following ticket in
 
The bottom line from Yuri was
 
EmbeddingShortCut means that TpcHitMover and dEdx makers
will not apply corrections for simulated data (IdTruth > 0 && IdTruth
< 10000 && QA > 0.95).
     Trs has to have  it. TrsRS should not have it.
                Yuri
 
and
 
this option has really started to be used since release SL10c.
Till this release this option was always "ON" by default.
The only need for back propagation is when you will use release >= SL10c
with Trs. This correction will be done in dev for nightly tests.
                      Yuri
 

4. Bug in StAssociationMaker

See ticket in
 

 

For EC and ED(s)

 

Embedding instructions for EC and ED(s)

    Last updated on Oct/23/2018 by Xianglei Zhu
 

    Current Embedding Coordinator (EC): Xianglei Zhu (zhux@tsinghua.edu.cn)

    Current Embedding Deputy (ED): Derek Anderson (derekwigwam9@tamu.edu)

    Current NERSC Point Of Contact (POC): Jeff Porter (RJPorter@lbl.gov) and Jan Balewski (balewski@lbl.gov)

    Revision history
  • Jun/24/2018: initial version (copied from old instructions, still under construction)
 
    Contents
 
    Please send me (zhux@rcf.rhic.bnl.gov) an e-mail if you have any questions/suggestions.


 
The typical xml file for the embedding job submission looks
 
<!-- Generated by StRoot/macros/embedding/get_embedding_xml.pl on Mon Aug  2 15:26:13 PDT 2010 -->
<?xml version="1.0" encoding="utf-8"?>
<job maxFilesPerProcess="1" fileListSyntax="paths">

<command>
<!-- Load library -->
starver SL07e

<!-- Set tags file directory -->
setenv EMBEDTAGDIR /eliza3/starprod/tags/ppProductionJPsi/P06id

<!-- Set year and day from filename -->
setenv EMYEAR `StRoot/macros/embedding/getYearDayFromFile.pl -y ${FILEBASENAME}`
setenv EMDAY `StRoot/macros/embedding/getYearDayFromFile.pl -d ${FILEBASENAME}`

<!-- Set log files area -->
setenv EMLOGS /project/projectdirs/star/embedding

<!-- Set HPSS outputs/LOG path -->
setenv EMHPSS /nersc/projects/starofl/embedding/ppProductionJPsi/JPsi_&FSET;_20100601/P06id.SL07e/${EMYEAR}/${EMDAY}

<!-- Print out EMYEAR and EMDAY and EMLOGS -->
echo EMYEAR : $EMYEAR
echo EMDAY  : $EMDAY
echo EMLOGS : $EMLOGS
echo EMHPSS : $EMHPSS

<!-- Start job -->
echo 'Executing bfcMixer_TpcSvtSsd(1000, 1, 1, "$INPUTFILE0", "$EMBEDTAGDIR/${FILEBASENAME}.tags.root", 0, 6.0, -1.5, 1.5, -200, 200, 160, 1, triggers, "P08ic", "FlatPt"); ...'

root4star -b &lt;&lt;EOF
  std::vector&lt;Int_t&gt; triggers;
  triggers.push_back(117705);
  triggers.push_back(137705);
  triggers.push_back(117701);
  .L StRoot/macros/embedding/bfcMixer_TpcSvtSsd.C
  bfcMixer_TpcSvtSsd(1000, 1, 1, "$INPUTFILE0", "$EMBEDTAGDIR/${FILEBASENAME}.tags.root", 0, 6.0, -1.5, 1.5, -200, 200, 160, 1, triggers, "P08ic", "FlatPt");
  .q
EOF

ls -la .
cp $EMLOGS/P06id/LOG/$JOBID.log ${FILEBASENAME}.$JOBID.log
cp $EMLOGS/P06id/LOG/$JOBID.elog ${FILEBASENAME}.$JOBID.elog


<!-- New command to organize log files -->
mkdir -p $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;
mv $EMLOGS/P06id/LOG/$JOBID.* $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;/

<!-- Archive in HPSS -->
hsi "mkdir -p $EMHPSS; prompt; cd $EMHPSS; mput *.root; mput ${FILEBASENAME}.$JOBID.log; mput ${FILEBASENAME}.$JOBID.elog"

</command>


<!-- Define locations of log/elog files -->
<stdout URL="file:/project/projectdirs/star/embedding/P06id/LOG/$JOBID.log"/>
<stderr URL="file:/project/projectdirs/star/embedding/P06id/LOG/$JOBID.elog"/>


<!-- Input daq files -->
<input URL="file:/eliza3/starprod/daq/2006/st*"/>

<!-- csh/list files -->
<Generator>
  <Location>/project/projectdirs/star/embedding/P06id/JPsi_20100601/LIST</Location>
</Generator>

<!-- Put any locally-compiled stuffs into a sand-box -->
<SandBox installer="ZIP">
  <Package name="Localmakerlibs">
    <File>file:./.sl44_gcc346/</File>
    <File>file:./StRoot/</File>
    <File>file:./pams/</File>
  </Package>
</SandBox>

</job>
 
Below is step by step instructions how to set it up for each request.
You can grab some xml files from other requests and modify manually 
or can also create it by yourself with "StRoot/macros/embedding/get_embedding_xml.pl".
The option "-h or --help" will show all available options in "get_embedding_xml.pl". 

 

1. Set up daq/tags files

 

Please contact POC at PDSF to locate daq/tags files if you don't find any daq/tags files
in the eliza disks at PDSF. The relevant descriptions in the xml file are
<!-- Input daq files --> 
<input URL="file:/eliza3/starprod/daq/2006/st*"/> 
for daq files and
<!-- Set tags file directory -->
setenv EMBEDTAGDIR /eliza3/starprod/tags/ppProductionJPsi/P06id
for tags files. 

You can change the locations of daq/tags files in xml by "-daq [daq file path]" and "-tag [tags file path]" options like
> StRoot/macros/embedding/get_embedding_xml.pl -daq /eliza3/starprod/daq/2006  -tag /eliza3/starprod/tags/ppProductionJPsi/P06id
Note1-1: You can put the options in any order, so the command below gives the same result as above
> StRoot/macros/embedding/get_embedding_xml.pl -tag /eliza3/starprod/tags/ppProductionJPsi/P06id -daq /eliza3/starprod/daq/2006
Note1-2: If you have already created an xml file in your current directory,
"get_embedding_xml.pl" won't overwrite the previous xml file. If you want to overwrite it, put "-f" option.
 
 

2. Running job, archive outputs into HPSS

Below is the descriptions to run the job (bfcMixer), save log files, put outputs/logs into HPSS.

<!-- Start job -->
echo 'Executing bfcMixer_TpcSvtSsd(1000, 1, 1, "$INPUTFILE0", "$EMBEDTAGDIR/${FILEBASENAME}.tags.root", 0, 6.0, -1.5, 1.5, -200, 200, 160, 1, triggers, "P08ic", "FlatPt"); ...'

root4star -b &lt;&lt;EOF
  std::vector&lt;Int_t&gt; triggers;
  triggers.push_back(117705);
  triggers.push_back(137705);
  triggers.push_back(117701);
  .L StRoot/macros/embedding/bfcMixer_TpcSvtSsd.C
  bfcMixer_TpcSvtSsd(1000, 1, 1, "$INPUTFILE0", "$EMBEDTAGDIR/${FILEBASENAME}.tags.root", 0, 6.0, -1.5, 1.5, -200, 200, 160, 1, triggers, "P08ic", "FlatPt");
  .q
EOF

ls -la .
cp $EMLOGS/P06id/LOG/$JOBID.log ${FILEBASENAME}.$JOBID.log
cp $EMLOGS/P06id/LOG/$JOBID.elog ${FILEBASENAME}.$JOBID.elog


<!-- New command to organize log files -->
mkdir -p $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;
mv $EMLOGS/P06id/LOG/$JOBID.* $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;/

<!-- Archive in HPSS -->
hsi "mkdir -p $EMHPSS; prompt; cd $EMHPSS; mput *.root; mput ${FILEBASENAME}.$JOBID.log; mput ${FILEBASENAME}.$JOBID.elog"

 

 

The default bfcMixer is "StRoot/macros/embedding/bfcMixer_TpcSvtSsd.C".
The bfcMixer can be set by "-mixer [bfcMixer file]"
> StRoot/macros/embedding/get_embedding_xml.pl -mixer StRoot/macros/embedding/bfcMixer_Tpx.C 
NOTE2-1: There are three different bfcMixer macros, bfcMixer_Tpx.CbfcMixer_TpcSvtSsd.C and bfcMixer_TpcOnly.C.
You need to choose the proper bfcMixer macro depending on the Run;

     <= Run4          : bfcMixer_TpcOnly.C

     Run5 - Run7  : bfcMixer_TpcSvtSsd.C

     >= Run8          : bfcMixer_Tpx.C

 
The library, production tag, trigger setup name and request number can be changed by using the following options;
"-production [production tag]""-lib [library]""-r [request number]" and "-trg [trigger setup name]"
> StRoot/macros/embedding/get_embedding_xml.pl -production P06id -lib SL07e -r 20100601 -trg ppProductionJPsi 
The default library, production tag, and trigger setup name are SL08cP08ic2007ProductionMinBias respectively
unless otherwise specified. These will be used for the locations of log files, scripts as well as the path in HPSS like
 
<!-- Load library -->
starver SL07e

...
...

ls -la .
cp $EMLOGS/P06id/LOG/$JOBID.log ${FILEBASENAME}.$JOBID.log
cp $EMLOGS/P06id/LOG/$JOBID.elog ${FILEBASENAME}.$JOBID.elog


<!-- New command to organize log files -->
mkdir -p $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;
mv $EMLOGS/P06id/LOG/$JOBID.* $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;/

<!-- Archive in HPSS -->
hsi "mkdir -p $EMHPSS; prompt; cd $EMHPSS; mput *.root; mput ${FILEBASENAME}.$JOBID.log; mput ${FILEBASENAME}.$JOBID.elog"

...
...

<!-- csh/list files -->
<Generator>
  <Location>/project/projectdirs/star/embedding/P06id/JPsi_20100601/LIST</Location>
</Generator>
 
 NOTE2-2: If the directories "/project/projectdirs/star/embedding/P06id/JPsi_20100601/LIST"
doesn't exist, "get_embedding_xml.pl" will complain and doesn't generate xml like
    Error: No /project/projectdirs/star/embedding/P06id/JPsi_20100601/LIST exists. Stop. 
    Make sure you've put the correct path for generator file.    

 

Currently, I didn't implement to automatically create "LOG" and "LIST" directories
in get_embedding_xml.pl. So you have to make those directories manually.
Please don't forget to make those directories group read/writable. Please contact me 
if you are not clear what you should do (Hiroshi).  

 

 NOTE2-3: LOG directory structure has been changed to make them search easier,
 and final log files will be moved to
 mv $EMLOGS/P06id/LOG/$JOBID.* $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;/
where we have introduced three additional directory "P06id" (production)"JPsi_20100601" (particle name and request id)
and "&FSET" (FSET number, which is determined during the job submission). This directory will be dynamically created 
during the job submission, so you don't need to create it.

 

3. Arguments in the bfcMixer

You can find relevant informations for these numbers in the 
STAR simulation request page, http://drupal.star.bnl.gov/STAR/starsimrequest.
 
Below is an example for the 2006 J/Psi in p + p 200 GeV request
 
 
Below is the details how to set each argument in the bfcMixer according to 
the informations in the simrequest page
 

3-1. Particle id and particle name

 
The particle geantid can be found in "pams/sim/gstar/gstar_part.g".
You can see the J/Psi -> e+e- decay (100% B.R.) 
Particle Jpsi code=160 TrkTyp=4 mass=3.096 charge=0 tlife=7.48e-21,
pdg=443 bratio= { 1, } mode= { 203, }
"code=160" is the geantid for J/Psi so you can set the geantid and particle name by
"-geantid [GEANT3 id]" and "-particle [particle name]" 
> StRoot/macros/embedding/get_embedding_xml.pl -geantid 160 -particle JPsi 
Please don't put "/" in the particle name. The particle name is also used for the directory
in HPSS as well as in eliza disk to store the outputs. So "/" would be recognized as 
directory.
 
 

3-2. Particle settings (how to simulate pt distribution)

 
Particle settings is usually "Flat in pt", generating the flat pt distribution of input MC particles,
which corresponds to the default option "FlatPt" in the last argument of bfcMixer.
There are two other options depending on how to simulate the input MC particles

"Strange" : Smear primary vertices in all (x,y,z) directions with vertex errors stored in tags files
"Spectrum" : Generate transverse momentum by pt*exp(-pt/T) shape, where T is inverse slope parameter
                       Default T is 0.3 GeV
 
These options can be changed by "-mode [option]"
> StRoot/macros/embedding/get_embedding_xml.pl -mode Strange
NOTE3-1: The mode option string is case insensitive, so "Strange""strange",  "STRANGE"
or any combination of letters can be accepted as long as it matches with "strange" string.
 
NOTE3-2: "Strange" option cannot be run in "VFMCE" chain option.
You should disable VFMCE by "-VFMCE" and put the appropriate vertex finder option back to the chain options.
 
NOTE3-3: "Spectrum" option can only be run after SL08f. We don't have any solutions at this moment except for 
doing single MC simulation and embedding, separately.
 
 

3-3. Multiplicity

 
The default multiplicity is 1, i.e. throw 1 input MC particle per event. You can change it by "-mult [multiplicity]"
 > StRoot/macros/embedding/get_embedding_xml.pl -mult 0.05 
if the multiplicity is less than 1, the input MC particle will be generated
by the fraction of multiplicity from real data.
The 0.05 gives 5% of multiplicity per event, so the number of particles 
will be varied event-by-event depending on the multiplicity.
In the current example, we don't need to modify the multiplicity.
 
 

3-4. Primary z-vertex range

 
The minimum and maximum z-vertex cuts can be set by
"-zmin [Minimum z-vertex value]" and "-zmax [Maximum z-vertex value]"
> StRoot/macros/embedding/get_embedding_xml.pl -zmin -30.0 -zmax 30.0
if the PA requests a specific z-vertex cut. The default z-vertex cut is |vz| < 200 cm.
In the current J/Psi case, we don't need to put in any z-vertex cut.
 
 

3-5. Rapidity and transverse momentum range

 
The rapidity and transverse momentum cuts can be set by
"-ymin [Minimum y value]", "-ymax [Maximum y value]",
"-ptmin [Minimum pt value]" and "-ptmax [Maximum pt value]"
> StRoot/macros/embedding/get_embedding_xml.pl -ymin -1.0 -ymax 1.0  -ptmin 0.0  -ptmax 6.0 
NOTE3-4: The simulation request shows the eta range, but we'll throw particles in rapidity instead of eta.
 
The default rapidity and pt cuts are |y| < 1.5 and 0 < pt < 10 GeV/c, respectively.
For the current example, we only need to change the maximum pt cutoff to 6 GeV/c.
 
 

3-6. Trigger id's

 
If the PA requests trigger id(s), you can put them in by using "-trigger [trigger id]"
> StRoot/macros/embedding/get_embedding_xml.pl -trigger 117705
You can add multiple triggers by adding more "-trigger [trigger id]" options, like
> StRoot/macros/embedding/get_embedding_xml.pl -trigger 117705 -trigger 137705 -trigger 117001
 
minor NOTE3-5: (This doesn't work, please don't try) I had tried to implement the "-trigger" argument like
> StRoot/macros/embedding/get_embedding_xml.pl -trigger 117705 137705 ...
to accept multiple numbers with a single "-trigger" option, but couldn't succeed with the current
perl version at PDSF/RCF, though it worked with a newer perl on my MacBook (Hiroshi).
 
 

3-7. Production chain name

 
In order to use the correct chain option in the bfcMixer, you should set the production chain name like
StRoot/macros/embedding/get_embedding_xml.pl -prodname P06idpp 
NOTE3-6: It might not be obvious what kind of chain options are available in the bfcMixers.
                Please take a look at the bfcMixer to make sure that the relevant chain option has been implemented.
 
Including all the relevant options in get_embedding_xml.pl, the command to produce the J/Psi xml file is
> StRoot/macros/embedding/get_embedding_xml.pl -f -daq /eliza3/starprod/daq/2006 -tag /eliza3/starprod/tags/ppProductionJPsi/P06id \
-production P06id -lib SL07e -r 20100601 -trg ppProductionJPsi -geantid 160 -particle JPsi -ptmax 6.0 -trigger 117705 -trigger 137705 -trigger 117701 -prodname P06idpp
 
There are a few additional things we need to modify manually in the xml file to get the final form shown at the beginning:
 
 
- First, we also need to set 0 for the 2nd/3rd arguments in the bfcMixer_TpcSvtSsd.C.
Those are flags to turn on the SVT (2nd) and SSD (3rd) if they are 1. Since this example is
a request for Run6 p+p, we don't need SVT/SSD and hence must turn them off, i.e. set them to 0.
 
- Second, the 100% J/Psi -> e+e- decay has already been implemented in "pams/sim/gstar/gstar_part.g" and
is available in the default STAR library, so we actually don't need to check out pams and recompile.
In that case, you also need to remove "pams" from your "Localmakerlibs".
 
- Third, the PA requests 2 different triggers in the "st_jpsi" stream and one trigger in the "st_physics" stream,
so xml files should be prepared for those two different streams with different triggers/inputs.
 
There may be a couple of other things we have to do manually, which will depend on each request.
Please give us your feedback in case you had to make some special modification to your xml file.
 

Production chain options

  The production chain can be found in http://www.star.bnl.gov/devcgi/dbProdOptionRetrv.pl
  Since the current embedding must use the "VFMCE" chain option, other vertex finder chain options have to
  be excluded. The relevant information about vertex finder chain options can be found
  in the "VFMCE chain option" section below.
  Below is the current procedure for setting up the production chain in the bfcMixers:
  1. The ED will set up the proper bfcMixer macro with the correct chain options
  2. The EC and the POC at PDSF will take a look and give feedback
  3. Ask Lidia for her inputs, and verify the new chain with Lidia and Yuri
  4. Enter all chains into the Drupal embedding page for documentation
  5. Commit the bfcMixer into CVS
 
  Below are the approved chains implemented in the bfcMixer at this moment (Jun/11/2010).
 
 Chains approved by Lidia

 

--------------------------------

P07ic CuCu production:    TString prodP07icAuAu("P2005b DbV20070518 MakeEvent ITTF ToF ssddat spt SsdIt SvtIt pmdRaw OGridLeak OShortR OSpaceZ2 KeepSvtHit skip1row VFMCE -VFMinuit -hitfilt");

P08ic AuAu production:    DbV20080418 B2007g ITTF adcOnly IAna KeepSvtHit VFMCE -hitfilt l3onl emcDY2 fpd ftpc trgd ZDCvtx svtIT ssdIT Corr5 -dstout

                                         If spacecharge and gridleak corrections are on average instead of event by event then Corr5-> Corr4, OGridLeak3D, OSpaceZ2.

P08ie dAu production :     DbV20090213 P2008 ITTF OSpaceZ2 OGridLeak3D beamLine,  VFMCE TpcClu -VFMinuit -hitfilt

                                        TString chain20pt("NoInput,PrepEmbed,gen_T,geomT,sim_T,trs,-ittf,-tpc_daq,nodefault");

P06id pp production :       TString prodP06idpp("DbV20060729 pp2006b ITTF OSpaceZ2 OGridLeak3D VFMCE -VFPPVnoCTB -hitfilt");

P06ie pp production :       TString prodP06iepp("DbV20060915 pp2006b ITTF OSpaceZ2 OGridLeak3D VFMCE -VFPPVnoCTB -hitfilt"); run# 7096005-7156040

                                        TString prodP06iepp("DbV20061021 pp2006b ITTF OSpaceZ2 OGridLeak3D VFMCE -VFPPVnoCTB -hitfilt"); run# 7071001-709402

 

 
 
 
 
 
 
 
 

Locations of outputs, log files and back up of relevant codes at HPSS

  The current output as well as log file locations at HPSS are determined by the following scheme:
/nersc/projects/starofl/embedding/${TRGSETUPNAME}/${PARTICLE}_&FSET;_${REQUESTID}/${PRODUCTION}.${LIBRARY}/${EMYEAR}/${EMDAY}
where "&FSET;" is a 3 digit number determined at job submission (e.g. 100),
"${TRGSETUPNAME}" is the trigger setup name (e.g. 2007ProductionMinBias),
"${PARTICLE}" is the input MC particle name (e.g. JPsi),
"${REQUESTID}" is the request id assigned to each embedding request,
"${PRODUCTION}" is the production tag for the real data (e.g. P07id),
"${LIBRARY}" is the library used for the embedding simulation, which is not
always matched with that of the real data production (e.g. SL08f),
"${EMYEAR}" and "${EMDAY}" are the year and day number
extracted from the input file.
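 
Putting the example values above together (the day number 123 is hypothetical here, since ${EMYEAR}/${EMDAY} come from the input file), a concrete output directory would look like:
/nersc/projects/starofl/embedding/2007ProductionMinBias/JPsi_100_20100601/P07id.SL08f/2007/123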
 
  The back up locations of relevant source codes, macros, scripts and xml files are
 
 
 (starofl home) /home/starofl/embedding/CODE/${TRGSETUPNAME}/${PARTICLE}_${REQUESTID}/${PRODUCTION}.${LIBRARY}/${EMYEAR}

(HPSS) /nersc/projects/starofl/embedding/CODE/${TRGSETUPNAME}/${PARTICLE}_${REQUESTID}/${PRODUCTION}.${LIBRARY}/${EMYEAR}
 
 
 

 

1. VFMCE chain option

Relevant discussions about VFMCE chain option can be found in the following links
 
 
  

2. Eta dip around eta ~ 0

 
Relevant discussions about the eta dip problem can be found in the following links.

The bottom line from Yuri was:
 
The problem was in StTrsMaker for  two bfc options "TrsPileUp" and "TrsToF"
accounting particle time of flight in producing hits.
The above options activated TPC fiducial volume cut which removed a few cm near
membrane.
The cut has been removed and committed.
   Hiroshi has confirmed that with this fix there is no dip at eta ~ 0.
               Yuri
 
 

 3. EmbeddingShortCut chain option

 
The details can be found in the following ticket.
 
The bottom line from Yuri was
 
EmbeddingShortCut means that TpcHitMover and dEdx makers
will not apply corrections for simulated data (IdTruth > 0 && IdTruth
< 10000 && QA > 0.95).
     Trs has to have  it. TrsRS should not have it.
                Yuri
 
and
 
this option has really started to be used since release SL10c.
Till this release this option was always "ON" by default.
The only need for back propagation is when you will use release >= SL10c
with Trs. This correction will be done in dev for nightly tests.
                      Yuri
 

4. Bug in StAssociationMaker

See the following ticket.
 

 

For Embedding Helpers (OBSOLETE)

Embedding instructions for Embedding Helpers

  These instructions describe, for Embedding Helpers, how to prepare and submit the embedding jobs at PDSF.

  You can also find the documentation about the embedding QA in the "Links" section.
 

  NOTE: These are PDSF-specific instructions; some procedures may not work at RCF.

 

    Current Embedding Coordinator (EC): Terence Tarnowsky (tarnowsk@nscl.msu.edu)

    Current PDSF Point Of Contact (POC): Jeff Porter (RJPorter@lbl.gov)

 
Last updated on Nov/8/2011 by T. Tarnowsky
Revision history (most recent first)
  • Nov/8/2011: Updated EC and ED information to bring it in line with other webpages.
  • Apr/12/2011: Update Section 3.1, remove obsolete procedure for StPrepEmbedMaker
  • Feb/17/2011: Update Section 3.2
  • Nov/17/2010: Update to get afs token to access afs area
  • Sep/08/2010: Update memory limit in the batch at PDSF (Section 1)
  • Aug/25/2010: Update Section 3.2
  • May/25/2010: Update Section 3 (include klog, library and NOTE about VFMCE)
  • May/21/2010: Added "How to setup your environment at PDSF ?" and "Links"
  • May/20/2010: Updated instructions

 


  Contents


1. How to set up your environment at PDSF?

Please go to the PDSF-wiki page to set up your environment at PDSF,
and subscribe/ask questions in the "PDSF at STAR" hypernews forum (pdsf-hn@www.star.bnl.gov)
in case you have any trouble.
 
Please also don't forget to subscribe to the "Embedding issues and discussions" hypernews.
 
Important note:

Please have a look at the "common issue: memory limit in batch"

and follow the procedure for increasing the memory limit in the batch jobs.

All EHs should make this change before submitting any embedding production jobs.

 


 

 


2. Set up daq and tags files at PDSF

Please contact POC at PDSF to locate proper daq and tags files at eliza disks at PDSF.




3. Production area setup

3-0. Get afs token to access CVS
 
Please don't forget this procedure (How to klog to my RHIC afs account ?) before using cvs.

Below is a copy, from the link above, of what you need to do in order to get an afs token to access CVS:

PDSF does not belong to the rhic.bnl.gov afs cell (the default cell is nersc),
so you have to specify the rhic.bnl.gov cell explicitly.
Additionally, your PDSF username may be different from the one on RACF.
If so, you need to specify your afs account name explicitly as well.
> klog -cell rhic -principal YourRCFUserName

 

3-1. Check out relevant macros/scripts and Makers from CVS
> cvs co StRoot/macros/embedding
> cvs co StRoot/St_geant_Maker
 
"StRoot/macros/embedding" should contain all relevant macros/scripts for the embedding productions. "StRoot/St_geant_Maker" area has "StPrepEmbedMaker" which is the main code to process the embedding.
 
****** THIS PROCEDURE IS OBSOLETE, SO YOU DON'T NEED TO FOLLOW IT ANYMORE ******
 

 

NOTE: Under the current environment at PDSF, you will not be able to compile St_geant_Maker by cons. So you need to move "StRoot/St_geant_Maker/Embed/StPrepEmbedMaker.*" one directory up, then compile by cons:

 

 

> mv StRoot/St_geant_Maker/Embed/StPrepEmbedMaker* StRoot/St_geant_Maker/
> starver ${library}
> cons
****************************************************************************************************************
 
Please don't forget to set the proper library "${library}" before compiling the source codes.
In the following sections, we assume you have already done "starver ${library}".
 
The requested library can be found in the simulation request page.
If you are not sure which library should be used for the request, please contact the EC or ED.
 
NOTE: Any productions prior to P07ie should be simulated under SL07e due to the lack of the "VFMCE"
chain option in older libraries, even if the PAs request a specific library (like SL06d).
 
 
3-2. Set up input MC particle
 
The input MC particle is specified by its geantid (see the GEANT3 manual around page number 61 for the pre-defined particles).
Take a look at "$STAR/pams/sim/gstar/gstar_part.g"; if you cannot find the geantid for the requested particle,
then you need to check out "pams/sim/gstar" from CVS and compile:
 
> cvs co pams/sim/gstar 
> cons
 
Note that you need to check out the whole gstar directory in order to properly load the gstar library.
The particle you need to simulate must be in either "$STAR/pams/sim/gstar/gstar_part.g" or
the local "pams/sim/gstar/gstar_part.g" code. Please also take a look at the details of the particle description.
Please contact the EC or ED if you don't find the requested particles in "pams/sim/gstar/gstar_part.g".
 
3-3. Set up bfcMixer macro


Please ask the EC or ED whether the bfcMixer (either bfcMixer_TpcSvtSsd.C or bfcMixer_Tpx.C) is ready for submission, and confirm which bfcMixer should be used for the current request.

 
 
3-4. Set up xml file for job submission
 
Please contact EC or ED to obtain a proper xml file for your job submission.
 
3-5. Other code modifications

For other code modifications needed, please check the libraries installed locally under the star embedding production account (starofl), and contact POC at PDSF for assistance.
 
 

4. Submit jobs


4-1. Local sandbox
 
Locally compiled codes/libraries are archived as "Localmakerlibs.zip" and used for each job submission.
The default local sandbox should be

<!-- Put any locally-compiled stuff into a sandbox -->
<SandBox installer="ZIP">
  <Package name="Localmakerlibs">
    <File>file:./.sl44_gcc346/</File>
    <File>file:./StRoot/</File>
    <File>file:./pams/</File>
  </Package>
</SandBox>


in your xml file. If you have anything other than the above, please include it.

Please contact the EC or ED if you are not sure which codes you need to include.

 

 

4-2. Submit jobs by scheduler

 
Once everything is prepared, you can submit jobs like
> star-submit-template -template embed_template.xml -entities FSET=200
where "embed_template.xml" is the xml file, and FSET is the unique 3 digit number
for a given request (typically starting from 100 or 101).

The embedding templates at PDSF are set up to write the output files to HPSS at the end of each job.
When all jobs in one FSET are finished, the outputs need to be retrieved from HPSS.
Please contact the POC at PDSF to find out which eliza disk can be used for your outputs.

The xml file "schedTemplateExp.xml", which will be automatically generated once you submit jobs by scheduler,
should be kept for each FSET job under the "setup" directory.
And please don't submit the same FSET jobs twice; that would overwrite your previous outputs in HPSS.
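 
For example, several FSETs can be submitted in a row with a small shell loop (a sketch, assuming a csh-family shell; the FSET range 100-102 is hypothetical, and each FSET must be submitted only once):
foreach FSET (100 101 102)
  star-submit-template -template embed_template.xml -entities FSET=$FSET
end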
 

4-3. Re-submitting jobs

 

Sometimes you may need to modify something under "StRoot" or "pams" and recompile to fix problems.

Each time you recompile your local codes, you should clean up the current "Localmakerlibs.zip" and

"Localmakerlibs.package/" before starting the resubmission. If you forget to clean up the older "Localmakerlibs",

the modifications in the local codes will not be reflected in the resubmitted jobs.
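 
A minimal sketch of that cycle, assuming the zip file and package directory sit in your job submission directory:
> cons                           # recompile your modified local codes
> rm -f Localmakerlibs.zip       # remove the stale archive ...
> rm -rf Localmakerlibs.package  # ... and the stale package directory
> star-submit-template -template embed_template.xml -entities FSET=200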

 


5. Back up macros, scripts, xml file and codes

Please contact the EC or ED about backing up all relevant macros, scripts and codes in the local starofl area as well as in HPSS at PDSF. Please also remember to clean up all the LOG and LIST files that have been generated by the scheduler, after the PWG QA is done by the PA.
 

 


6. Links

 

Obsolete documentations

Embedding Production Setup

This page describes how to set up embedding production. This procedure needs to be followed for any set of daq files/production version that requires embedding. Since this typically involves hacking the reconstruction chain, it is not advised that the typical STAR PA attempt this step. Please coordinate with a local embedding coordinator, and the overall Embedding Coordinator (Olga).
Note: The documentation here is very terse; it will be enriched as the documentation as a whole is iterated on. Patience is appreciated.

Get daq files from RCF.

Grab a set of daq files from RCF which cover the lifetime of the run, the luminosity range experienced, and the conditions for the production.

Rerun standard production but without corrections.

bfc.C macros are located under ~starofl/bfc. Edit the submit.[Production] script to point to the daq files loaded (as above).

Put tags files on disk.

The results of the previous jobs will be .tags.root files located on HPSS. Retrieve the files, set a pointer for the tags files in the Production-specific directory under ~starofl/embedding.


Now you're ready to start production.

Embedding Setup Off-site

Introduction

The purpose of this document is to describe step-by-step the setting up of the embedding infrastructure on a remote site, i.e. not at its current home, which is PDSF. It is based on the experience of setting up embedding at Birmingham's NP cluster (Bham). I will try to maintain a distinction between steps which are necessary in general and those which were specific to porting things to Bham. It should also be a useful guide for those wanting to run embedding at PDSF and needing to copy the relevant files into a suitable directory structure.

Pre-requisites

Before trying to set up embedding on a remote site you should have:
  • a working local installation of the STAR library in which you are interested (or be satisfied with your AFS-based library performance).
  • a working mirror of the star database (or be satisfied with your connection to the BNL hosted db).
If these two things are working correctly you will be able to process a daq file with the usual bfc.C macro. Check that you can do this and do not proceed further if this is not the case as you will be wasting your time. You can find the correct bfc.C options to use with a particular daq file and software release combination here.

Collect scripts

The scripts are currently housed at PDSF in the 'starofl' account area. At the time of writing (and the time at which I set up embedding in Bham) they are not archived in CVS. The suggested way to collect them is to copy them into a directory in your own PDSF home account then tar and export it for installation on your local cluster. The top directory for embedding is /u/starofl/embedding . Under this directory there are several subdirectories of interest.
  • Those named after each production, e.g. P06ib which contain mixer macro and perl scripts
  • Common which contains further subdirectories lists and csh and a submission perl script
  • GSTAR which contains the kumac for running the simulation
Therefore you need to create a replica of this directory tree. From your home directory e.g. /u/user do
mkdir embedding
cd embedding
mkdir Common
mkdir Common/lists
mkdir Common/csh
mkdir GSTAR
mkdir P06ib
mkdir P06ib/setup

Now it needs populating with the relevant files. In the following, /u/user/embedding is used as an example of your new embedding directory in your user home directory.

cd /u/user/embedding
cp /u/starofl/embedding/getVerticesFromTags_v4.C .
cp -R /u/starofl/embedding/P06ib/EmbeddingLib_v4_noFTPC/ P06ib/
cp /u/starofl/embedding/P06ib/Embedding_sge_noFTPC.pl P06ib/
cp /u/starofl/embedding/P06ib/bfcMixer_v4_noFTPC.C P06ib/
cp /u/starofl/embedding/P06ib/submit.starofl.pl P06ib/submit.user.pl
cp /u/starofl/embedding/P06ib/setup/Piminus_101_spectra.setup P06ib/setup/
cp /u/starofl/embedding/GSTAR/phasespace_P06ib_revfullfield.kumac GSTAR/
cp /u/starofl/embedding/GSTAR/phasespace_P06ib_fullfield.kumac GSTAR/
cp /u/starofl/embedding/Common/submit_sge.pl Common/


You now have all the files needed to run embedding. There are further links to make, but as you are going to export the files to your own cluster you need to make the links afterwards.

Alternatively, you can run embedding on PDSF from your home directory. There are a number of changes to make first though, because the various perl scripts have some paths relating to the starofl account inside them.

For those planning to export to a remote site you should tar and/or scp the data. I would recommend tar so that you can have the original package preserved in case something goes wrong. E.g.

tar -cvf embedding.tar embedding/
scp embedding.tar remoteuser@mycluster.blah.blah:/home/remoteuser

Obviously this step is unnecessary if you intend to run from your PDSF account although you may still want to create a tar file so that you can undo any changes which are wrong.

Login to your remote cluster and extract the archive. E.g
cd /home/remoteuser
tar -xvf embedding.tar

Script changes

The most obvious things you will find are a number of places inside the perl scripts where the path or location of other scripts appears in the code. These must be changed accordingly.

P06ib/Embedding_sge_noFTPC.pl
  1. changes to e.g.
     
  2. changes to e.g.
  3. changes to e.g.
  4. changes to e.g.
P06ib/EmbeddingLib_v4_noFTPC/Process_object.pm
  1. changes to e.g.
  2. changes to e.g.

    This is because the location of tcsh was different and probably will be for you too.
Common/submit_sge.pl
  1. changes to e.g.

    Change relates to parsing the name of the directory with daq files in to extract the 'data vault' and 'magnetic field' which form part of job name and are used by Embedding_sge_noFTPC.pl (This may not make much sense right now and needs the detailed docs on each component. It is actually just a way to pass a file list with the same basename as the job). In the original script the path to the data is something like /dante3/starprod/daq/2005/cuProductionMinBias/FullField whereas on Bham cluster it is /star/data1/daq/2005/cuProductionMinBias/FullField and thus the pattern match in perl has to change in order to extract the same information. If you have a choice then choose your directory names with care!
  2. changes to e.g.

    Change relates to the line printing the job submission shell script that this perl script writes and submits. The first line had to be changed such that it can correctly be identified as a sh script. I am not sure how the original can ever have worked.
  3. changes to e.g.

    This line prints part of the job submission script where the options for the job are specified. In SGE the job options can be in the file and not just on the command line. The extra options for Bham relate to our SGE setup. The -q option provides the name of the queue to use, otherwise it uses the default, which I did not want in this case. The other extra options are to make the environment and working directory correct as they were not the default for us. This is very specific to each cluster. If your cluster does not have SGE then I imagine extensive changes to the part writing the job submission script would be necessary. The scripts use the ability of SGE to have job arrays of similar jobs so you would have to emulate that somehow.

No significant changes required for:
  • getVerticesFromTags_v4.C - none
  • GSTAR/phasespace_P06ib_fullfield.kumac, GSTAR/phasespace_P06ib_fullfield.kumac - actually there are changes but they only relate to redefining particle decay modes for (anti-)Ξ and (anti-)Ω to go 100% to charged modes of interest. This is only relevant for strangeness group
  • P06ib/bfcMixer_v4_noFTPC.C - checked carefully that the chain3->SetFlags line actually sets the same flags, since Andrew and I had to change the same flags (e.g. add the GeantOut option) after I made the original copy
  • P06ib/EmbeddingLib_v4_noFTPC/Chain_object.pm - none
  • P06ib/EmbeddingLib_v4_noFTPC/EmbeddingUtilities.pm - there are lines where you may have to add the run numbers of the daq files which you are using so that they are recognised as either full field or reversed full field. In this example (Cu+Cu embedding in P06ib) the lines begin
    and
    . This is also something that Andrew and I both changed after I made the original copy.
  • P06ib/submit.user.pl - changes here relate to setup that you want to run and not to the cluster or directory you are using i.e. which setup file to use, what daq directories to use and any pattern match on the file names (usually for testing purposes to avoid filling the cluster with useless jobs) although you probably want to change the
    line!
  • P06ib/setup/Piminus_101_spectra.setup - any changes here relate to the simulation parameters of the job that you want to do and not to the cluster or directory you are using

Create links

A number of links are required. For example in the /u/starofl/embedding/P06ib there are the following links:
  • daq_dir_2005_cuPMBFF -> /dante3/starprod/daq/2005/cuProductionMinBias/FullField
  • daq_dir_2005_cuPMBRFF -> /dante3/starprod/daq/2005/cuProductionMinBias/ReversedFullField
  • daq_dir_2005_cuPMBHTFF -> /eliza5/starprod/daq/2005/cucuProductionHT/FullField/
  • daq_dir_2005_cuPMBHTRFF -> /eliza5/starprod/daq/2005/cucuProductionHT/ReversedFullField
  • tags_dir_cu_2005 -> /dante3/starprod/tags/P06ib/2005
  • tags_dir_cuHT_2005 -> /eliza5/starprod/embedding/tags/P06ib
  • data -> /eliza12/starprod/embedding/data
  • lists ->../Common/lists
  • csh-> ../Common/csh
  • LOG-> ../Common/LOG
You will therefore need similar links to where you store your daq files (and associated tags files) and where you want the output data to go.
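 
A sketch of how such links could be created on your own cluster (the targets reuse the Bham-style paths from above; adapt them to your own storage layout, and create ../Common/LOG first if it does not exist yet):
cd /home/remoteuser/embedding/P06ib
ln -s /star/data1/daq/2005/cuProductionMinBias/FullField daq_dir_2005_cuPMBFF
ln -s ../Common/lists lists
ln -s ../Common/csh csh
ln -s ../Common/LOG LOG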

That is it! Some things will probably need to be adapted to your circumstances, but this should give you a good idea of what to do.

Author: Lee Barnby, University of Birmingham (using starembed account)


Modified: A. Rose, Lawrence Berkeley National Laboratory (using starembed account)


Modified Birmingham Files

Upload of modified embedding infrastructure files used on Birmingham NP cluster for Cu+Cu for (anti-)Λ and K0S embedding request.

Production Management

1) Usually embedding jobs are run in "HPSS" mode so the files end up in HPSS (via FTP). To transfer them from HPSS to disk, copy the perl script ~starofl/hjort/getEmbed.pl and modify it as needed. This script does at least two things that are not possible with, e.g., a command line hsi command: it only gets the files needed (usually the .geant and .event files) and it changes the permissions after the transfers. Note that if you do the transfers shortly after running the jobs, the files will probably still be on the HPSS disk cache and the transfers will be much faster than getting the files from tape.

2) To clean up old embedding files make your own copy of ~starofl/hjort/embedAge.pl and use as needed. Note that $accThresh determines the maximum access time in days of files that will not be deleted.

Running Embedding

This page describes how to run embedding jobs once the daq files and tags files are in place (see other page about embedding production setup).

Basics:

Embedding code is located in production specific directories: ~starofl/embedding/P0xxx. The basic job submission template is typically called submit.starofl.pl in that directory.
Jobs are usually run by user starofl, but personal accounts with group starprod membership will work too (but test first, as the group starprod write permissions typically are not in place by default).
The script to submit a set of jobs is submit.[user].pl. The script should be modified to submit an embedding set from the configuration file
~starofl/embedding/[Production]/setup/[Particle]_[set]_[ID].setup
where
[Particle] is the particle type submitted (Piminus for GEANTID=9, as set inside file)
[set] is the file set submitted (more on this later)
[ID] is the embedding request number


Test procedure:

The best way to test a particular job configuration is to run a single job in "DISK" mode (by selecting a specific daq file in your submission). In this mode all of the intermediate files, scripts, logs, etc., are saved on disk. The location will be under the "data" link in the working directory. You can then go and figure out which script failed, hack as necessary and try to make things work...

Details:


QA Documentation

New embedding Base QA instructions

  Instructions on how to run the QA codes and make the QA plots (pdf file), for Embedding Helpers
 

    Current Embedding Coordinator (EC): Xianglei Zhu (zhux@tsinghua.edu.cn)

    Current Embedding Deputy (ED): Derek Anderson (derekwigwam9@tamu.edu)

    Current NERSC Point Of Contact (POC): Jeff Porter (RJPorter@lbl.gov) and Jan Balewski (balewski@lbl.gov)

 
 
Last updated on Apr/19/2013 by Patrick Huck

 
Revision history
  • Jan/9/2015: Update trigger selection on real data histogram making
  • Apr/19/2013: update responsible persons and add note about Omega mis-labeling issue
  • Apr/12/2011: Update NOTE in Section1
  • Feb/17/2011: Update NOTE in Section1, and for checking out StAssociationMaker
  • Jan/27/2011: Update the Section 3 for some special decay daughters
  • Oct/25/2010: Added minimc reproduction in Section 1 and StAssociationMaker in Section 
  • Jul/23/2010: Update the real data QA
  • May/21/2010: Minor style change, added "contents" and "Links" for "Embedding instructions for Embedding Helpers"
  • May/14/2010: Update rapidity/trigger selections for real data
  • Apr/07/2010: Update the instructions on "drawEmbeddingQA.C"
  • Feb/22/2010: Added how to change the z-vertex cut and about 'isEmbeddingOnly' flag in "drawEmbeddingQA.C" 
  • Jan/29/2010: Update the NOTE for "doEmbeddingQAMaker" (see below)
  • Jan/22/2010: Update the documentation for the latest QA code
  • Sep/21/2009: Add how to plot the QA histograms in the Section 3
  • Sep/18/2009:  Modify codes to "StEmbeddingQA*", and macro to "doEmbeddingQAMaker.C" (Please have a look at the instructions below carefully).
  • Sep/17/2009:  Modify Section 2.1
  • Sep/15/2009:  Add new embedding QA instructions
Please e-mail the coordinator/deputies or the embedding hypernews if you have any questions or suggestions.
 

Contents

IMPORTANT NOTE: 
- Please make sure that you've used the latest codes/macros.
  If "cvs update StRoot/StEmbeddingUtilities" and
  "cvs update StRoot/macros/embedding" return nothing,
  you have the latest codes in your working directory.
 
  If you run the QA by old "doEmbeddingQAMaker.C", you will
  miss some new QA histograms and the "drawEmbeddingQA.C" may crash.
 
  Sometimes I forget to send an e-mail about updates of the QA codes,
  so please do "cvs update" and try again if you encounter some unknown issues.
  If this doesn't solve the problem, please contact me (Hiroshi).
 
- Check out StRoot/StEmbeddingUtilities for the embedding QA
 
- QA codes are supposed to work in DEV, so they might crash in other libraries 
 
- If the MC geantid >= 50, it is most likely a particle that is
  specifically defined for the STAR simulation.
  If this is the case, please check out "StRoot/StarClassLibrary"
  into your local working directory and recompile.  

1. Produce minimc files 

 

NOTE1: From Run9 on, you don't need to reproduce the minimc files.
Please use the minimc files produced together with the embedding production for the base QA.

 

NOTE1-1: Sometimes you might need to reproduce the minimc even if the embedding is done for Run9
or above. Please contact the production helpers or deputies if you are not sure whether you need to reproduce
the minimc files.
 
NOTE2: If the embedding production is done (1) with a library prior to SL10j and
(2) for particles with a user defined geantid that is larger than 65535/2 = 32767,
the geantid in the minimc will be some negative value or completely different from
what you expect. In that case, please check out the latest "StRoot/StMiniMcEvent"
and "StRoot/StMiniMcMaker" from CVS and recompile:
 
 
> cvs co StRoot/StMiniMcEvent 
> cvs co StRoot/StMiniMcMaker
> cons
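 
Why the id turns negative: the 65535/2 = 32767 limit suggests the id was stored in a signed 16-bit field in the old minimc classes. A minimal standalone illustration of that wrap-around (this is not STAR code):
// A user-defined geantid above 32767 wraps around when stored in a signed 16-bit field
#include <iostream>
int main() {
  unsigned int geantid = 40000;     // user-defined id > 32767
  short stored = (short) geantid;   // old signed 16-bit storage
  std::cout << stored << std::endl; // prints -25536, not 40000
  return 0;
}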
 
 
NOTE3: Prior to the SL10j library, StTrack doesn't have "mSeedQuality", so cons will complain
when you compile StRoot/StMiniMcMaker, like:

 

.sl53_gcc432/obj/StRoot/StMiniMcMaker/StMiniMcMaker.cxx: In member function 'void StMiniMcMaker::fillRcTrackInfo(StTinyRcTrack*, const StTrack*, const StTrack*, Int_t)':

.sl53_gcc432/obj/StRoot/StMiniMcMaker/StMiniMcMaker.cxx:1622: error: 'const class StTrack' has no member named 'seedQuality'
 
Please comment out line 1622 and recompile if you see this error.
 
 

1.1 Check out the relevant codes and macros in your working directory (suppose your working directory is ${work})

 > cp /eliza8/rnc/hmasui/embedding/QA/StMiniHijing.C ${work}

   - If eliza8 is down, you can also copy the macro from the link below
   - The current "StMiniHijing.C" has been slightly modified from the original version
      to obtain a proper minimc filename based on the input geant.root file; see lines 159-162 in
"StMiniHijing.C"
159 TString filename = MainFile; 
160 // int fileBeginIndex = filename.Index(filePrefix,0); 
161 // filename.Remove(0,fileBeginIndex); 
162 filename.Remove(0, filename.Last('/')+1); 

   - You don't need to modify the argument "filePrefix".
 
  You also need to check out "StAssociationMaker" and compile it in order to fix the problem
  affecting all East TPC hits. See details in the following bug report.
 
> cvs co StRoot/StAssociationMaker
> cons 
  NOTE: If you are doing the QA under SL10j (or above), SL10c_emb, or SL10h_emb,
  you don't need to check out StAssociationMaker.
  Below is a copy of the latest StAssociationMaker.cxx.
 

 

1.2 Run "StMiniHijing.C"

Suppose your input file is
 
"/eliza9/starprod/embedding/P08ie/dAu/Piplus_201_1233091546/
Piplus_st_physics_adc_9020060_raw_2060010_201/st_physics_adc_9020060_raw_2060010.geant.root"

   Either
> root4star -b -q StMiniHijing.C'(1000, "/eliza9/starprod/embedding/P08ie/dAu/Piplus_201_1233091546/Piplus_st_physics_adc_9020060_raw_2060010_201/st_physics_adc_9020060_raw_2060010.geant.root", "./")' 

   or

> root4star -b
[0] .L StMiniHijing.C
[1] StMiniHijing(1000, "/eliza9/starprod/embedding/P08ie/dAu/Piplus_201_1233091546/Piplus_st_physics_adc_9020060_raw_2060010_201/st_physics_adc_9020060_raw_2060010.geant.root", "./");
 ....
 ....
 ....
 [2].q

   - The 1st argument is maximum number of events.
   - The 2nd argument is your input geant.root file.
   - The 3rd argument is your output directory. For example, if you would like to put the output into
      "./output" directory, you can set the 3rd argument as "./output/".

  
    NOTE: You must separate the outputs from different groups into different directories,
              such as PiPlus_201_1233091546, PiPlus_202_1233091546, etc.
              Since the filenames of the input geant.root's are usually identical among different groups,
              you will overwrite your minimc.root if you put the minimc outputs under the same directory.

 
 

1.3 Make sure MC geantid is correct

 

  This step is not always necessary, but it may save you time if something went wrong with the geantid.
  Before running the QA code, you can easily check whether the MC geant id is correct or not.
  Suppose we have one minimc file, "st_physics_adc_9020060_raw_2060010.minimc.root", in the current directory; then you can do

 

> root4star st_physics_adc_9020060_raw_2060010.minimc.root
[0] StMiniMcTree->Draw("mMcTracks.mGeantId") 
 
or
  
 [0] StMiniMcTree->Scan("mMcTracks.mGeantId") 
 [1].q
 
 
  Please make sure that the geant id is identical to that of the requested particle.
  If you don't know the geantid for the requested particles, please take a look at the
  manual of GEANT3, or ask other Embedding Helpers/the Deputy/the Coordinator.
  For unstable (decay) particles, we mostly use some special geantid which was
  defined for the STAR simulation only. In that case, the geantid will be
  different from the original one in the GEANT manual.
 

 

2. Run QA codes

 2.1 Check out the QA codes from CVS

 

    NOTE: For the macro, please use "doEmbeddingQAMaker.C" instead of "doMiniMcAnalyzer.C".
              doMiniMcAnalyzer.C will be deleted from CVS sometime in the future.

 

   QA macro

   > cvs checkout StRoot/macros/embedding/doEmbeddingQAMaker.C 

 

   QA codes

     > cvs checkout StRoot/StEmbeddingUtilities

 

2.2 Run the QA macro

2.2.1 QA for embedding outputs 

  Suppose you've prepared the minimc.root filelist "minimc.list"
   and the output filename for your QA histograms is
"embedding.root"
 
   NOTE: you don't need to specify your input particle name (or geantid).
             The QA code will automatically find out the input geant id from the minimc files.

 

   Either

   >  root4star -b -q doEmbeddingQAMaker.C'(2008, "P08ie", "minimc.list", "embedding.root")' 

   or

 > root4star -b 
 [0] .L doEmbeddingQAMaker.C 
 [1] doEmbeddingQAMaker(2008, "P08ie", "minimc.list", "embedding.root"); 
 ...
 ...
 ...
 [2] .q

 

   The details of arguments can be found in the "doEmbeddingQAMaker.C"

  

  NOTE:
   - The output file name (4th argument, in this case "embedding.root") will be
     automatically determined according to the "year", "production"
     and "particle name" if you leave it blank.
 
   The default z-vertex cut is 30 cm. If you would like to change the z-vertex cut,
    you can modify the 6th argument like
 
 
> root4star -b
[0] .L doEmbeddingQAMaker.C
[1] doEmbeddingQAMaker(2008, "P08ie", "minimc.list", "", kTRUE, 60.0);
...
[2] .q


where the 5th argument is the switch to analyze embedding (kTRUE) or real data (kFALSE).

 

 

2.2.2 QA for real data

You also need to run the "doEmbeddingQAMaker.C" to obtain the same histograms from the real data by

 

> root4star -b
[0] .L doEmbeddingQAMaker.C
[1] doEmbeddingQAMaker(2008, "P08ie", "mudst.list", "", kFALSE);
...
[2] .q
where mudst.list is the file list for the muDst.
 
  NOTE: You must use muDst from the original production, not from the embedding production,
  because the muDst's in the embedding production contain both real data tracks and embedded tracks.
  Please contact the POC for the location of the muDst's from the original production at PDSF.
  

2.2.3  Trigger selections and rapidity cuts for real data

  In order to compare the real data with the embedding outputs under the same event and track conditions,
  you may need to apply the primary z-vertex, trigger and rapidity cuts to the real data. Please find the
  z-vertex cut selection in Section 2.2.1.
 

    - The trigger id can also be selected by

StEmbeddingQAUtilities::addTriggerIdCut(const UInt_t id)

     StEmbeddingQAUtilities accepts multiple trigger id's, while the current code assumes one trigger id per event.

     The trigger id cut only affects the real data, not the embedding outputs.

 

     - You can also apply a rapidity cut by

  StEmbeddingQAUtilities::setRapidityCut(const Double_t ycut)

     It would be good to have the same rapidity cut in the real data as in the embedding production.

     Please have a look at the simulation request page for the rapidity cut, or ask the embedding helpers

     what rapidity cuts they used for the productions. A short sketch combining both cuts is shown below.
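 
A short sketch combining both cuts (this assumes StEmbeddingQAUtilities is accessed through a singleton; the instance() accessor shown here is an assumption, so please check the header in StRoot/StEmbeddingUtilities if it differs):
// Apply trigger id and rapidity cuts before running the real-data QA
StEmbeddingQAUtilities* utility = StEmbeddingQAUtilities::instance(); // accessor name assumed
utility->addTriggerIdCut(117705);  // example trigger id; use the PA's requested ids
utility->addTriggerIdCut(137705);  // multiple trigger id's can be added
utility->setRapidityCut(1.0);      // match the |y| cut used in the embedding production
// ... then run doEmbeddingQAMaker(2008, "P08ie", "mudst.list", "", kFALSE) as in Section 2.2.2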

 


 

3. Draw your QA results, compare with the real data
 

  You can make the QA plots by "drawEmbeddingQA.C" under "StRoot/macros/embedding".

  > cvs checkout StRoot/macros/embedding/drawEmbeddingQA.C

 

  Suppose you have output ROOT files in your current directory: "qa_embedding_2007_P08ic.root",
  and "qa_real_2007_P08ic.root" (the output file format is automatically
  determined like this if you leave the output filename blank in StEmbeddingQA).

 

> root4star -l drawEmbeddingQA.C'("./", "qa_embedding_2007_P08ic.root", "qa_real_2007_P08ic.root", 8, 10.0)' 

 
  The first argument is the directory where the output PDF file is written.
  The default output directory is the current directory.

  The fourth argument is the geantid that you want to look at. Actually, the code will try to find out
  the geantid from the output histograms, including the decay daughters, so it doesn't matter if you put in the wrong geantid.
 
 
  NOTE: I recommend producing the output ROOT files for the QA histograms as described above,
            i.e. producing outputs without specifying the output filename in StEmbeddingQA.
            Otherwise you have to put in the year, production and particle name by hand, as you'll see below (Hiroshi)
 
 
  The fifth argument is the maximum pt cutoff (GeV/c) for all histograms.
  The default pt cutoff is 5 GeV/c. If the requested maximum pt is different from 5, then you need to
  put another value, in this case 10 GeV/c, in this argument.
 

  You can now check the QA histograms from embedding outputs only by

    > root4star -l drawEmbeddingQA.C'("./", "qa_embedding_2007_P08ic.root", "qa_real_2007_P08ic.root", 8, 10.0, kTRUE)' 

  where the last argument 'isEmbeddingOnly' (default is kFALSE) is the switch

  to draw the QA histograms for embedding outputs only if it is true. 


  If you name the output ROOT files by hand, you need to put in the year and
  production yourself, since those are used
  for the output figure name and to print that information in
  the legend of each QA plot.
  Suppose you have output ROOT files in your current directory: "qa_embedding.root" and
"qa_real.root"

 
    > root4star -l drawEmbeddingQA.C'("./", "qa_embedding.root", "qa_real.root", 2005, "P08ic", 8, 10.0, kFALSE)' 
  where there are additional arguments "2005" and "P08ic" in the 5th and 6th positions.
  The arguments are basically the same as in the previous example, except for the input filenames.
  NOTE:
    - In this case, you have to put in all arguments by hand (no default arguments in the function)
    - You may notice that the order of the arguments is slightly different from the example above.
      This is fine, since we define two functions "drawEmbeddingQA" with the same
      function name but different arguments in "drawEmbeddingQA.C" (you can check the macro directly to see how it works)
   
  If you are going to do the base QA for the daughter particles produced by one of the daughters of the input MC particle,
  for example "D* -> D0 -> pi K", we have to specify the geantid for the D0 in the macro like
 
> root4star -l drawEmbeddingQA.C'("./", "qa_embedding_2007_P08ic.root", "qa_real_2007_P08ic.root", 8, 10, kFALSE, 37)'
  where the last argument (37) is the D0 geantid. The relevant function to set the parent geant id can be
found in drawEmbeddingQA.C:

 

  maker->setParentGeantId(parentGeantId) ;
 
NOTE: Omegas are labelled as Anti-Omegas and vice versa. This is a known issue due to
the wrong Geant Ids for those two particles in STAR libraries SL13b and earlier. Future updates
of these libraries might fix it. For now, it has not been considered necessary to update all
embedding libraries to fix this issue. Please take note of this in all presentations regarding
(Anti-)Omega embedding.

 
 

4. Links

 

End of New embedding QA instructions


 

 

 

------------------------------------------------------------------------------------
This document is intended to describe the macros used during the quality assurance (Q/A) studies. This page is being updated today, April 19 2009.

 
* Macro : scan_embed_mc.C

After finding the location of the minimc.root files, use this macro to generate an output file with extension .root, in which all the histograms for a basic QA have been filled. New histograms have been added, for instance a 3D histogram for Dca (pt, eta, dca), which gives the distribution of dca as a function of pt and eta simultaneously. The same is done for the number of fit points (pt, eta, nfit). Also, histograms to evaluate the randomness of the embedding files have been added to this macro.

 

* Macros: scan_embed_mudst.C

Hopefully you won't have to use this macro unless it is requested. It is meant to generate an output root file with distributions coming from the MuDst (MuDst from Lidia) for a particular production. You will need just the location of the output file.


* Macro : plot_embed.C


This macro will take both outputs (the one coming from the minimc and the one coming from the MuDst) and plot all the basic QA distributions for a particular production.

Hit Level

At the hit level, this is the documentation:

* StEmbedHitsMaker.C

* doEmbedHits.C

* ScanHits.C

* PlotHits.C

See below the links for these macros

Scan-Hits.C

Plot_Hits.C

Plot_Dca.C

Plot_Nfit.C

Plot_embed

<code>

//First run scan_embed.C to generate root file with all the histograms
// V. May 31 2007 - Cristina

#ifndef __CINT__
#include "TROOT.h"
#include "TStyle.h"
#include "TSystem.h"
#include <iostream>
using namespace std;
#include "TH1.h"
#include "TH2.h"
#include "TH3.h"
#include "TFile.h"
#include "TTree.h"
#include "TChain.h"
#include "TTreeHelper.h"
#include "TText.h"
#include "TLatex.h"
#include "TAttLine.h"
#include "TCanvas.h"
#endif

void plot_embed(Int_t id=9) {

  gROOT->LoadMacro("~/macros/Utility.C"); //location of Utility.C


gStyle->SetOptStat(1);
//gStyle->SetOptTitle(0);   
gStyle->SetOptDate(0);
gStyle->SetOptFit(0);
gStyle->SetPalette(1);

 float mass2;

    if (id == 8)  { TString tag = "Piplus";  mass2 = 0.019;}
    if (id == 9)  { TString tag = "Piminus"; mass2 = 0.019;}
    if (id == 11) { TString tag = "Kplus";   mass2 = 0.245;}
    if (id == 12) { TString tag = "Kminus";  mass2 = 0.245;}
    if (id == 14) { TString tag = "Proton";  mass2 = 0.880;}
    if (id == 15) { TString tag = "Pbar";    mass2 = 0.880;}
    if (id == 50) { TString tag = "Phi";     mass2 = 1.020;}
    if (id == 2)  { TString tag = "Eplus";   mass2 = 0.511;}
    if (id == 1)  { TString tag = "Dmeson";  mass2 = 1.864;}


char  text1[80];
sprintf(text1,"P05_CuCu200_01_02_08");//this is going to show in all the histograms
char title[100];
char gif[100];
TString prod = "P05_CuCu200_01_02_08";

 
int nch1 = 0;
int nch2 = 1000;

/////////////////////////////////////////////////
//Cloning Histograms
/////////////////////////////////////////////////


 f1 = new TFile ("~/data/P05_CuCu200_010208.root");

        TH3D *hDca1   = (TH3D*)hDca   -> Clone("hDca1");//DCA
    TH3D *hNfit1  = (TH3D*)hNfit  -> Clone("hNFit1");//Nfit
    TH2D *hPtM_E1 = (TH2D*)hPtM_E -> Clone("hPtM_E1");//Energy Loss


    TH2D *dedx1  = (TH2D*)dedx  -> Clone("dedx1");
    TH2D *dedxG1 = (TH2D*)dedxG -> Clone("dedxG1");

    TH2D *vxy1 = (TH2D*)vxy  -> Clone("vxy1");
    TH1D *vz1  = (TH1D*)vz   -> Clone("vz1");
    TH1D *dvx1 = (TH1D*)dvx  -> Clone("dvx1");
    TH1D *dvy1 = (TH1D*)dvy  -> Clone("dvy1");
    TH1D *dvz1 = (TH1D*)dvz  -> Clone("dvz1");


    TH1D *PhiMc1  = (TH1D*)PhiMc  -> Clone("PhiMc1");
    TH1D *EtaMc1  = (TH1D*)EtaMc  -> Clone("EtaMc1");
    TH1D *PtMc1   = (TH1D*)PtMc   -> Clone("PtMc1");

    TH1D *PhiM1   = (TH1D*)PhiM   -> Clone("PhiM1");
    TH1D *EtaM1   = (TH1D*)EtaM   -> Clone("EtaM1");
    TH1D *PtM1    = (TH1D*)PtM    -> Clone("PtM1");

    TH2D *PtM_eff1 = (TH2D*)hPtM_eff ->Clone("PtM_eff1");//efficiency

//if you have MuDst hist

        TH3D *hDca1r   = (TH3D*)hDcaR   -> Clone("hDca1r");
        TH3D *hNfit1r  = (TH3D*)hNfitR  -> Clone("hNFit1r");
        TH2D *dedx1R = (TH2D*)dedxR  -> Clone("dedx1R");

/*
//use the following if you need to compare

    f2 = new TFile ("~/data/test_P07ib_pi_10percent_10_03_07.root");

    TH2D *PtM_eff2 = (TH2D*)hPtM_eff ->Clone("PtM_eff2");
    TH3D *hNfit2  = (TH3D*)hNfit  -> Clone("hNFit2");//Nfit
    TH1D *PtM2    = (TH1D*)PtM    -> Clone("PtM2");
    TH1D *PtMc2    = (TH1D*)PtMc  -> Clone("PtMc2");
   
   
    f3 = new TFile ("~/data/test_P07ib_pi_10percent_10_12_07.root");

    TH3D *hNfit3  = (TH3D*)hNfit  -> Clone("hNFit3");//Nfit
    TH2D *PtM_eff3 = (TH2D*)hPtM_eff ->Clone("PtM_eff3");
    TH1D *PtM3    = (TH1D*)PtM    -> Clone("PtM3");
    TH1D *PtMc3    = (TH1D*)PtMc    -> Clone("PtMc3");



*/


    // nch1 and nch2 are already declared above
    Double_t pt[4]= {0.3,0.4, 0.5, 0.6};

    ////////////////////////////////////////////////////////////
    //efficiency
    /////////////////////////////////////////////////////////////
    /*
    TCanvas *c10= new TCanvas("c10","Efficiency",500, 500);
   
    c10->SetGridx(0);
    c10->SetGridy(0);
    c10->SetLeftMargin(0.15);
    c10->SetRightMargin(0.05);
    //c10->SetTitleOffSet(0.1, "Y");

    c10->cd;

    PtM1->Rebin(2);
    PtMc1->Rebin(2);
   
    PtM1->Divide(PtMc1);
    PtM1->SetLineColor(1);
    PtM1->SetMarkerStyle(23);
    PtM1->SetMarkerColor(1);
    PtM1->Draw();
    PtM1->SetXTitle ("pT (GeV/c)");
    PtM1->SetAxisRange(0.0, 6.0, "X");
   
    return;
    */
   
    /*
    PtM2->Rebin(2);
    PtMc2->Rebin(2);
   
    PtM2->Divide(PtMc2);
    PtM2->SetLineColor(9);
    PtM2->SetMarkerStyle(21);
    PtM2->SetMarkerColor(9);
    PtM2->Draw("same");
   
    PtM3->Rebin(2);
    PtMc3->Rebin(2);
   
    PtM3->Divide(PtMc2);
    PtM3->SetLineColor(2);
    PtM3->SetMarkerStyle(22);
    PtM3->SetMarkerColor(2);
    PtM3->Draw("same");


     keySymbol(0.08, 1.0,  text1, 1, 23, 0.04);
    */


    /////////////////////////////////////////////////////////////
    //Vertex position
    //////////////////////////////////////////////////////////
   
   
    TCanvas *c6= new TCanvas("c6","Vertex position",600, 400);
    c6->Divide(2,1);

    c6_1->cd();
    vz1->Rebin(2);
    vz1->SetXTitle("Vertex Z");
    vz1->Draw();
   
    c6_2->cd();
    vxy1->Draw("colz");
    vxy1->SetAxisRange(-1.5, 1.5, "X");
    vxy1->SetAxisRange(-1.5, 1.5, "Y");

   
    vxy1->SetXTitle ("vertex X");
    vxy1->SetYTitle ("vertex Y");
    keyLine(.2, 1.05,text1,1);
   

    c6->Update();


   
    /////////////////////////////////////////////////////////////////////
    //Dedx
    ////////////////////////////////////////////////////////////////////

   
    TCanvas *c8= new TCanvas("c8","dEdx vs P",500, 500);
   
     c8->SetGridx(0);
    c8->SetGridy(0);
    c8->SetLeftMargin(0.15);
    c8->SetRightMargin(0.05);
    c8->cd();

              
    dedxG1->SetXTitle("Momentum P (GeV/c)");
    dedxG1->SetYTitle("dE/dx");
    dedxG1->SetAxisRange(0, 5., "X");
    dedxG1->SetAxisRange(0, 8., "Y");
    dedxG1->SetMarkerColor(1);
    dedxG1->Draw();//"colz");

    dedx1->SetMarkerStyle(7);
    dedx1->SetMarkerSize(0.3);
    dedx1->SetMarkerColor(2);
    dedx1->Draw("same");
   

    keyLine(.3, 0.87,"Embedded Tracks",2);
    keyLine(.3, 0.82,"Ghost Tracks",1);
    keyLine(.2, 1.05,text1,1);


    c8->Update();

   
    /////////////////////////////////////////////////////
    //MIPS (just for pions)
    /////////////////////////////////////////////////////


      if (id==8 || id==9)
      {

      TCanvas *c9= new TCanvas("c9","MIPS",500, 500);
   
      c9->SetGridx(0);
      c9->SetGridy(0);
      c9->SetLeftMargin(0.15);
      c9->SetRightMargin(0.05);
      c9->cd();

    double pt1 = 0.4;
    double pt2 = 0.6;

    dedxG1 -> ProjectionX("rpx");
   
    int blG = rpx->FindBin(pt1);
    int bhG = rpx->FindBin(pt2);

    cout<<blG<<endl;
    cout<<bhG<<endl;

    dedxG1->ProjectionY("rpy",blG,bhG);
    rpy->SetTitle("MIPS");
    rpy->SetMarkerStyle(22);
    //   rpy->SetMarkerColor(2);
    rpy->SetAxisRange(1.3, 4, "X");
   
    //dedxG1->Draw();

    dedx1->ProjectionX("mpx");

    int blm = mpx->FindBin(pt1);
    int bhm = mpx->FindBin(pt2);

    cout<<blm<<endl;
    cout<<bhm<<endl;

    dedx1->ProjectionY("mpy", blm,bhm);
   
    mpy->SetAxisRange(0.5, 6, "X");
    mpy->SetMarkerStyle(22);
    mpy->SetMarkerColor(2);

    float max_rpy = rpy->GetMaximum();
           max_rpy /= 1.*mpy->GetMaximum();
    mpy->Scale(max_rpy);

    cout<<"max_rpy is: "<<max_rpy<<endl;
    cout<<"mpy is: "<<mpy<<endl;

    rpy->Sumw2();
    mpy->Sumw2();

    rpy->Fit("gaus","","",1,4);
    mpy->Fit("gaus","","", 1, 4);
    mpy->GetFunction("gaus")->SetLineColor(2);
   
    rpy->SetAxisRange(0.5 ,6.0, "X");
    mpy->Draw();
    rpy->Draw("same");

    float mipMc = mpy->GetFunction("gaus")->GetParameter(1);//mean
    float mipGhost  = rpy->GetFunction("gaus")->GetParameter(1);
   
     float sigmaMc = mpy->GetFunction("gaus")->GetParameter(2);//mean
        float sigmaGhost  = rpy->GetFunction("gaus")->GetParameter(2);

    char  label1[80];
    char  label2[80];
    char  label3[80];
        char  label4[80];


    sprintf(label1,"mip MC %.3f",mipMc);
    sprintf(label2,"mip Ghost %.3f",mipGhost);
    sprintf(label3,"sigma MC %.3f",sigmaMc);
        sprintf(label4,"sigma Ghost %.3f",sigmaGhost);
   
    keySymbol(.5, .9, label1,2,1);
        keySymbol(.5, .85, label3,2,1);

        keySymbol(.5, .75, label2,1,1);
        keySymbol(.5, .70, label4,1,1);

    keyLine(.2, 1.05,text1,1);
   
    char name[30];
        sprintf(name,"%.2f GeV/c < Pt < %.2f GeV/c",pt1, pt2);
        keySymbol(0.3, 0.65, name, 1, 1, 0.04);

       
    c9->Update();

    }//close if pion

       
    /////////////////////////////////////////////////////////////////////////
    //Energy loss
    //////////////////////////////////////////////////////////////////////////

    TCanvas *c7= new TCanvas("c7","Energy Loss",400, 400);
   
    c7->SetGridx(0);
    c7->SetGridy(0);
    c7->SetLeftMargin(0.20);
    c7->SetRightMargin(0.05);
    c7->cd();

    hPtM_E->ProfileX("pfx");
    pfx->SetAxisRange(-0.01, 0.01, "Y");
    pfx->SetAxisRange(0, 6, "X");
    pfx->GetYaxis()->SetDecimals();
    pfx->SetMarkerStyle(23);
    pfx->SetMarkerSize(0.038);
    pfx->SetMarkerColor(4);
    pfx->SetLineColor(4);
    pfx->SetXTitle ("Pt-Reco");
    pfx->SetYTitle ("ptM - PtMc");
    pfx->SetTitleOffset(2,"Y");

    pfx->Draw();

    /*hPtM_E1->ProfileX("pfx1");
    pfx1->SetAxisRange(-0.007, 0.007, "Y");
    pfx1->GetYaxis()->SetDecimals();
    pfx1->SetLineColor(2);
    pfx1->SetMarkerStyle(21);
    pfx1->SetMarkerSize(0.035);
    pfx1->SetXTitle ("Pt-Reco");
    pfx1->SetYTitle ("ptM - PtMc");
    pfx1->SetTitleOffset(2,"Y");

    pfx1->Draw("same");
    */
    c7->Update();

    //////////////////////////////////////////////////////
    //pt
    //////////////////////////////////////////////////////

    TCanvas *c2= new TCanvas("c2","pt",500, 500);
   
    c2->SetGridx(0);
    c2->SetGridy(0);
    c2->SetTitle(0);

    c2->cd();


    //embedded

     PtMc1->Rebin(2);
     PtMc1->SetLineColor(2);
     PtMc1->SetMarkerStyle(20);
     PtMc1->SetMarkerColor(2);
     PtMc1->Draw();
     
     PtMc1->SetXTitle ("pT (GeV/c)");
     PtMc1->SetAxisRange(0.0, 6.0, "X");
 
     //Reco

     PtM1->Rebin(2);
     PtM1->SetMarkerStyle(20);
     PtM1->SetMarkerColor(1);
     PtM1->Draw("same");
     
   
     keySymbol(.2, 1.05,text1,1);
     keyLine(.3, 0.20,"Embedded-McTracks",2);
     keyLine(.3, 0.16,"Matched Pairs",1);
     // keyLine(.4, 0.82,"Previous Embedding",4);

     c2->Update();
     
   
     //////////////////////////////////////////////////////////////////
     //phi
     /////////////////////////////////////////////////////////////////
   
   
    TCanvas *c5= new TCanvas("c5","pt",500, 500);
    //c5->Divide(2,1);
    c5->SetGridx(0);
    c5->SetGridy(0);
    c5->SetTitle(0);

    c5->cd();


    //embedded

     PhiMc1->Rebin(2);
     PhiMc1->SetLineColor(2);
     PhiMc1->SetMarkerStyle(20);
     PhiMc1->SetMarkerColor(2);
     PhiMc1->Draw();
     PhiMc1->SetXTitle ("Phi");
     PhiMc1->SetAxisRange(-4, 4.0, "X");
 
     //Reco

     PhiM1->Rebin(2);
     PhiM1->SetMarkerStyle(20);
     PhiM1->SetMarkerColor(1);
     PhiM1->Draw("same");
     
     //Previous

     // PhiM ->Rebin(2);
     //  PtM->SetLineColor(4);
     // PtM->SetMarkerColor(4);
     //PtM->Draw("same");
     
     TLatex l;
     l.DrawLatex(7.0, 450.0, prod);

     keySymbol(.2, 1.05,text1,1);
     keyLine(.3, 0.20,"Embedded-McTracks",2);
     keyLine(.3, 0.16,"Reco - Matched Pairs",1);
     // keyLine(.4, 0.82,"Previous Embedding",4);

     c2->Update();
   
     c5->Update();

     /////////////////////////////////////
     //eta
     ///////////////////////////////////////////////////////////////

    TCanvas *c2= new TCanvas("c2","Eta",500, 500);
   
    c2->SetGridx(0);
    c2->SetGridy(0);
    c2->SetTitle(0);

    c2->cd();


    //embedded

     EtaMc1->Rebin(2);
     EtaMc1->SetLineColor(2);
     EtaMc1->SetMarkerStyle(20);
     EtaMc1->SetMarkerColor(2);
     EtaMc1->Draw();
     EtaMc1->SetXTitle ("Eta");
     EtaMc1->SetAxisRange(-1.2, 1.2, "X");
 
     //Reco

     EtaM1->Rebin(2);
     EtaM1->SetMarkerStyle(20);
     EtaM1->SetMarkerColor(1);
     EtaM1->Draw("same");
     
   
     TLatex l2;
     l2.DrawLatex(7.0, 450.0, prod);

     keySymbol(.2, 1.05,text1,1);
     keyLine(.3, 0.20,"Embedded-McTracks",2);
     keyLine(.3, 0.16,"Reco - Matched Pairs",1);
     // keyLine(.4, 0.82,"Previous Embedding",4);

     c3->Update();

   
     /////////////////////////////////////////////////////////////
     //DCA
     ////////////////////////////////////////////////////////////


    TCanvas *c= new TCanvas("c","DCA",800, 400);
    c->Divide(3,1);
    c->SetGridx(0);
    c->SetGridy(0);

    //Matched  (Bins for Multiplicity)
   
    TH1D *hX1 =  (TH1D*)hDca1->Project3D("X");

    Int_t bin_nch1 = hX1->FindBin(nch1);
    Int_t bin_nch2 = hX1->FindBin(nch2);//this should be the same for both graphs (for 3 graphs)

    //Bins for Pt
   
    TString name1 = "hDca1";
           TString namer1 = "hDcar1";

    TString name = "hDca";


    TH1D *hY1 =  (TH1D*)hDca1->Project3D("Y");
    TH1D *hY1_r = (TH1D*)hDca1r->Project3D("Y");

    TH1D *hY = (TH1D*)hDca->Project3D("Y");
   
    Double_t Sum1_MC;
    Double_t Sum1_Real;

    Double_t Sum_MC;

    for(Int_t i=0; i<3 ; i++)
   
      {
        c->cd(i+1);
        Int_t  bin_ptl_1 = hY1->FindBin(pt[i]);
        Int_t  bin_pth_1 = hY1->FindBin(pt[i+1]);


        TH1D *hDcaNew1= (TH1D*)hDca1->ProjectionZ(name1+i,bin_nch1, bin_nch2, bin_ptl_1, bin_pth_1);
        Sum1_MC =  hDcaNew1 ->GetSum();
        cout<<Sum1_MC<<endl;
     
        hDcaNew1->Scale(1./Sum1_MC);
        hDcaNew1 ->SetLineColor(2);
        hDcaNew1->Draw();
       
        hDcaNew1->SetXTitle("Dca (cm)");
             
        sprintf(title," %.2f GeV < pT < %.2f GeV, %d < nch < %d", pt[i],pt[i+1],nch1,nch2);
        hDcaNew1->SetTitle(title);
       
        //----Now MuDSt
       
        Int_t  bin_ptrl_1r = hY1_r->FindBin(pt[i]);
        Int_t  bin_ptrh_1r = hY1_r->FindBin(pt[i+1]);

        TH1D *hDca_r1= (TH1D*)hDca1r->ProjectionZ(namer1+Form("%d",i), bin_nch1, bin_nch2, bin_ptrl_1r, bin_ptrh_1r);
        Sum1_Real =  hDca_r1 ->GetSum();
        cout<<Sum1_Real<<endl;
        hDca_r1->Scale(1./Sum1_Real);
        hDca_r1->Draw("same");

        //Now Previous Embedding

        Int_t  bin_ptl = hY->FindBin(pt[i]);
        Int_t  bin_pth = hY->FindBin(pt[i+1]);


        TH1D *hDcaNew = (TH1D*)hDca->ProjectionZ(name+Form("%d",i), bin_nch1, bin_nch2, bin_ptl, bin_pth);
        Sum_MC =  hDcaNew ->GetSum();
        cout<<Sum_MC<<endl;
     
        hDcaNew->Scale(1./Sum_MC);
        hDcaNew ->SetLineColor(4);
        hDcaNew->Draw("same");
       
        keySymbol(.4, .95,text1,1,1);
        keyLine(0.4, 0.90,"MC- Matched Pairs",2);
        keyLine(0.4, 0.85,"MuDst",1);
        keyLine(0.4, 0.80,"Previous Embedding P06ib",4);
      }


    c->Update();
   

    ///////////////////////////////////////////////////
    //NFIT
    ////////////////////////////////////////////////////

    TCanvas *c1= new TCanvas("c1","NFIT",800, 400);
    c1->Divide(3,1);
    c1->SetGridx(0);
    c1->SetGridy(0);

    //Bins for Multiplicity -Matched tracks

    TH1D *hX1 =  (TH1D*)hNfit1->Project3D("X");

    Int_t bin_nch1 = hX1->FindBin(nch1);
    Int_t bin_nch2 = hX1->FindBin(nch2);//this should be the same for both graphs (for 3 graphs)

    //Bins for Pt
   
    TString name_nfit1 = "hNfit1";
           TString name_nfitr1 = "hNfitr1";

    TString name_nfit = "hNfit";


    TH1D *hY1 =  (TH1D*)hNfit1->Project3D("Y");
    TH1D *hY1_r = (TH1D*)hNfit1r->Project3D("Y");

    TH1D *hY = (TH1D*)hNfit->Project3D("Y");
   
    Double_t Sum1_Nfit_MC;
    Double_t Sum1_Nfit_Real;

    Double_t Sum_Nfit_MC;

    for(Int_t i=0; i<3 ; i++)
   
      {
        c1->cd(i+1);
        Int_t  bin_ptl_1 = hY1->FindBin(pt[i]);
        Int_t  bin_pth_1 = hY1->FindBin(pt[i+1]);


        TH1D *hNfitNew1= (TH1D*)hNfit1->ProjectionZ(name_nfit1+Form("%d",i), bin_nch1, bin_nch2, bin_ptl_1, bin_pth_1);
        Sum1_Nfit_MC =  hNfitNew1 ->GetSum();
        cout<<Sum1_Nfit_MC<<endl;
     
        hNfitNew1->Scale(1./Sum1_Nfit_MC);
        hNfitNew1 ->SetLineColor(2);
        hNfitNew1->Draw();
       
        hNfitNew1->SetXTitle("Nfit");
             
        sprintf(title," %.2f GeV < pT < %.2f GeV, %d < nch < %d", pt[i],pt[i+1],nch1,nch2);
        hNfitNew1->SetTitle(title);
       
        //----Now MuDSt
       
        Int_t  bin_ptrl_1r = hY1_r->FindBin(pt[i]);
        Int_t  bin_ptrh_1r = hY1_r->FindBin(pt[i+1]);

        TH1D *hNfit_r1= (TH1D*)hNfit1r->ProjectionZ(name_nfitr1+Form("%d",i), bin_nch1, bin_nch2, bin_ptrl_1r, bin_ptrh_1r);
        Sum1_Nfit_Real =  hNfit_r1 ->GetSum();
        cout<<Sum1_Nfit_Real<<endl;
        hNfit_r1->Scale(1./Sum1_Nfit_Real);
        hNfit_r1->Draw("same");

        //Now Previous Embedding

        Int_t  bin_ptl = hY->FindBin(pt[i]);
        Int_t  bin_pth = hY->FindBin(pt[i+1]);


        TH1D *hNfitNew = (TH1D*)hNfit->ProjectionZ(name_nfit+Form("%d",i), bin_nch1, bin_nch2, bin_ptl, bin_pth);
        Sum_Nfit_MC =  hNfitNew->GetSum();
        cout<<Sum_Nfit_MC<<endl;
     
        hNfitNew->Scale(1./Sum_Nfit_MC);
        hNfitNew ->SetLineColor(4);
        hNfitNew->Draw("same");

        //TBox *T = new TBox(40, 0, 50, 0.01);
        //T->SetLineColor(2);
        //T->SetLineWidth(2);
        //T->Draw("same");
        //check this....

        keySymbol(.2, .95,text1,1,1);
        keyLine(0.2, 0.90,"MC- Matched Pairs",2);
        keyLine(0.2, 0.85,"MuDst",1);
        keyLine(0.2, 0.80,"Previous Embedding P06ib",4);
      }
     
return;     
}

</code>

scan_embed.C

Notes

Overview



The document is intended as a forum for embedding group members to share information and notes.

Jobs Done

P06ib:
6050022_1050001.28606: /eliza5/starprod/embedding/P06ib
6050022_1050001.7247: /eliza5/starprod/embedding/P06ib
6050022_1060001.3717: /eliza5/starprod/embedding/P06ib
6050022_1070001.10533: /eliza5/starprod/embedding/P06ib
6050022_1070001.15557: /eliza5/starprod/embedding/P06ib
6050022_1080001.26181: /eliza5/starprod/embedding/P06ib
6050022_1080002.25705: /eliza5/starprod/embedding/P06ib
6050022_1080002.6089: /eliza5/starprod/embedding/P06ib
6050022_2050001.28441: /eliza5/starprod/embedding/P06ib
6050022_2060001.20921: /eliza5/starprod/embedding/P06ib
6050022_2060001.2233: /eliza5/starprod/embedding/P06ib
6050022_2060002.29805: /eliza5/starprod/embedding/P06ib
6050022_2060002.31396: /eliza5/starprod/embedding/P06ib
6050022_2070001.1384: /eliza5/starprod/embedding/P06ib
6050022_2070001.27931: /eliza5/starprod/embedding/P06ib
6050022_3050001.26648: /eliza5/starprod/embedding/P06ib
6050022_3050002.20435: /eliza5/starprod/embedding/P06ib
6050022_3060001.32622: /eliza5/starprod/embedding/P06ib
6050022_3060001.6911: /eliza5/starprod/embedding/P06ib
6050022_3060002.10366: /eliza5/starprod/embedding/P06ib
6050022_3060002.5583: /eliza5/starprod/embedding/P06ib
6052072_1050001.15271: /eliza5/starprod/embedding/P06ib
6052072_1060001.18062: /eliza5/starprod/embedding/P06ib
6052072_1080001.20338: /eliza5/starprod/embedding/P06ib
6052072_1080002.27976: /eliza5/starprod/embedding/P06ib
6052072_2070001.27969: /eliza5/starprod/embedding/P06ib
6052072_2070002.21296: /eliza5/starprod/embedding/P06ib
6052072_2080001.17830: /eliza5/starprod/embedding/P06ib
6052072_3070001.15967: /eliza5/starprod/embedding/P06ib

AOmega_111_strange: /eliza5/starprod/embedding/P06ib
AOmega_112_strange: /eliza5/starprod/embedding/P06ib
AOmega_121_strange: /eliza5/starprod/embedding/P06ib
AOmega_122_strange: /eliza5/starprod/embedding/P06ib
AOmega_123_strange: /eliza5/starprod/embedding/P06ib
AXi_101_strange: /eliza5/starprod/embedding/P06ib
AXi_102_strange: /eliza5/starprod/embedding/P06ib
AXi_111_strange: /eliza5/starprod/embedding/P06ib
AXi_112_strange: /eliza5/starprod/embedding/P06ib
AXi_113_strange: /eliza5/starprod/embedding/P06ib
E_101_1154003879: /eliza5/starprod/embedding/P06ib
E_102_1154003879: /eliza5/starprod/embedding/P06ib
E_106_1154003879: /eliza5/starprod/embedding/P06ib
E_107_1154003879: /eliza5/starprod/embedding/P06ib
E_201_1154003879: /eliza5/starprod/embedding/P06ib
E_202_1154003879: /eliza5/starprod/embedding/P06ib
E_203_1154003879: /eliza5/starprod/embedding/P06ib
E_204_1154003879: /eliza5/starprod/embedding/P06ib
E_205_1154003879: /eliza5/starprod/embedding/P06ib
E_206_1154003879: /eliza5/starprod/embedding/P06ib
E_207_1154003879: /eliza5/starprod/embedding/P06ib
Eplus_502_1154003879: /eliza5/starprod/embedding/P06ib
Lambda_201_strange: /eliza5/starprod/embedding/P06ib
Omega_111_strange: /eliza5/starprod/embedding/P06ib
Omega_112_strange: /eliza5/starprod/embedding/P06ib
Omega_113_strange: /eliza5/starprod/embedding/P06ib
Phi_101_1163628205: /eliza5/starprod/embedding/P06ib
Pi0_101_1163628205: /eliza5/starprod/embedding/P06ib
Pi0_102_1163628205: /eliza5/starprod/embedding/P06ib
Pi0_103_1163628205: /eliza5/starprod/embedding/P06ib
Xi_111_strange: /eliza5/starprod/embedding/P06ib
Xi_112_strange: /eliza5/starprod/embedding/P06ib
Xi_113_strange: /eliza5/starprod/embedding/P06ib

P05if:
Eminus_503_1154003879: /eliza12/starprod/embedding/P05if
Eminus_504_1154003879: /eliza12/starprod/embedding/P05if
Eplus_503_1154003879: /eliza12/starprod/embedding/P05if
Eplus_504_1154003879: /eliza12/starprod/embedding/P05if
Eplus_505_1154003879: /eliza12/starprod/embedding/P05if

P05ic:
K0Short_122_strange: /auto/pdsfdv40/starprod/embedding/P05ic
K0Short_124_strange: /auto/pdsfdv40/starprod/embedding/P05ic
K0Short_131_strange: /auto/pdsfdv40/starprod/embedding/P05ic
K0Short_132_strange: /auto/pdsfdv40/starprod/embedding/P05ic
Lambda_112_strange: /auto/pdsfdv40/starprod/embedding/P05ic
Lambda_114_strange: /auto/pdsfdv40/starprod/embedding/P05ic
Lambda_122_strange: /auto/pdsfdv40/starprod/embedding/P05ic
Lambda_124_strange: /auto/pdsfdv40/starprod/embedding/P05ic
Lambda_131_strange: /auto/pdsfdv40/starprod/embedding/P05ic
Lambda_132_strange: /auto/pdsfdv40/starprod/embedding/P05ic
Piminus_101_spectra: /auto/pdsfdv40/starprod/embedding/P05ic
Piplus_101_spectra: /auto/pdsfdv40/starprod/embedding/P05ic
Pbar_201_spectra: /auto/pdsfdv45/starprod/embedding/P05ic
Pbar_202_spectra: /auto/pdsfdv45/starprod/embedding/P05ic
Phi_115_1118150698: /auto/pdsfdv45/starprod/embedding/P05ic
Phi_116_1118150698: /auto/pdsfdv45/starprod/embedding/P05ic
Phi_117_1118150698: /auto/pdsfdv45/starprod/embedding/P05ic
Piplus_201_spectra: /auto/pdsfdv45/starprod/embedding/P05ic
Piplus_202_spectra: /auto/pdsfdv45/starprod/embedding/P05ic
K0Short_101_strange: /dante3/starprod/embedding/P05ic
K0Short_102_strange: /dante3/starprod/embedding/P05ic
Jpsi_102_spectra: /eliza1/starprod/embedding/P05ic
Jpsi_103_spectra: /eliza1/starprod/embedding/P05ic
Jpsi_104_spectra: /eliza1/starprod/embedding/P05ic
Jpsi_105_spectra: /eliza1/starprod/embedding/P05ic
Piplus_201_ftpcw: /eliza1/starprod/embedding/P05ic
Sigma1385minus_301_strange: /eliza1/starprod/embedding/P05ic
Sigma1385plus_301_strange: /eliza1/starprod/embedding/P05ic
Sigma1385plus_302_strange: /eliza1/starprod/embedding/P05ic
Omega_101_strange: /eliza5/starprod/embedding/P05ic
Omega_102_strange: /eliza5/starprod/embedding/P05ic
Photon_201_spectra: /eliza5/starprod/embedding/P05ic
Photon_202_spectra: /eliza5/starprod/embedding/P05ic
Photon_203_spectra: /eliza5/starprod/embedding/P05ic
Photon_204_spectra: /eliza5/starprod/embedding/P05ic
Photon_205_spectra: /eliza5/starprod/embedding/P05ic
Photon_206_spectra: /eliza5/starprod/embedding/P05ic
Photon_207_spectra: /eliza5/starprod/embedding/P05ic
Photon_208_spectra: /eliza5/starprod/embedding/P05ic
Photon_209_spectra: /eliza5/starprod/embedding/P05ic
Photon_210_spectra: /eliza5/starprod/embedding/P05ic
Photon_211_spectra: /eliza5/starprod/embedding/P05ic
Photon_212_spectra: /eliza5/starprod/embedding/P05ic
Photon_213_spectra: /eliza5/starprod/embedding/P05ic
Photon_214_spectra: /eliza5/starprod/embedding/P05ic
Photon_215_spectra: /eliza5/starprod/embedding/P05ic
Photon_301_spectra: /eliza5/starprod/embedding/P05ic
Photon_302_spectra: /eliza5/starprod/embedding/P05ic
Photon_303_spectra: /eliza5/starprod/embedding/P05ic
Photon_304_spectra: /eliza5/starprod/embedding/P05ic
Photon_305_spectra: /eliza5/starprod/embedding/P05ic
Photon_306_spectra:  /eliza5/starprod/embedding/P05ic
Photon_308_spectra: /eliza5/starprod/embedding/P05ic
Photon_309_spectra: /eliza5/starprod/embedding/P05ic
Photon_310_spectra: /eliza5/starprod/embedding/P05ic
Photon_311_spectra: /eliza5/starprod/embedding/P05ic
Photon_312_spectra: /eliza5/starprod/embedding/P05ic
Photon_313_spectra: /eliza5/starprod/embedding/P05ic
Photon_314_spectra: /eliza5/starprod/embedding/P05ic
Photon_315_spectra: /eliza5/starprod/embedding/P05ic
Photon_501_spectra: /eliza5/starprod/embedding/P05ic
Photon_502_spectra: /eliza5/starprod/embedding/P05ic
Photon_503_spectra: /eliza5/starprod/embedding/P05ic
Photon_504_spectra: /eliza5/starprod/embedding/P05ic
Piminus_101_spectra: /eliza5/starprod/embedding/P05ic
Piminus_118_spectra: /eliza5/starprod/embedding/P05ic
Piminus_119_spectra: /eliza5/starprod/embedding/P05ic
Piminus_120_spectra: /eliza5/starprod/embedding/P05ic
Piminus_121_spectra: /eliza5/starprod/embedding/P05ic
Piminus_122_spectra: /eliza5/starprod/embedding/P05ic
Piplus_118_spectra: /eliza5/starprod/embedding/P05ic
Piplus_119_spectra: /eliza5/starprod/embedding/P05ic
Piplus_120_spectra: /eliza5/starprod/embedding/P05ic
Piplus_121_spectra: /eliza5/starprod/embedding/P05ic
Piplus_122_spectra: /eliza5/starprod/embedding/P05ic
Piplus_212_ftpcw: /eliza5/starprod/embedding/P05ic
Xi_101_strange: /eliza5/starprod/embedding/P05ic
Xi_102_strange: /eliza5/starprod/embedding/P05ic
Xi_103_strange: /eliza5/starprod/embedding/P05ic
Xi_104_strange: /eliza5/starprod/embedding/P05ic
Xi_105_strange: /eliza5/starprod/embedding/P05ic
Xi_110_strange: /eliza5/starprod/embedding/P05ic
Xi_111_strange: /eliza5/starprod/embedding/P05ic
Xi_112_strange: /eliza5/starprod/embedding/P05ic
Xi_113_strange: /eliza5/starprod/embedding/P05ic
Xi_114_strange: /eliza5/starprod/embedding/P05ic
Xi1530_405_strange: /eliza5/starprod/embedding/P05ic
Xi1530_406_strange: /eliza5/starprod/embedding/P05ic
Xi1530_407_strange: /eliza5/starprod/embedding/P05ic
Xi1530_408_strange: /eliza5/starprod/embedding/P05ic

P04ik:
dbar_101_200GeV: /auto/pdsfdv37/starprod/embedding/P04ik
dbar_102_200GeV: /auto/pdsfdv37/starprod/embedding/P04ik
dbar_103_200GeV: /auto/pdsfdv37/starprod/embedding/P04ik
dbar_104_200GeV: /auto/pdsfdv37/starprod/embedding/P04ik
dbar_105_200GeV: /auto/pdsfdv37/starprod/embedding/P04ik
dbar_106_200GeV: /auto/pdsfdv37/starprod/embedding/P04ik
dbar_107_200GeV: /auto/pdsfdv37/starprod/embedding/P04ik
dbar_108_200GeV: /auto/pdsfdv37/starprod/embedding/P04ik
dbar_109_200GeV: /auto/pdsfdv37/starprod/embedding/P04ik
dbar_110_200GeV: /auto/pdsfdv37/starprod/embedding/P04ik
Pi0_300_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
Pi0_301_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
Pi0_302_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
Pi0_303_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
Pi0_304_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
Pi0_305_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
Pi0_306_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
Pi0_307_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
Pi0_308_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
Pi0_309_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
Pi0_310_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
Pi0_311_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
Pi0_312_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
Pi0_313_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
Pi0_314_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
Pi0_315_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
Pi0_316_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
Pi0_317_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
Pi0_318_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
Pi0_319_1112996078: /auto/pdsfdv40/starprod/embedding/P04ik
dbar_101_200GeV: /eliza1/starprod/embedding/P04ik
Pi0_282_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_283_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_284_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_285_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_286_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_287_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_288_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_289_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_290_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_291_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_292_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_293_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_294_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_295_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_296_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_297_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_298_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_299_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_311_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_312_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_313_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_314_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_315_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_316_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_317_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_318_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_319_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_321_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_322_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_323_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_324_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_325_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_326_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_327_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_328_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_329_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_331_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_332_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_333_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_334_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_335_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_336_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_337_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_338_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_339_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_340_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_341_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_341_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_342_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_342_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_343_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_343_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_344_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_344_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_345_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_345_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_346_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_346_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_347_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_347_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_348_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_348_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_349_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_349_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_350_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_350_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_351_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_351_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_352_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_352_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_353_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_353_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_354_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_354_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_355_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_355_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_356_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_356_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_357_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_357_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_358_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_358_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_359_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_359_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_360_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_360_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_361_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_361_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_362_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_362_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_363_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_363_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_364_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_364_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_365_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_365_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_366_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_366_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_367_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_367_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_368_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_368_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_369_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_369_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_370_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_370_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_371_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_371_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_372_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_372_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_373_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_373_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_374_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_374_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_375_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_375_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_376_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_376_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_377_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_377_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_378_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_378_flatpt: /eliza1/starprod/embedding/P04ik
Pi0_379_dalitz: /eliza1/starprod/embedding/P04ik
Pi0_379_flatpt: /eliza1/starprod/embedding/P04ik

P04ih:

P04if:
Pi0_106_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_107_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_108_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_109_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_110_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_111_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_112_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_113_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_114_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_115_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_116_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_120_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_121_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_122_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_123_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_124_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_125_EMC:  /auto/pdsfdv37/starprod/embedding/P04if
Pi0_127_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_128_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_130_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_141_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_142_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_143_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_144_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_145_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_146_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_147_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_148_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_149_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_150_EMC:  /auto/pdsfdv37/starprod/embedding/P04if
Pi0_161_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_162_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_163_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_164_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_165_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_166_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_167_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_168_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_169_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_170_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_171_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_172_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_173_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_174_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_175_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_176_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_177_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_178_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_179_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_180_EMC:  /auto/pdsfdv37/starprod/embedding/P04if
Pi0_182_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_183_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_184_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_185_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_186_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_187_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_188_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_189_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_190_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_191_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_192_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_193_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_194_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_195_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_196_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_197_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_198_EMC: /auto/pdsfdv37/starprod/embedding/P04if
Pi0_199_EMC: /auto/pdsfdv37/starprod/embedding/P04if

Sigma1385barminus_501_heavy: /eliza5/starprod/embedding/P04if
Sigma1385barminus_502_heavy: /eliza5/starprod/embedding/P04if
Sigma1385barminus_503_heavy: /eliza5/starprod/embedding/P04if
Sigma1385barminus_504_heavy: /eliza5/starprod/embedding/P04if
Sigma1385barminus_505_heavy: /eliza5/starprod/embedding/P04if
Sigma1385barplus_502_heavy: /eliza5/starprod/embedding/P04if
Sigma1385barplus_503_heavy: /eliza5/starprod/embedding/P04if
Sigma1385barplus_504_heavy: /eliza5/starprod/embedding/P04if
Sigma1385barplus_505_heavy: /eliza5/starprod/embedding/P04if
Sigma1385minus_501_heavy: /eliza5/starprod/embedding/P04if
Sigma1385minus_502_heavy: /eliza5/starprod/embedding/P04if
Sigma1385minus_503_heavy: /eliza5/starprod/embedding/P04if
Sigma1385minus_504_heavy: /eliza5/starprod/embedding/P04if
Sigma1385minus_505_heavy: /eliza5/starprod/embedding/P04if
Sigma1385plus_501_heavy: /eliza5/starprod/embedding/P04if
Sigma1385plus_502_heavy: /eliza5/starprod/embedding/P04if
Sigma1385plus_503_heavy: /eliza5/starprod/embedding/P04if
Sigma1385plus_504_heavy: /eliza5/starprod/embedding/P04if
Sigma1385plus_505_heavy: /eliza5/starprod/embedding/P04if

P04ie:
Lambda_301Svt_62GeV: /auto/pdsfdv45/starprod/embedding/P04ie
Xi_201_62GeV: /auto/pdsfdv65/starprod/embedding/P04ie
Xi_202_62GeV: /auto/pdsfdv65/starprod/embedding/P04ie
Xi_203_62GeV: /auto/pdsfdv65/starprod/embedding/P04ie
K0Short_201_62GeV: /eliza1/starprod/embedding/P04ie
K0Short_202_62GeV: /eliza1/starprod/embedding/P04ie
K0Short_203_62GeV: /eliza1/starprod/embedding/P04ie
K0Short_211_62GeV: /eliza1/starprod/embedding/P04ie
Lambda_211_62GeV: /eliza1/starprod/embedding/P04ie

P03ih:
Pi0_502_dAu: /eliza5/starprod/embedding/P03ih
P03if:
TEST: /eliza1/starprod/embedding/P03if
test_sigma: /eliza1/starprod/embedding/P03if
P03ie:

P03id:

P02ge:

P02gd:
Xi_noAcc_101_Central: /eliza1/starprod/embedding/P02gd
K0Short_P02gd_101_Minbias: /eliza5/starprod/embedding/P02gd
K0Short_P02gd_102_Minbias: /eliza5/starprod/embedding/P02gd
K0Short_P02gd_103_Minbias: /eliza5/starprod/embedding/P02gd

LPP group (Dubna) activities

This page is intended to track embedding progress for the soft-photon study conducted by the LPP group (JINR, Dubna) in Russia.
    The requested embedding is based on request ID # 1103209240 and was modified to:
  • provide larger statistics for reco efficiency calculation
  • extend acceptance parameters

AuAu200 photon embedding details

    Request details:

  • Dataset and production: AuAu200, P05ic
  • Embedded particles and multiplicity: 1000 Photons per event, total 500 kEvents
  • zVertex = +-30 cm
  • Pt: 0.020 - 0.160 GeV
  • eta: +-1.2
  • full geant reconstruction
    Embedding Files on Disk at PDSF:

  • Photon_201...215_spectra: /eliza5/starprod/embedding/P05ic/
  • Photon_301...315_spectra: /eliza5/starprod/embedding/P05ic/
  • Photon_401...415_spectra: /eliza7/starprod/embedding/P05ic/
  • Photon_501...504_spectra: /eliza5/starprod/embedding/P05ic/
  • Photon_505...515_spectra: /eliza7/starprod/embedding/P05ic/
  • Photon_601...615_spectra: /eliza7/starprod/embedding/P05ic/
  • Photon_701...715_spectra: /eliza12/starprod/embedding/P05ic/
  • Photon_802...811,817,818_spectra: /eliza12/starprod/embedding/P05ic/
    Running jobs status:
    jobs are in QA; checking whether these statistics are sufficient

dAu photon embedding details

    Request details:
  • Dataset and production: dAu200, P04if
  • Embedded particles and multiplicity: 1000 Photons per event, total 500 kEvents
  • zVertex = +-50 cm
  • Pt: 0.020 - 0.160 GeV
  • eta: +-1.2
  • full geant reconstruction
    Running jobs status: jobs have not been started yet

QA Status (OBSOLETE)

The full list of STAR embedding requests (since August 2010):
http://drupal.star.bnl.gov/STAR/starsimrequest
The QA status of each request can be viewed in its history.

The information below is only valid for OLD embedding requests.

This document is intended to provide all the information available for each production, starting with the latest or current production.

P06ib

K star

Files: pdsf > /eliza1/starprod/embedding/KstarM* (July 31 2007)

Plots for dEdx, MIPs, eta, phi distributions and vertex position for the K star embedding are attached. Only around 20% of the K star minus files could be scanned, due to space issues (in my home directory) at PDSF. Reconstruction was done only on the daughter pions. There is a small difference (0.036) in the MIP position, possibly due to lack of statistics.

Plots - Dca Distributions


The following are the results for DCA distributions in different pT bins for Pi Minus. The DCA results agree reasonably well above a cut-off of 2 cm. These plots were produced using the macro plot_dca.C.

Event Selection Criteria

    * Production: P06ib
    * Vertex restriction: |x|<3.5cm, |y|<3.5cm, |z|<25.0cm
    * N Fit points >= 25
    * DCA < 3 cm
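
In code, these selections amount to something like the following; a minimal sketch with illustrative variable names, not the exact implementation in the QA macros:

<code>
// Sketch: event and track selection matching the criteria above.
// All positions and DCA are in cm; names are illustrative.
Bool_t acceptEvent(Double_t vx, Double_t vy, Double_t vz)
{
  return TMath::Abs(vx) < 3.5 && TMath::Abs(vy) < 3.5
      && TMath::Abs(vz) < 25.0;        // vertex restriction
}

Bool_t acceptTrack(Int_t nFit, Double_t dca)
{
  return nFit >= 25 && dca < 3.0;      // fit points and DCA cuts
}
</code>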

  • PiMinus

[Figures: DCA distributions for PiMinus in pT bins from 0.20 to 1.00 GeV/c, 0 < nch < 1000]
  • PiPlus
[Figures: DCA distributions for PiPlus in pT bins from 0.20 to 1.00 GeV/c, 0 < nch < 1000]
  • K Minus

[Figures: DCA distributions for KMinus in pT bins from 0.20 to 1.00 GeV/c, 0 < nch < 1000]


  • K Plus
[Figures: DCA distributions for KPlus in pT bins from 0.20 to 1.00 GeV/c, 0 < nch < 1000]

  • Proton
[Figures: DCA distributions for Proton in pT bins 0.40-0.60 and 0.80-1.00 GeV/c, 0 < nch < 1000]

  • P bar

[Figures: DCA distributions for Pbar in pT bins 0.40-0.60 and 0.80-1.00 GeV/c, 0 < nch < 1000]

Plots - Number of Fitted Points

The following are the Nfit (number of fitted points) distributions in different pT bins for Pi Minus. These plots were produced using the macro plot_dca.C.

Event Selection Criteria

    * Production: P06ib
    * Vertex restriction: |x|<3.5cm, |y|<3.5cm, |z|<25.0cm
    * N Fit points >= 25
    * DCA < 3 cm

  • Pi Minus

[Figures: Nfit distributions for PiMinus in pT bins from 0.20 to 0.60 GeV/c, 0 < nch < 1000; full per-particle grids follow below]
Pi Minus

[Figures: Nfit distributions for Pi Minus in pT bins from 0.20 to 1.20 GeV/c, 0 < nch < 1000]

Pi Plus

[Figures: Nfit distributions for Pi Plus in pT bins from 0.20 to 1.20 GeV/c, 0 < nch < 1000]

K Minus

[Figures: Nfit distributions for K Minus in pT bins from 0.20 to 1.20 GeV/c, 0 < nch < 1000]

K Plus

[Figures: Nfit distributions for K Plus in pT bins from 0.20 to 1.20 GeV/c, 0 < nch < 1000]

Proton

[Figures: Nfit distributions for Proton in pT bins from 0.20 to 1.20 GeV/c, 0 < nch < 1000]

P bar

[Figures: Nfit distributions for Pbar in pT bins from 0.20 to 1.10 GeV/c and 1.20 to 1.30 GeV/c, 0 < nch < 1000]

QA Plots Phi (Feb 2009)

QA P06ib (Phi -> K+K-)

 

This is the QA for P06ib (Phi -> KK). Reconstruction on global tracks (kaons).

1. dEdx

Reconstruction on kaon daughters. The plot shows Monte Carlo tracks and ghost tracks.


  


 

2. DCA Distributions

A 3D histogram was created and filled with pT, eta, and DCA as the three coordinates. Projections in pT and eta bins were made to create this "matrix" of plots (Monte Carlo and MuDst; MuDst taken from pdsf > /eliza12/starprod/reco/cuProductionMinBias/ReversedFullField/P06ib/2005/022/st_physics_adc_6022048_raw*.MuDst.root).

pT bin array: {0.5, 0.6, 0.8, 1.0} (moves down); eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right).

For the error bars, hist->Sumw2() was used.
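
For reference, this is a minimal sketch of how one such cell of the matrix can be produced in ROOT; histogram and function names here are illustrative, not the exact ones from the QA macro:

<code>
// Sketch: project a TH3 (x = pT, y = eta, z = DCA or Nfit) onto its
// z axis in one pT/eta cell, with error bars and unit-area scaling.
TH1D *projectCell(TH3 *h3, Double_t ptLo, Double_t ptHi,
                  Double_t etaLo, Double_t etaHi, const char *name)
{
  TAxis *ax = h3->GetXaxis();
  TAxis *ay = h3->GetYaxis();
  TH1D  *hz = h3->ProjectionZ(name,
                ax->FindBin(ptLo),  ax->FindBin(ptHi),
                ay->FindBin(etaLo), ay->FindBin(etaHi));
  hz->Sumw2();                                    // error bars
  if (hz->Integral() > 0) hz->Scale(1./hz->Integral());
  return hz;
}
</code>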

 


 

3. NFit Distributions

Similarly, a 3D histogram was created with pT, eta, and Nfit as coordinates, and projections were made in the same pT and eta bins as the DCA distributions (Monte Carlo and MuDst).

pT bin array: {0.5, 0.6, 0.8, 1.0} (moves down); eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right). For the error bars, hist->Sumw2() was used.


 

 

4. Delta Vertex

 

The following are the delta-vertex (reconstructed vertex - embedded vertex) plots for the three coordinates (x, y, and z). Cuts of |vz| < 30 cm and NFit >= 25 are applied.
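
The quantity plotted is simply the per-coordinate difference; schematically (illustrative names):

<code>
// Sketch: delta vertex = reconstructed minus embedded vertex position
Double_t dVx = vxReco - vxEmb;
Double_t dVy = vyReco - vyEmb;
Double_t dVz = vzReco - vzEmb;   // vz and NFit cuts applied upstream
</code>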

 

5. Z Vertex and X vs Y vertex

 


 

 

 

6. Global Variables : Phi and  Rapidity

 

 

7. Pt

 Embedded Phi meson with flat pT (black) and reconstructed kaon daughters (red).

8. Randomness Plots

The following plots are to check the randomness of the input Monte Carlo (MC) tracks.

 

 

QA plots Rho (February 16 2009)

QA P06ib (Rho -> pi+pi-)

 

This is the QA for P06ib (Rho -> pi+pi). Reconstruction on global tracks (pions).

1. dEdx

 Reconstruction on pion daughters.


  

 

2. DCA Distributions

A 3D histogram was created and filled with pT, eta, and DCA as the three coordinates. Projections in pT and eta bins were made to create this "matrix" of plots.

pT bin array: {0.5, 0.6, 0.8, 1.0} (moves down); eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right). For the error bars, hist->Sumw2() was used.

 


2b. Compared with MuDst

 

3. NFit Distributions

Similarly, a 3D histogram was created with pT, eta, and Nfit as coordinates, and projections were made in the same pT and eta bins as the DCA distributions.

pT bin array: {0.5, 0.6, 0.8, 1.0} (moves down); eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right). For the error bars, hist->Sumw2() was used.


 

3b. Reconstructed compared with MuDsts

 

 

4. Delta Vertex

 

The following are the delta-vertex (reconstructed vertex - embedded vertex) plots for the three coordinates (x, y, and z). Cuts of |vz| < 30 cm and NFit >= 25 are applied.

 

5. Z Vertex and X vs Y vertex

 

 

 

 

6. Global Variables : Phi and  Rapidity

 

 

7. Pt

 Embedded Rho meson with flat pT (black) and reconstructed pion (red).

8. Randomness Plots

The following plots are to check the randomness of the input Monte Carlo (MC) tracks.

 

 

QA plots Rho (October 21 2008)

Some QA plots for Rho:

MiniDst files are at PDSF under the path /eliza13/starprod/embedding/p06ib/MiniDst/rho_101/*.minimc.root

MuDst files are at PDSF under /eliza13/starprod/embedding/P06ib/MuDst/10*/*.MuDst.root. Reconstruction was done on PionPlus.

DCA and Nfit distributions were scaled by their integrals in the different pT ranges.

QAPlots_D0

Some QA plots for D0, located under the paths:

/eliza12/starprod/embedding/P06ib/D0_001_1216876386/*

/eliza12/starprod/embedding/P06ib/D0_002_1216876386/ -> Directory empty

Global pairs are used as reconstructed tracks. Some quality cuts at the plotting level were:

Vz cut: 30 cm;

Nfit cut: 25;

NCommonHits: 10;

maxDca: 1 cm; assuming D0 -> pi + pi

P07ib

Test done on P07ib - doing calibration on dE/dx


   




   
 


P05id


P05id Cu+Cu Embedding

Hit-level check-up: at the hit level the results are in good agreement.
  • Missing/Dead Areas (PiMinus): The next graphs show dead sectors for embedded data and real data as well.
  • Hits-P05id_200cucu-PiMinus_hitsXYeast_p2.gif
    Hits-P05id_200cucu-PiMinus_hitsXYwest_p2.gif
  • Missing/Dead Areas (PiPlus)
  • Hits-P05id_200cucu-PiPlus_hitsXYeast_p2.gif
    Hits-P05id_200cucu-PiPlus_hitsXYwest_p2.gif

  • Track Residuals PiMinus

  • Hits-P05id_200cucu-PiMinus_longResMean_dipAng_H_p2.gif
     Hits-P05id_200cucu-PiMinus_longResMean_dipAng_p2.gif
     Hits-P05id_200cucu-PiMinus_longResMean_z_H_p2.gif
     Hits-P05id_200cucu-PiMinus_longResMean_z_p2.gif
     Hits-P05id_200cucu-PiMinus_longRes_dipAng_H_p2.gif
     Hits-P05id_200cucu-PiMinus_longRes_dipAng_p2.gif
     Hits-P05id_200cucu-PiMinus_longRes_z_H_p2.gif
     Hits-P05id_200cucu-PiMinus_longRes_z_p2.gif
     Hits-P05id_200cucu-PiMinus_transRes_crosAng_H_p2.gif
    Hits-P05id_200cucu-PiMinus_transRes_crosAng_p2.gif
     Hits-P05id_200cucu-PiMinus_transRes_z_H_p2.gif
    Hits-P05id_200cucu-PiMinus_transRes_z_p2.gif

  • Track Residuals PiPlus

  •  Hits-P05id_200cucu-PiPlus_longResMean_dipAng_H_p2.gif
     Hits-P05id_200cucu-PiPlus_longResMean_dipAng_p2.gif
     Hits-P05id_200cucu-PiPlus_longResMean_z_H_p2.gif
     Hits-P05id_200cucu-PiPlus_longResMean_z_p2.gif
     Hits-P05id_200cucu-PiPlus_longRes_dipAng_H_p2.gif
     Hits-P05id_200cucu-PiPlus_longRes_dipAng_p2.gif
     Hits-P05id_200cucu-PiPlus_longRes_z_H_p2.gif
     Hits-P05id_200cucu-PiPlus_longRes_z_p2.gif
     Hits-P05id_200cucu-PiPlus_transRes_crosAng_H_p2.gif
     Hits-P05id_200cucu-PiPlus_transRes_crosAng_p2.gif
     Hits-P05id_200cucu-PiPlus_transRes_z_H_p2.gif
     Hits-P05id_200cucu-PiPlus_transRes_z_p2.gif
  • dEdx Comparisons

  • The following are results of dE/dx calibration from embedding for Pi Minus. Graphs 1 and 2 are dE/dx vs momentum graphs, where green dots are the Monte Carlo tracks reconstructed from embedding and black dots are data. In graph 2, dots are data and solid lines show the Bichsel parametrisation with a factor of 1/2 offset. In the last graph a projection onto the dE/dx axis is done, and MIP positions of 1.26 and 1.18 are shown.

    Dedx-P05id_cucu-dedx_emb_data_rec_piminus.gif

    Dedx-P05id_cucu-dedx_piminus.gif
     Dedx-P05id_cucu-dedx_fit.gif 

P05ic



Missing/Dead Areas

Hits-P05ic_200gev_auau-PiMinus_hitsXYeast_p2.gif
Hits-P05ic_200gev_auau-PiMinus_hitsXYwest_p2.gif

Track Residuals




Hits-P05ic_200gev_auau-PiMinus_longResMean_dipAng_H_p2.gif
Hits-P05ic_200gev_auau-PiMinus_longResMean_dipAng_p2.gif
Hits-P05ic_200gev_auau-PiMinus_longResMean_z_H_p2.gif
Hits-P05ic_200gev_auau-PiMinus_longResMean_z_p2.gif
Hits-P05ic_200gev_auau-PiMinus_longRes_dipAng_H_p2.gif
Hits-P05ic_200gev_auau-PiMinus_longRes_dipAng_p2.gif
Hits-P05ic_200gev_auau-PiMinus_longRes_z_H_p2.gif
Hits-P05ic_200gev_auau-PiMinus_longRes_z_p2.gif
Hits-P05ic_200gev_auau-PiMinus_transRes_crosAng_H_p2.gif
Hits-P05ic_200gev_auau-PiMinus_transRes_crosAng_p2.gif
Hits-P05ic_200gev_auau-PiMinus_transRes_z_H_p2.gif
Hits-P05ic_200gev_auau-PiMinus_transRes_z_p2.gif

P08ic - Jpsi (Test)

QA P08ic J/Psi -> e+e-

This is the QA for P08ic (J/Psi -> ee). Reconstruction on global tracks, electrons (positrons) only.

 

1. dEdx

 


 

2. DCA Distributions

A 3D histogram was created and filled with pT, eta, and DCA as the three coordinates. Projections in pT and eta bins were made to create this "matrix" of plots.

pT bin array: {0.5, 0.6, 0.8, 1.0} (moves down); eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right). For the error bars, hist->Sumw2() was used.

 

 

3. NFit Distributions

Similarly, a 3D histogram was created with pT, eta, and Nfit as coordinates, and projections were made in the same pT and eta bins as the DCA distributions.

pT bin array: {0.5, 0.6, 0.8, 1.0} (moves down); eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right). For the error bars, hist->Sumw2() was used.


 

 

4. Delta Vertex

The following are the delta-vertex (reconstructed vertex - embedded vertex) plots for the three coordinates (x, y, and z).

 

5. Z Vertex and X vs Y vertex

 

 

 

 

6. Global Variables : Phi and  Rapidity

 

 

7. Pt

 Embedded J/Psi with flat pT (black) and reconstructed electrons (red).

8. Randomness Plots

The following plots are to check the randomness of the input Monte Carlo (MC) tracks.

 

 

 

 

 

 

 

P08id (AXi -> RecoPiPlus)

AXi -> ALambda + Pi+ -> Pbar + Pi+ + Pi+

(03 08 2009)

 

1. Dedx

 

 

 

2.Dca

 

A 3D histogram was created and filled with pT, eta, and DCA as the three coordinates. Projections in pT and eta bins were made to create this "matrix" of plots.

pT bin array: {0.5, 0.6, 0.8, 1.0} (moves down); eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right). For the error bars, hist->Sumw2() was used.

 

3. Nfit

Similarly, a 3D histogram was created with pT, eta, and Nfit as coordinates, and projections were made in the same pT and eta bins as the DCA distributions.

pT bin array: {0.5, 0.6, 0.8, 1.0} (moves down); eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right). For the error bars, hist->Sumw2() was used.

 

 

4. Delta Vertex

 

5. Z Vertex and X vs Y vertex

 

 

6. Global Variables : Phi and  Rapidity

 

 

 

7. pt

 

 

 

8. Randomness

 

 

 

P08id (Lambda->PK)

 

QA of Lambda Embedding with run 8 d+Au on PDSF (sample 15x)

Let's first check some event-wise information. They look fine.

Then we check the randomness of the input Monte Carlo (MC) Lambda tracks. The 'phasespace' command in GSTAR is used for sampling the MC tracks. The input is supposed to be flat for pT within [0,8], Y [-1.5,1.5] and Phi [-Pi,Pi]. The 3 plots below show the randomness is OK for this sample. Please notice that Y is rapidity, not pseudo-rapidity.
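
A quick quantitative cross-check of this flatness is to fit each input distribution with a constant; a minimal sketch (the histogram names are placeholders for whatever the QA macro fills):

<code>
// Sketch: fit an input MC distribution with pol0 and report chi2/ndf
// as a crude flatness measure.
void checkFlat(TH1 *h, Double_t lo, Double_t hi)
{
  TF1 *flat = new TF1("flat", "pol0", lo, hi);
  h->Fit(flat, "RQ");                 // R: use the fit range, Q: quiet
  cout << h->GetName() << ": chi2/ndf = "
       << flat->GetChisquare() / flat->GetNDF() << endl;
}
// e.g. checkFlat(hMcPt, 0., 8.); checkFlat(hMcY, -1.5, 1.5);
//      checkFlat(hMcPhi, -TMath::Pi(), TMath::Pi());
</code>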

Then we compare the dedx of reconstructed MC (matched) global tracks (i.e. the daughters of MC Lambda) to those of real (or ghost) tracks, to fix the scale factor. (scale factor = 1.38 ?)

Now we compare the nFitHits distribution of matched global tracks (i.e. the daughters of MC Lambda) and real tracks. The cuts are |eta|<1, nFitHits>25. For matched tracks, an nCommonHits>10 cut is applied. From the left plot we can see that the agreement of nFitHits is good for all pT ranges.

We check the pT, rapidity and Phi distributions of reconstructed (RC) Lambda and input (MC) Lambda. The cut for Lambda reconstruction is very loose. They look normal.

Here, we compare some cut variables from the reconstructed (RC) Lambda to those from real Lambda. Again, as you can see in these plots, the cuts are very loose for Lambda (contribution of background is estimated with rotation method, and has been subtracted). These plots are made for 8 pT bins (with rapidity cut |y|<0.75). The most obvious difference is in DCA of V0 to PV, especially for high pT bin.
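
For reference, the rotation method estimates the combinatorial background by rotating one daughter by pi in azimuth and recomputing the invariant mass: real pairs are destroyed while the combinatorics are preserved. A minimal sketch (not the exact code used here):

<code>
// Sketch: rotation-method background for a two-body decay.
// Rotate one daughter's momentum by 180 degrees around the beam axis;
// the rotated invariant mass fills a background histogram, which is
// then subtracted from the same-event mass distribution.
#include "TLorentzVector.h"

Double_t rotatedMass(const TLorentzVector &d1, const TLorentzVector &d2)
{
  TLorentzVector d2rot(d2);
  d2rot.RotateZ(TMath::Pi());
  return (d1 + d2rot).M();
}
</code>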

P08id (Omega -> RecoPiMinus)

Omega -> Lambda + K- -> P + Pi- + K-

(03 08 2009)

 

1. Dedx

  

Reconstruction on pion minus and proton daughters. Two different plots are shown just for the sake of completeness. Reconstructing on the kaon gave very low statistics.

 

Reco PionMinus / Reco Proton
 

 

 

 

2.Dca

 

A 3D histogram was created and filled with pT, eta, and DCA as the three coordinates. Projections in pT and eta bins were made to create this "matrix" of plots.

pT bin array: {0.5, 0.6, 0.8, 1.0} (moves down); eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right). For the error bars, hist->Sumw2() was used.

 

*. Reconstructing on Pion

 

 

 *. Reconstructing on Proton

 

3. Nfit

Similarly, a 3D histogram was created with pT, eta, and Nfit as coordinates, and projections were made in the same pT and eta bins as the DCA distributions.

pT bin array: {0.5, 0.6, 0.8, 1.0} (moves down); eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right). For the error bars, hist->Sumw2() was used.

*. Reconstructing Pion

 

 

 

*. Reconstructing Proton

 

 

4. Delta Vertex

 

 When reconstructed on pion minus and proton the distributions turn out to be the same, so only one of them is posted.


 

5. Z Vertex and X vs Y vertex

 

 

 

6. Global Variables : Phi and  Rapidity

 

Reco Pion / Reco Kaon
 
 

 

 

7. pt


 

 

8. Randomness

 

 

 

P08id (phi ->KK) (March 05 2009)

 

QA Phi->KK (March 05 2009)

 

1. Dedx

 Reconstruction on kaon daughters. Two different plots are shown just for the sake of completeness.

 

 

2.Dca

 

A 3D histogram was created and filled with pT, eta, and DCA as the three coordinates. Projections in pT and eta bins were made to create this "matrix" of plots.

pT bin array: {0.5, 0.6, 0.8, 1.0} (moves down); eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right). For the error bars, hist->Sumw2() was used.

 

 

3. Nfit

Similarly, a 3D histogram was created with pT, eta, and Nfit as coordinates, and projections were made in the same pT and eta bins as the DCA distributions.

pT bin array: {0.5, 0.6, 0.8, 1.0} (moves down); eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right). For the error bars, hist->Sumw2() was used.

 

4. Delta Vertex

 

5. Z Vertex and X vs Y vertex

 

 

6. Global Variables : Phi and  Rapidity

 

 

 

 

7. pt

 


8. Randomness

 

 

 

P08id (phi ->KK)

QA P08id (Phi->KK)

 

This is the QA for P08id (phi -> KK). Reconstruction on global tracks, kaons only. Macro from Xianglei used (I found the QA macro very familiar). A scale factor of 1.38 was applied.

1. dEdx

 Reconstruction on kaon daughters. Two different plots are shown just to see how the Monte Carlo looks on top of the ghost tracks.

 

 

2. DCA Distributions

A 3D histogram was created and filled with pT, eta, and DCA as the three coordinates. Projections in pT and eta bins were made to create this "matrix" of plots.

pT bin array: {0.5, 0.6, 0.8, 1.0} (moves down); eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right). For the error bars, hist->Sumw2() was used.

 

 

3. NFit Distributions

Similarly, a 3D histogram was created with pT, eta, and Nfit as coordinates, and projections were made in the same pT and eta bins as the DCA distributions.

pT bin array: {0.5, 0.6, 0.8, 1.0} (moves down); eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right). For the error bars, hist->Sumw2() was used.

 

 

4. Delta Vertex

The following are the delta-vertex (reconstructed vertex - embedded vertex) plots for the three coordinates (x, y, and z).

 

5. Z Vertex and X vs Y vertex

 

 

 

6. Global Variables : Phi and  Rapidity

 

7. Pt

 Embedded Phi meson with flat pT (black) and reconstructed kaons (red).

8. Randomness Plots

The following plots are to check the randomness of the input Monte Carlo (MC) tracks.

 

 

 

 

P08id (ALambda -> P, Pion)

 

QA ALambda->P, pi (03 08 2009)

 

1. Dedx

 Reconstruction on proton and pion daughters. Two different plots are shown just for the sake of completeness.

 

Reco Proton / Reco Pion

 

 

 

2.Dca

 

A 3D histogram was created and filled with pT, eta, and DCA as the three coordinates. Projections in pT and eta bins were made to create this "matrix" of plots.

pT bin array: {0.5, 0.6, 0.8, 1.0} (moves down); eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right). For the error bars, hist->Sumw2() was used.

 

*. Reconstructing on Proton

 

 *. Reconstructing on Pion

 

3. Nfit

Similarly, a 3D histogram was created with pT, eta, and Nfit as coordinates, and projections were made in the same pT and eta bins as the DCA distributions.

pT bin array: {0.5, 0.6, 0.8, 1.0} (moves down); eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right). For the error bars, hist->Sumw2() was used.

*. Reconstructing Proton

 

 

*. Reconstructing Pion

 

4. Delta Vertex

 

 When reconstructed on proton and pions the distributions turn out to be the same, so only one of them is posted.

 

5. Z Vertex and X vs Y vertex

 

 

 

6. Global Variables : Phi and  Rapidity

 

Reco Proton / Reco Pion

 

 

7. pt


 

 

8. Randomness

 

 

 

 

 

P08id (Xi -> RecoPiMinus)

 Xi -> Lambda + Pi- -> P + Pi- + Pi-

(03 08 2009)

 

1. Dedx

 Reconstruction on Pi Minus

 

2.Dca

 

A 3D histogram was created and filled with pT, eta, and DCA as the three coordinates. Projections in pT and eta bins were made to create this "matrix" of plots.

pT bin array: {0.5, 0.6, 0.8, 1.0} (moves down); eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right). For the error bars, hist->Sumw2() was used.

 

 

 

3. Nfit

Similarly, a 3D histogram was created with pT, eta, and Nfit as coordinates, and projections were made in the same pT and eta bins as the DCA distributions.

pT bin array: {0.5, 0.6, 0.8, 1.0} (moves down); eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right). For the error bars, hist->Sumw2() was used.

 

 

4. Delta Vertex

5. Z Vertex and X vs Y vertex

 

 

 

 

6. Global Variables : Phi and  Rapidity

 

 

 

7. pt

 

 

8. Randomness

 

 

 

 

 

Phi->KK (Mar 12)

 

Please find the QA plots here

http://www.star.bnl.gov/protected/lfspectra/xpzhang/talks/phikkembedQA.pdf
 

The data looks good.

Reconstructed phi meson has small rapidity dependence.

 

Yuri_Test_PionMinus_032009

QA PionMinus

 

 

 

 

 

 

 

P04if

    Hit level check-up:
  • Missing/Dead Areas (PiMinus): The next graphs show dead sectors for embedded data and real data as well.
  • Hits-P04if-PiMinus_hitsXYeast_p2.gif

    Hits-P04if-PiMinus_hitsXYwest_p2.gif
  • Missing/Dead Areas (PiPlus)
  • Hits-P04if-PiPlus_hitsXYeast_p2.gif
    Hits-P04if-PiPlus_hitsXYwest_p2.gif

  • Track Residuals PiMinus

  • Hits-P04if-PiMinus_longResMean_dipAng_H_p2.gif
    Hits-P04if-PiMinus_longResMean_dipAng_p2.gif
    Hits-P04if-PiMinus_longResMean_z_p2.gif
    Hits-P04if-PiMinus_longResMean_z_H_p2.gif
    Hits-P04if-PiMinus_longRes_dipAng_p2.gif
    Hits-P04if-PiMinus_longRes_dipAng_H_p2.gif
    Hits-P04if-PiMinus_longRes_z_p2.gif
    Hits-P04if-PiMinus_longRes_z_H_p2.gif
    Hits-P04if-PiMinus_transRes_crosAng_p2.gif
    Hits-P04if-PiMinus_transRes_crosAng_H_p2.gif
    Hits-P04if-PiMinus_transRes_z_p2.gif
    Hits-P04if-PiMinus_transRes_z_H_p2.gif

  • Track Residuals PiPlus

  • Hits-P04if-PiPlus_longResMean_dipAng_H_p2.gif
    Hits-P04if-PiPlus_longResMean_dipAng_p2.gif
     Hits-P04if-PiPlus_longResMean_z_H_p2.gif
    Hits-P04if-PiPlus_longResMean_z_p2.gif
    Hits-P04if-PiPlus_longRes_dipAng_H_p2.gif
    Hits-P04if-PiPlus_longRes_dipAng_p2.gif
    Hits-P04if-PiPlus_longRes_z_H_p2.gif
    Hits-P04if-PiPlus_longRes_z_p2.gif
    Hits-P04if-PiPlus_transRes_crosAng_H_p2.gif
    Hits-P04if-PiPlus_transRes_crosAng_p2.gif
    Hits-P04if-PiPlus_transRes_z_H_p2.gif
    Hits-P04if-PiPlus_transRes_z_p2.gif

    Previous results - May 2002



    This document is intended to provide all the information available for previous productions, such as results of Quality Control studies and Production Cross-checks. The quality control studies include identification of missing/dead detector areas, distance of closest approach (dca) distributions, and fit points distributions (Nfit). As part of the production cross-checks, some results on centrality dependence and efficiency are shown.


    Quality Control


  • dEdx Comparisons

  • These are dedx vs P graphs. All of them show reasonable agreement with data. (Done in May 2002)

    Dedx-dedxPiPlus.gif
    Dedx-dedxKMinus.gif
    Dedx-dedxProton.gif
     Pi Minus
     K Minus
    Proton 
     Dedx-dedxPiMinus.gif
     Dedx-dedxKPlus.gif
    Dedx-dedxPbar.gif
     Pi Plus
     K Plus
     P Bar

  • Dca Distributions

  • PI PLUS.  In the following dca distributions some discrepancy is shown.  Due to secondaries?
     Dca-PiPlusDca_100pt200.gif
    Dca-PiPlusDca_200pt300.gif
    Dca-PiPlusDca_300pt400.gif
     Peripheral 0.1 GeV/c < pT < 0.2 GeV/c
     MinBias 0.2 GeV/c < pT < 0.3 GeV/c
     Central 0.3 GeV/c < pT < 0.4 GeV/c


    PI MINUS. Some discrepancy is shown.  Due to secondaries?
     Dca-PiMinusDca_100pt200.gif
     Dca-PiMinusDca_200pt300.gif
    Dca-PiMinusDca_300pt400.gif
      Peripheral 0.1 GeV/c < pT < 0.2 GeV/c   MinBias 0.2 GeV/c < pT < 0.3 GeV/c   Central 0.3 GeV/c < pT < 0.4 GeV/c

    K PLUS.  In the following dca distributions Good agreement with data is shown.
     Dca-KPlusDca_100pt200.gif
    Dca-KPlusDca_200pt300.gif
    Dca-KPlusDca_300pt400.gif
     Peripheral 0.1 GeV/c < pT < 0.2 GeV/c
     MinBias 0.2 GeV/c < pT < 0.3 GeV/c
     Central 0.3 GeV/c < pT < 0.4 GeV/c

    K MINUS. In the following dca distributions Good agreement with data is shown.

     Dca-KMinusDca_100pt200.gif
     Dca-KMinusDca_200pt300.gif
    Dca-KMinusDca_300pt400.gif
      Peripheral 0.1 GeV/c < pT < 0.2 GeV/c   MinBias 0.2 GeV/c < pT < 0.3 GeV/c   Central 0.3 GeV/c < pT < 0.4 GeV/c


    PROTON. The real data Dca distribution is wider, especially at low pT -> most likely due to secondary tracks in the sample. A tail from background protons dominating the distribution at low pT can be clearly seen. This is the expected deviation from the primary MC tracks.

     Dca-ProtonDca_200pt300.gif
     Dca-ProtonDca_500pt600.gif
    Dca-ProtonDca_700pt800.gif
      Peripheral 0.2 GeV/c < pT < 0.3 GeV/c   MinBias 0.5 GeV/c < pT < 0.6 GeV/c   Central 0.7 GeV/c < pT < 0.8 GeV/c

    Pbar. The real data Dca distribution is wider, especially at low pT -> Most likely due to secondary tracks in the sample.

     Dca-PbarDca_200pt300.gif
     Dca-PbarDca_300pt400.gif
    Dca-PbarDca_500pt600.gif
      Peripheral 0.2 GeV/c < pT < 0.3 GeV/c   MinBias 0.3 GeV/c < pT < 0.4 GeV/c   Central 0.5 GeV/c < pT < 0.6 GeV/c

  • Number of Fitted Points

  • PI PLUS.  Good agreement with data is shown.
    NFit-PiPlusNfit_100pt200.gif
    NFit-PiPlusNfit_200pt300.gif
    NFit-PiPlusNfit_300pt400.gif
     Peripheral 0.1 GeV/c < pT < 0.2 GeV/c
     MinBias 0.2 GeV/c < pT < 0.3 GeV/c
     Central 0.3 GeV/c < pT < 0.4 GeV/c
    PI MINUS.  Good agreement with data is shown.

    NFit-PiMinusNfit_100pt200.gif
    NFit-PiMinusNfit_200pt300.gif
    NFit-PiMinusNfit_300pt400.gif
     Peripheral 0.1 GeV/c < pT < 0.2 GeV/c  Peripheral 0.2 GeV/c < pT < 0.3 GeV/c  Peripheral 0.3 GeV/c < pT < 0.4 GeV/c

    K PLUS.  Good agreement with data is shown.
    NFit-KPlusNfit_100pt200.gif
    NFit-KPlusNfit_200pt300.gif
    NFit-KPlusNfit_300pt400.gif
     Peripheral 0.1 GeV/c < pT < 0.2 GeV/c
     MinBias 0.2 GeV/c < pT < 0.3 GeV/c
     Central 0.3 GeV/c < pT < 0.4 GeV/c

    K MINUS.  Good agreement with data is shown.
    NFit-KMinusNfit_100pt200.gif
    NFit-KMinusNfit_200pt300.gif
    NFit-KMinusNfit_300pt400.gif
     Peripheral 0.1 GeV/c < pT < 0.2 GeV/c  Peripheral 0.2 GeV/c < pT < 0.3 GeV/c  Peripheral 0.3 GeV/c < pT < 0.4 GeV/c

    PROTON.  Good agreement with data is shown.
    NFit-ProtonNfit_200pt300.gif
    NFit-ProtonNfit_300pt400.gif
    NFit-ProtonNfit_500pt600.gif
     Peripheral 0.2 GeV/c < pT < 0.3 GeV/c  Peripheral 0.3 GeV/c < pT < 0.4 GeV/c  Peripheral 0.5 GeV/c < pT < 0.6 GeV/c

    Pbar.  Good agreement with data is shown.
    NFit-PbarNfit_200pt300.gif
    NFit-PbarNfit_300pt400.gif
    NFit-PbarNfit_500pt600.gif
     Peripheral 0.2 GeV/c < pT < 0.3 GeV/c  Peripheral 0.3 GeV/c < pT < 0.4 GeV/c  Peripheral 0.5 GeV/c < pT < 0.6 GeV/c

    Production Cross Checks


  • Centrality Dependence

  •  Compare-PiMKMPbar8.gif Compare-PiMKMPbar7.gif Compare-PiMKMPbar6.gif Compare-PiMKMPbar5.gif
     5% central  5% - 10%  10% - 20%  20% - 30%
     Compare-PiMKMPbar4.gif Compare-PiMKMPbar3.gif Compare-PiMKMPbar2.gif Compare-PiMKMPbar1.gif
     30% - 40%  40% - 50%  50% - 60%  60% - 80%


  • Reverse Full Field vs Full Field : No asymmetry observed


  •  MinBias / Central
     Pi Minus
     Compare-PiM_FF-REV0.gifCompare-PiM_FF-REV8.gif
     
     Min Bias
     Central
     K Minus
      Compare-KM_FF-REV0.gif   Compare-KM_FF-REV8.gif
      MinBias Central
     Pbar  Compare-Pbar_FF-REV0.gif Compare-Pbar_FF-REV8.gif 


  • Charge Asymmetry : No asymmetry observed for Pi Minus and K Minus. For Pbar more data are needed



  •  MinBias / Central
     Pi Minus
      Compare-PiP-PiM0.gif Compare-PiP-PiM8.gif 
     
     Min Bias
     Central
     K Minus
      Compare-KP-KM0.gif   Compare-KP-KM8.gif
      MinBias Central
     Pbar  Compare-P-Pb0.gif Compare-Pbar_FF-REV8.gif  



    Reconstruction

    Field Issues

    This is meant to be a central location for finding reconstruction-related items which have some field dependences.

    AuAu200 (2005)
    proton-Lambda HBT
    h+/h- ratio (odd pt shape, but same between fields)
    Track and V0 reconstruction (general agreement between fields?)
    AuAu200 (2003)
    Xi decay analyses - use of a greater decay length cut seemed to alleviate the differences seen in Xi+/Xi- ratio

    Reconstruction Code QA

     Cross reference to Reconstruction Code QA

    STAR Reconstruction Software

    Hypernews forums: Tracking/Fitting, Event Vertex/Primaries (Y. Fisyak, S. Margetis, C. Pruneau)

    SVT Alignment and June 2006 review

    Summary pages

    SVT+SSD Alignment, Run VII (2007)

    Alignment

    Software for Tracking Upgrade (Challenge week 01/17/06-01/24/06)

    Agenda

    Result of Kolmogorov's tests for QA histograms for different versions of STAR codes

    1. with dev (August 13, 2005),
    2. with dev (August 27,2005),
    3. with new (SL05f, August 27,2005).
    4. with new (SL05f, August 27,2005).
    5. with dev (August 27,2005, before update bfc) and dev (September 1, 2005).
    6. with dev2 without and with pgf.

    New STAR magnetic field

    When and what dEdx Prediction for P10 has to be used

    Reconstruction plans

    Usage of Truth information in reconstruction.

    STAR track flags.

    ITTF

    integration week January 2004

    p-p software workshop at BNL 2001/11/19

    Final Agenda and Talks. Minutes from the meeting are here.

    Integrated Tracking Task Force (ITTF)

    The official web site maintained by Claude Pruneau is here.
    See also the STAR Kalman filter documentation created by Claude Pruneau.

    LBL Tracking review report

    Available in MS-WORD and PDF format. Some talks given are linked here.

    Kalman in BABAR

    A note on Kalman is here in .ps format.

    Kalman in ATLAS

    Igor Gavrilenko's presentation for 5/22/00 in power point format, ATLAS internal note with xKalman description.

    Spiros talk on video meeting about it on June/2/2000 is here in power point format.

    Kalman in STAR

    A preliminary writeup of the Kalman implementation in STAR in use during 2001.

    Flagging scheme for tracks failing Kalman fitter (Al Saulys)

    Current Work on Tracking/Fitting Tools

    The group is currently looking into some of the options for improving the global tracking and fitting. These options include an implementation of GEANE as a universal track propagation engine, providing an interface between tracking and geometry/material info, and using Kalman filtering techniques to obtain the best estimation of track parameters.

     

    Kalman Tracking/Fitting Tools

    Kalman literature (Spiros Margetis)

    Kalman Fitter Evaluation page-I (Al Saulys)

    Kalman Fitter Evaluation page-II (Lee Barnby)

    Kalman Fitter for V0 search (Al Saulys)

     

    Vertex finders

    This section relates to the vertex finder algorithms in STAR, the models/approaches used, and evaluation results. Vertex finder studies have historically been part of PWG activities under loose technical guidance from S&C, which provides the framework and a generic approach to include more algorithms as our understanding grows with time.

     Performance of ppLMV- historic note from 2001, by Jan

    References for vertex finder review:

    Event Reconstruction Techniques in NOvA (CHEP 2015)
    http://indico.cern.ch/event/304944/session/2/contribution/277/attachments/578475/796605/chep_reconstruction_2015.pdf

    Vertex finding by sparse model-based clustering (ACAT 2016)
    https://indico.cern.ch/event/397113/session/22/contribution/209/attachments/1215150/1774584/ACAT2016_RF.pdf

    Vertex Reconstruction in the ATLAS Experiment at the LHC
    http://cds.cern.ch/record/1176154/files/ATL-INDET-PUB-2009-001.pdf
    Efficiency of Primary Vertex Reconstruction ATLAS
    http://www.phy.bnl.gov/~partsem/fy12/kirill_slides.pdf

    KFParticle Vertex Finder


    Summary

    J. Lauret, V. Perevoztchikov, D. Smirnov, G. Van Buren, J. C. Webb

    The Goal

    • Re-evaluate performance of the KF and PP vertex finders by reproducing results reported in 2012 by Amilkar/Jonathan/Yuri (see KVF vs PPV, 2012)
      • Conclusions from that study:
        • Overall the KFV primary vertex finding efficiency is somewhat better than that of PPV for the W signal with zero-bias pile-up
        • In the case of a clean simulated W-boson signal (no pile-up), PPV is better at finding the right vertex. In other words, KFV efficiency does not degrade as much as the PPV one in a noisy environment
        • There is an indication that KFV also provides better vertex efficiency than PPV when the vertex rank is taken into account
        • TMVA ranking scheme further significantly improves primary vertex finding efficiency
      • Unaddressed issues
        • Statistically inconsistent samples make it hard to compare the efficiency curves and draw conclusions
        • High impurity of the KFV and TMVA ranking schemes at low vertex track multiplicities may lead to selection of fake vertices in some analyses

    The Strategy (this study)

    • Data. We started KFV evaluation by performing the standard W analysis of the spin PWG (Jinlong)
      • The W analysis was carried out using a sub-sample reconstructed using the KFV finder. The only requirement on the vertex, that it have a positive rank (standard for PPV), was dropped
      • The number of final selected signal events (with identified primary vertices matching a high-E tower) went down by 10% compared to the PPV finder
        This is inconsistent with the 2012 pile-up studies but consistent with expectations from clean MC W-boson samples
    • Simulation. We proceeded with an MC study similar to the 2012 one
      (Due to lost data samples and lack of complete documentation from the 2012 study it is impossible to exactly reproduce the original plots)
      • To get as close as possible to the reported results we use the following setup:
        • The code from the HEAD of CVS as of October, 2015 (that includes KF, Sti, and other event reconstruction code)
        • For the primary vertex we simulate W events with the setup used by the Spin PWG (Jinlong). Run 13 geometry
        • For pile-up embedding we use Run 13 zero bias data (Currently ~50k events from day 150 only)
        • The code and scripts recovered from various sources have been collected in the following repository: https://github.com/star-bnl/star-travex
          ...a tag will be added when ready to fix the code
      • In this simulation-based study we use the same vertex finding efficiencies as in the 2012 study. They are defined as follows (a counting sketch follows this list):
        • The denominator is common and filled with the number of primary MC vertices having tracks with TPC hits >= 15
        • The Overall Efficiency is calculated by counting the number of reconstructed vertices (regardless of their rank) which have been matched to the primary MC vertex (based on idTruth == 1). In other words, the efficiency gives the probability of the vertex finder to reconstruct the true MC vertex in the event
        • The Max Rank Efficiency is similar to the Overall Efficiency but only for the vertices having the maximum rank
        • The Impurity counts the maximum rank reconstructed vertices which do not match the primary MC vertex
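
      A schematic of how these counters might be filled, as a hedged sketch only (the types and field names are hypothetical; the actual star-travex code may differ):

        // Hypothetical sketch of the efficiency bookkeeping defined above.
        #include <vector>
        struct RecoVertex { int idTruth; float rank; };            // one reco vertex
        struct Event { int nMcTrksTpc15; std::vector<RecoVertex> vtx; };

        void countEff(const std::vector<Event>& events,
                      double& effAll, double& effMaxRank, double& impurity) {
          int nDen = 0, nAny = 0, nMax = 0, nFake = 0;
          for (const Event& ev : events) {
            if (ev.nMcTrksTpc15 < 1) continue;  // MC vertex must have >=15-TPC-hit tracks
            ++nDen;                             // common denominator
            bool any = false; const RecoVertex* best = 0;
            for (const RecoVertex& v : ev.vtx) {
              if (v.idTruth == 1) any = true;   // matched to the primary MC vertex
              if (!best || v.rank > best->rank) best = &v;   // max rank vertex
            }
            if (any) ++nAny;                    // Overall Efficiency numerator
            if (best && best->idTruth == 1) ++nMax;  // Max Rank Efficiency numerator
            else if (best) ++nFake;             // max rank vertex is fake: Impurity
          }
          if (!nDen) return;
          effAll = double(nAny)/nDen; effMaxRank = double(nMax)/nDen;
          impurity = double(nFake)/nDen;        // = 1 - effMaxRank when every counted
        }                                       //   event has a reconstructed vertex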

    Conclusions and Findings

    • Without pile-up
      • PPV outperforms KFV in finding the primary vertex. KFV has lower efficiency of finding the correct primary vertex with fewer tracks attached to it
      • However, KFV shows better efficiency for finding the primary vertex when the ranking is used. The highest rank reconstructed vertex is more likely to be the true primary vertex
        (Note: PPV was not designed to rank the found vertices properly other than assigning a negative rank to likely pile-up vertices)
    • With pile-up
      • In the new W-boson embedding sample (2013) we find significantly better vertex finding efficiencies for KFV over PPV. This is true for both overall finding and ranked vertex efficiencies
      • Finding #1: The impurity is still high and is something to worry about for low multiplicity vertices, where the analyzer may, with high probability, select a fake vertex if relying on the KFV ranking scheme
      • Finding #2:
        • PPV internally constrains the number of vertices released to the user based on early ranking cuts. This means that if the ranking is not optimized there is a good chance of missing the true primary vertex
        • PPV shows the same overall vertex finding efficiency as KFV (86% vs 87%) when the above constraint is removed
    • Data
      • The Spin PWG reports a 10% lower W selection efficiency with KFV vs PPV
    • Finding #3:
      • As we found out, the calculation of the Impurity for ranked vertices did not correspond to the original definition. We corrected it, and as expected the following now holds true: Impurity = (1 - Max Rank Efficiency)

    Recommendations

    • Investigate if a procedure for running vertex finders as afterburners on muDst files can be established (given global tracks and their errors are saved)

     

     

     


    November 18, 2015

    The hard-coded limit on the number of "bad" vertices has been raised from 5 to 150 in PPV

     

     

     


    November 12, 2015

    Here we looked at a few basic distributions for event observables to see if the embedding sample is consistent with the data. The intention is to understand why PPV and KFV relative performance is reversed in embedding and data samples.

    PPV Embedding / PPV Data / KFV Embedding / KFV Data

     

     

     


    November 11, 2015

    In this test we made sure to use the same primary vertex cuts in the PPV finder as in the original W analysis.  As a result the average efficiency increased from 0.62 to 0.76. It is still lower than the KFV efficiency of 0.81 (0.87) (see below)

     

     

     

     


    November 10, 2015

    In the code calculating the impurity, reconstructed vertices which do not have a matching MC vertex were incorrectly excluded from the total count. After fixing this, the "red" and "green" curves now add up to 1 as expected.
    PPV (left) vs KFV (right)

     

     

     


    November 5, 2015

    Removed the requirement on the minimum value of the Max Rank vertex rank (<0). PPV (left) vs KFV (right)


    November 1, 2015

    Results from the new 2013 W embedding samples: PPV (left) vs KFV (right)

    The file list used for this embedding sample is: filelist_wbos_embed.txt


    October 6, 2015

    The following plots show vertex finding efficiencies for PPV (left) and KFV (right) as determined from a 50k event sample of Pythia simulated W-boson events without pileup located at:

    /star/institutions/bnl_me/smirnovd/public/w_sim_nopileup_fzd/
    /star/institutions/bnl_me/smirnovd/public/w_sim_nopileup_ppv/
    /star/institutions/bnl_me/smirnovd/public/w_sim_nopileup_kfv/
    

    The following options were used to reconstruct the samples

    BFC_OPTIONS="fzin tpcRS y2014a AgML pxlFastSim istFastSim usexgeom FieldOn MakeEvent VFPPVnoCTB beamline
    Sti NoSsdIt NoSvtIt StiHftC TpcHitMover TpxClu Idst BAna l0 Tree logger genvtx tpcDB bbcSim btofsim tags emcY2
    EEfs geantout evout -dstout IdTruth big clearmem"
    
    BFC_OPTIONS="fzin tpcRS y2014a AgML pxlFastSim istFastSim usexgeom FieldOn MakeEvent KFVertex beamline
    Sti NoSsdIt NoSvtIt StiHftC TpcHitMover TpxClu Idst BAna l0 Tree logger genvtx tpcDB bbcSim btofsim tags emcY2
    EEfs geantout evout -dstout IdTruth big clearmem"
    


    September 24, 2015 Updated: October 1, 2015

    The following plots show vertex finding efficiencies for PPV (left) and KFV (right) as determined from a 50k event sample of Pythia simulated W-boson events without pileup located at:

    /star/institutions/bnl_me/smirnovd/public/amilkar/MuDst/
    /star/institutions/bnl_me/smirnovd/public/amilkar/PPV2012/

    The distribution for KFV is somewhat comparable to the 2011 and 2012 results shown below.
    The PPV case was reconstructed without the 'beamline' option.
     


    September 14, 2015

    The following plot with vertex finding efficiencies (default = PPV) was created using the refactored code from Amilkar (github.com/star-bnl/star-travex)
    Here I used 10k events from Run 13 W-boson Pythia embedding simulation from Jinlong located at:

    /star/data19/wEmbedding2013/pp500_production_2013/Wminus-enu_100_20145001/P14ig.SL14g/2013/

    Compared to the 2011 and 2012 results shown below, the efficiency appears to be slightly better for lower multiplicity vertices. The overall average efficiency is slightly higher: 0.50 vs 0.46.

     


    August 05, 2015

    What We Know

    • Recent studies
      • KFV is ~10% less efficient than PPV in finding the primary vertex as established by the STAR W analysis (Jinlong Zhang)
      • The jet spin analysis seems to be insensitive to the choice of VF... Both PPV and KFV show comparable results (Zilong Chang)
      • Some details below
    • Past studies (Amilkar/Jonathan/Yuri, 2012)
      • p+p 200GeV W simulation without pileup
        Black points: MC vertex was found;
        Red points: MC vertex matched the highest rank vertex;
        Yellow points: MC vertex matched the highest TMVA rank vertex


        Questions:

        What exactly is the difference between the 2011 and 2012 years?
        Why does KFV show significantly different efficiency for 2011 and 2012?
        From the above right-hand-side plots: does it actually mean that the KFV ranking works, and works better than the PPV one?
        KFV does give lower efficiency for low multiplicity vertices than PPV. But this is with no pileup! Could this explain the 10% loss in the W efficiency? See below for the case with pileup

      • p+p 200GeV W simulation with pileup: 2011 and 2012 embedding
        Black points: MC vertex was found;
        Red points: MC vertex matched the highest rank vertex;
        Yellow points: MC vertex matched the highest TMVA rank vertex


        Questions:

        From the above plots it does look like KFV also outperforms PPV even at low multiplicities with pileup.
        Is the TMVA ranking better than the default one?

    • Heavy flavor analysis uses KFParticle code to refit the primary vertices found by their default vertex finder (MinuitVF?)

    Where we are

    • Reproduce the results presented by Amilkar/Jonathan/Yuri in 2012
      • We established a repository to collect the code needed for this study
        https://github.com/star-bnl/star-travex
        • Amilkar has provided a few macros to create trees for TMVA (Many thanks!) but the code to create the efficiency histograms seems to be missing
      • Use the same data set from 2012 or create a new one using recent library/data for embedding (I think whichever is easier/readily accessible)
      • The recent W embedding request is still being established (Grant Webb, KFParticle Vertex Finder Production Request)
    • In addition to reproducing/confirming the results of the 2012 study we can do a more direct comparison
      • We can build this functionality in star-travex (some basic histograms already available)
      • This study could confirm the results from the evaluation with the W analysis but without the overhead of the analysis
      • An event-by-event comparison seems to be most attractive

    July 22, 2015

    KFParticle Vertex Finder Evaluation with W Analysis

    For the KFV finder test we used the W analysis performed on the p+p 500 GeV data sets reconstructed using the PPV and KFV finders, respectively:
    /star/data23/reco/pp500_production_2013/ReversedFullField/P15ic_VFPPV/2013/
    /star/data26/reco/pp500_production_2013/ReversedFullField/P15ic_KFvertex_BL/2013/
    
    More details can be found in the following email from Lidia:
    http://www.star.bnl.gov/HyperNews-star/protected/get/starprod/648/1/1/1/1/1/1/3/1.html
    

    The summary: We compared the output yields of the W analyses and found that KFV finds about 10% fewer W events than the PPV finder. Although in the standard W analysis (using PPV) the considered vertices are required to have a positive rank, we removed that requirement and let the framework consider ALL vertices found by KFV

    References:

    PPV and KFVertex performance comparison based on simulation for y2011 & y2012 pp200 with pile-up - Amilkar/Jonathan/Yuri

    PPV vertex

     Full documentation of PPV vertex evaluation prior to production of 2008 pp data 

     We want to answer the following questions in this prioritized order.

    1.  FMS triggered (forward) events:
      1. how often none of the PPV vertices agrees with the VPD vertex, and why (using an 8 cm margin)
      2. how often the PPV vertex best matching the VPD vertex is not listed first by PPV
    2.  evaluate PPV performance for ~4 mid rapidity triggers in pp 2008
    3. evaluate PPV for dAu or other beams

    For now use the following BFC chain:

    "DbV20080712 pp2008 ITTF OSpaceZ2 OGridLeak3D beamLine VFPPVnoCTB debug logger" 

     

    Run 8 (200 GeV dAu and pp) , by Gene

    2008 evaluation

    Status: Evaluation of current BFC, no changes to PPV code yet

    2008 PPV evaluation plan

    Run #9069005 will be used

    http://online.star.bnl.gov/RunLog/Summary.php?run=9069005
     

     

    Trigger Name | Trigger ID | # Events | daq source file | needed daq files | expected # triggers
    zerobias 9300 525 st_zerobias all 525
    toftpx 220710 354459 st_toftpx 5 90000
    fms-slow 220981 29914 st_fmsslow 20 20000
    bbc 220000 2646 st_physics all 2646
    bh1-mb 220510 8612 st_physics all 8612
    etot-mb-l2 7 2853 st_physics all 2853
    jp1-mb-l2 8 5676 st_physics all 5676
    bh2-mb-slow 220520 14236 st_physics all 14236

     daq files are located at

    /star/data03/daq/2008/069/

    9069005f 9069005t  9069005z  9069005p
     

    bfc.C will be run in stardev with options:

    root4star -b -q  bfc.C'(1,1e6,"DbV20080820 pp2008 ITTF OSpaceZ2 OGridLeak3D beamLine VFPPVnoCTB debug logger","/star/data03/daq/2008/069/*")'

     

    The version of PPV includes the August 2008 change that drops Post Crossing Tracks, enacted in response to the change of the TPC cluster finder

     

    MuDst files will be placed at:

    /star/data05/scratch/rjreed/PPV2008Eval/*

     

    Observables to Monitor:

    # primary vertices

    Z location primary vertices

    delta Z between primary vertex z position and bbc or vpd z position

    # tracks associated with each primary vertex

     

    Cuts to Monitor:

    mMinTrkPt (Currently 0.20)

    mMinFitPfrac (Currently at 0.70)

    Include all EEMC rings

    mMaxZradius

    Weights for TPC, EEMC, BEMC

     

     

    2008 PPV performance Revision 1.29


    Updated 9/7/2008

    Table 1: # of Events processed as of Sept 7.
    Trigger Name | Trigger ID | # Events Expected | # Events Run
    zerobias 9300 525 525
    toftpx 220710 90000 28720
    fms-slow 220981 20000 14299
    bbc 220000 2646 938
    bh1-mb 220510 8612 7507
    etot-mb-l2 7 2853 2506
    jpt-mb-l2 8 5676 4983
    bh2-mb-slow 220520 14236 12507

     

    Table 2: Summary of vertex finding efficiency and vertex matching with events processed by Sept 2.  For the vpd, matched is defined as the rank-0 PPV vertex being within 8 cm of the vpd vertex.  For the bbc, matched is defined as the rank-0 PPV vertex being within 32 cm of the bbc vertex.

    Table 3: Summary of vertex finding efficiency and vertex matching with events processed by Sept 7.  For the vpd, matched is defined as the rank-0 PPV vertex being within 20 cm of the vpd vertex.  For the bbc, matched is defined as the rank-0 PPV vertex being within 60 cm of the bbc vertex.

     

     

    zero bias

    525 Events
    # vertices,#events
    0,461
    1,62
    2,2

    5 Events with vpd + PPV vertex
    15 Events with bbc + PPV vertex

    Figure 2: Z position of the rank 0 vertex for the zero bias trigger.  Rank 1 and above are excluded due to low statistics.

     

    Figure 3: Rank 0 PPV vertex Vz - vpd Vz for zero bias trigger.

    toftpx

    28720 Events
    # Vertices, # Events
    0,28110
    1,565
    2,39
    3,6

    559 Events with PPV rank 0 + vpd
    393 Events match (within 8 cm)
    41 Events with PPV rank 1 + vpd
    7 Events match (within 8 cm)
    558 Events with PPV rank 0 + bbc
    428 Events match (within 32 cm)
    44 Events with PPV rank 1 + bbc
    20 Events match (within 32 cm)
    509 Events with PPV rank 0 + bbc + vpd
    296 Events match both vpd and bbc

     

    Figure 5: Z position of rank 0 and rank 1 vertices for tof trigger.

    Figure 6: Rank 0 PPV Vz - vpd Vz for tof trigger

    Figure 7: Rank 0 PPV Vz - bbc Vz for the tof trigger

    Figure 8: Delta Vz(PPV - vpdVz) vs delta Vz(PPV-bbc) for the tof trigger.

    fms-slow

    #Vertices,#Events

    0,5504
    1,7503
    2,1173
    3,110
    4,8
    5,1
    2162 Events with PPV rank 0 + vpd
    1435 Events match (within 8 cm)
    389 Events with PPV rank 1 + vpd
    98 Events match (within 8 cm)
    6679 Events with PPV rank 0 + bbc
    4617 Events match (within 32 cm)
    1026 Events with PPV rank 1 + bbc
    453 Events match (within 32 cm)
    1954 Events with PPV rank 0 + bbc + vpd
    992 Events match both vpd and bbc
     

    Figures to be added later.

    bbc

    2646 Events

    # Vertices, # Events
    0,1157
    1,1293
    2,183
    3,12
    500 Events with PPV rank 0 + vpd
    392 Events match (within 20 cm)
    71 Events with PPV rank 1 + vpd
    27 Events match (within 20 cm)
    1409 Events with PPV rank 0 + bbc
    1232 Events match (within 60 cm)
    185 Events with PPV rank 1 + bbc
    114 Events match (within 60 cm)
    484 Events with PPV rank 0 + bbc + vpd
    360 Events match both vpd and bbc


     

    Figure 10: Vz position of PPV rank 0 and rank 1 vertices for the bbc trigger.

     

    Figure 11: PPV Vz - vpd Vz for both rank 0 and rank 1 vertices for the bbc trigger.

     

    Figure 12: PPV Vz - bbc Vz for both rank 0 and rank 1 vertices for bbc trigger

     

     

    Figure 13: Delta Vz(PPV - vpdVz) vs delta Vz(PPV-bbc) for the bbc trigger.

    bh1-mb

    7507 Events

    # Vertices, # Events

    0,856
    1,5494
    2,1032
    3,122
    4,3
    1761 Events with PPV rank 0 + vpd
    1452 Events match (within 20 cm)
    356 Events with PPV rank 1 + vpd
    120 Events match (within 20 cm)
    6275 Events with PPV rank 0 + bbc
    5729 Events match (within 60 cm)
    1082 Events with PPV rank 1 + bbc
    676 Events match (within 60 cm)
    1699 Events with PPV rank 0 + bbc + vpd
    1309 Events match both vpd and bbc


     

    Figure 15: Vz position of PPV rank 0 and rank 1 vertices for bh1 trigger

     

    Figure 16: PPV Vz - vpd Vz for Rank 0 and Rank 1 vertices for bh1 trigger.

     

    Figure 17: PPV Vz - bbc Vz for rank 0 and rank 1 vertices for bh1 trigger

     

     

    Figure 18: Delta Vz(PPV - vpdVz) vs delta Vz(PPV-bbc) for the bh1 trigger.

     

    etot-mb-l2

    2506 Events

    # Vertices, # Events
    0,95
    1,1768
    2,539
    3,96
    4,7
    549 Events with PPV rank 0 + vpd
    380 Events match (within 20 cm)
    204 Events with PPV rank 1 + vpd
    61 Events match (within 20 cm)
    2252 Events with PPV rank 0 + bbc
    2013 Events match (within 60 cm)
    599 Events with PPV rank 1 + bbc
    367 Events match (within 60 cm)
    530 Events with PPV rank 0 + bbc + vpd
    339 Events match both vpd and bbc

     

     

    Figure 20: Vz position of rank 0 and rank 1 vertices for etot trigger

     

     

    Figure 21: PPV Vz - vpd Vz for rank 0 and rank 1 vertices for etot trigger

     

    Figure 22: PPV Vz - bbc Vz for rank 0 and rank 1 vertices for etot trigger

     

    Figure 23: Delta Vz(PPV - vpdVz) vs delta Vz(PPV-bbc) for the etot trigger.

     

    jpt-mb-l2

    4983 Events

    # Vertices, # Events
    0,240
    1,3751
    2,860
    3,120
    4,12
    1114 Events with PPV rank 0 + vpd
    869 Events match (within 20 cm)
    292 Events with PPV rank 1 + vpd
    84 Events match (within 20 cm)
    4419 Events with PPV rank 0 + bbc
    4040 Events match (within 60 cm)
    927 Events with PPV rank 1 + bbc
    553 Events match (within 60 cm)
    1072 Events with PPV rank 0 + bbc + vpd
    777 Events match both vpd and bbc

     

     

     

     

    Figure 25: Vz position of rank 0 and rank 1 vertices for jp1 trigger.

     

     

     

     

    Figure 26: PPV Vz - vpd Vz for jp1 trigger

     

    Figure 27: PPV Vz - bbc Vz for jp1 trigger

     

    Figure 28: Delta Vz(PPV - vpdVz) vs delta Vz(PPV-bbc) for the jp1 trigger.

     

    bh2-mb-slow

    12507 Events

    # Vertices, # Events
    0,1292
    1,9168
    2,1830
    3,209
    4,7
    2970 Events with PPV rank 0 + vpd
    2402 Events match (within 20 cm)
    649 Events with PPV rank 1 + vpd
    216 Events match (within 20 cm)
    10555 Events with PPV rank 0 + bbc
    9625 Events match (within 60 cm)
    1929 Events with PPV rank 1 + bbc
    1200 Events match (within 60 cm)
    2861 Events with PPV rank 0 + bbc + vpd
    2164 Events match both vpd and bbc

    Figure 20: Vz position of rank 0 and rank 1 vertices for etot trigger

     

     

     Figure 26: PPV Vz - bbc Vz for jp1 trigger

     

     

    Figure 23: Delta Vz(PPV - vpdVz) vs delta Vz(PPV-bbc) for the etot trigger.

     

    Figure 27: PPV Vz - vpd Vz for jp1 trigger

     

    Comparison between PPV revision 1.29 and track pT change

    This compares the efficiency of the current PPV vertex finder (revision 1.29, which includes the PCT fix) to the suggested change of including tracks with 0.1 GeV < pT < 0.2 GeV in the algorithm.  Currently PPV only uses tracks with pT > 0.2 GeV.  Since tracks with 0.1 GeV will not reach the BEMC and would be rejected by the PPV algorithm, we've included only tracks with 0.1 < pT < 0.2 which cross the central membrane, and we did not require them to point to a tower.
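
    A schematic of the altered track acceptance, as a sketch only (the struct and field names are hypothetical, not the PPV code):

        // Hypothetical sketch of the altered soft-track acceptance described above.
        struct TrackInfo { double pt; bool crossesCM; };  // crossesCM: crosses the central membrane
        bool acceptSoftTrack(const TrackInfo& t) {
          // Tracks with 0.1 < pT < 0.2 GeV/c are admitted only if they cross the
          // central membrane; no tower match is required since they cannot reach the BEMC.
          return t.pt > 0.1 && t.pt < 0.2 && t.crossesCM;
        }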

    The bfc code is at:

    /star/data05/scratch/rjreed/PPV2008Eval/Vertex5/
     

    The daq files processed as of 9/7/2008 are at:

    /star/data05/scratch/rjreed/PPV2008Eval/MuDstPt100/

     

     

    Table 1: Efficiency comparison between PPV revision 1.29 and the altered PPV which accepts lower pT tracks.  Matching with the vpd is defined as the rank-0 PPV vertex being within 20 cm of the vpd vertex, and matching with the bbc is defined as the rank-0 PPV vertex being within 60 cm of the bbc vertex.

    Table 2: Breakdown of the number of events used to calculate the efficiencies of the altered PPV vertex algorithm.

    Table 3: PPV revision 1.29 statistics posted here for ease of comparison

     

    PPV Revision 1.29 high luminosity trigger details

     

    The following data were taken with triggers: bh1-mb (ID 220510), etot-mb-l2 (ID 7), jpt-mb-l2 (ID 8), and bh2-mb-slow (ID 220520).  The total number of events in this set was 16351.
     

     

    Figure 1: Delta Vz between the PPV vertex closest to the VPD vertex and the vpd.  PPV ranking was ignored.

    Figure 2.  Delta Vz between second closest PPV vertex and vpd vertex.

     

     Figure 3: This is Figure 1 fitted with an unconstrained Gaussian.

    Figure 4: This is Figures 1 and 2 (red) plotted on top of each other with a bin width of 6.3 cm (2 times the sigma fitted in Figure 3).

    Detailed event comparison PPV & VPD vertex, (Rosi)

     BFC

    "DbV20080712 pp2008 ITTF OSpaceZ2 OGridLeak3D beamLine >VFPPVnoCTB  debug logger"
    star/data05/scratch/balewski/2008-daq/st_fmsslow_9060086_raw_1090001.MuDst.root 

    Detailed event count

    1271 Events ,
    256 Events with no PPV vertex,
    271 Events with vpd vertex,
    144 with at least one PPV vertex matching the vpd,
    136 events with the 0 ranked vertex matching the vpd,
    10 Events with the 1 ranked vertex matching the vpd ,
    49 Events with no PPV vertex  and a vpd Vertex,
    5 Events with multiple vertices matching the vpd ,
    1 Event with 2 ranked Vertex matching the vpd (multiple match),
    1 Event with 3 ranked vertex matching  the vpd (multiple match)
    
    44 Events out of 222 with both PPV and vpd vertex have all vertices
    outside of 50 cm of vpd. 11 of these have more than 1 PPV vertex

     Conclusion: out of 271 events with VPD vertex:

    • 49 (18%) have no PPV vertex
    • 151 (56%) have a PPV vertex closer than 8 cm to the VPD vertex
    • 71 (26%) have no PPV vertex closer than 8 cm to the VPD vertex

     

    VertexAnalysis8cm.txt file lists (for the events with a vpd vertex) the event #, # of vertices, rank of each vertex that matches vpd.

     

     

    Details of Events with mismatched PPV VPD vertices with vertex location forced

    I've run through 5 events where the vpd and the PPV vertices don't match.  The daq file location is:

    /star/data05/scratch/balewski/2008-daq/st_fmsslow_9060086_raw_1090001.daq

     

    Here are the PPV Vz values prior to "forcing" the values.  For the histograms, the solid red lines indicate the location of the vpd vertex and the circles indicate the location(s) of the PPV vertices.

    EventID = 749 vpdVz = 98.2939 PPV vertices at Vz = -48.75 140.85

    EventID = 2115 vpdVz = -34.348 PPV vertices at Vz = -106.55

    EventID = 3346 vpdVz = -33.7838 PPV vertices at Vz = 129.35

    EventID = 3952 vpdVz = 0.664319 PPV vertices at Vz =

    EventID = 4447 vpdVz = 25.5813 PPV vertices at Vz = -99.95

     

    For comparison, here is an event where the VPD vertex and PPV vertex were within 8 cm of each other:

     

    All the important files (including the likelihood histograms and track multiplicities) can be found at:

    /star/u/rjreed/Vertex3/Eventdaq*

     

    Here are the results:

    daq #7 EventID = 749 vpdVz = 98.2939 # PPV vertices = 10 at Vz = 90 92 94 96 98 100 102 104 106 108

    Primaries that passed the cut (flag>0,  pt>0.2GeV, nFitP/nPoss>0.51):

    id = 277 flag = 801 Nhits = 7 Npos = 11 pt = 0.370926 frac = 0.636364

    id = 278 flag = 801 Nhits = 6 Npos = 11 pt = 0.250512 frac = 0.545455

    id = 279 flag = 801 Nhits = 6 Npos = 11 pt = 0.271426 frac = 0.545455

    id = 281 flag = 801 Nhits = 8 Npos = 11 pt = 1.57357 frac = 0.727273

    id = 283 flag = 801 Nhits = 6 Npos = 11 pt = 0.638242 frac = 0.545455

    id = 292 flag = 801 Nhits = 5 Npos = 9 pt = 0.292521 frac = 0.555556

    id = 295 flag = 801 Nhits = 7 Npos = 11 pt = 0.213841 frac = 0.636364

     

    daq #22  EventID = 2115 vpdVz = -34.348 N PPV vertices = 10 at Vz = -42 -40 -38 -36 -34 -32 -30 -28 -26 -24

    Primaries that passed the cut:

    id = 48 flag = 301 Nhits = 40 Npos = 45 pt = 0.362589 frac = 0.888889

    id = 63 flag = 301 Nhits = 26 Npos = 34 pt = 0.602728 frac = 0.764706

    id = 64 flag = 301 Nhits = 35 Npos = 45 pt = 0.392374 frac = 0.777778

    id = 66 flag = 301 Nhits = 32 Npos = 45 pt = 0.772336 frac = 0.711111

    id = 288 flag = 801 Nhits = 6 Npos = 11 pt = 2.56192 frac = 0.545455

    id = 290 flag = 801 Nhits = 9 Npos = 11 pt = 0.206602 frac = 0.818182

    id = 291 flag = 801 Nhits = 10 Npos = 11 pt = 0.556056 frac = 0.909091

    id = 292 flag = 801 Nhits = 8 Npos = 11 pt = 0.722384 frac = 0.727273

    id = 296 flag = 801 Nhits = 6 Npos = 11 pt = 0.236179 frac = 0.545455

    id = 297 flag = 801 Nhits = 6 Npos = 11 pt = 0.267664 frac = 0.545455

    id = 298 flag = 801 Nhits = 6 Npos = 11 pt = 0.9994 frac = 0.545455

    id = 299 flag = 801 Nhits = 5 Npos = 8 pt = 0.332944 frac = 0.625

    id = 300 flag = 801 Nhits = 9 Npos = 11 pt = 0.389091 frac = 0.818182

    id = 310 flag = 801 Nhits = 5 Npos = 9 pt = 0.236941 frac = 0.555556

     

    daq #44 EventID = 3346 vpdVz = -33.7838 N PPV vertices = 10 at Vz = -42 -40 -38 -36 -34 -32 -30 -28 -26 -24

    Primaries that passed the cut:

    id = 129 flag = 301 Nhits = 38 Npos = 42 pt = 0.352418 frac = 0.904762

    id = 433 flag = 801 Nhits = 6 Npos = 9 pt = 0.73871 frac = 0.666667

    id = 436 flag = 801 Nhits = 6 Npos = 11 pt = 4.33162 frac = 0.545455

    id = 441 flag = 801 Nhits = 7 Npos = 11 pt = 1.19905 frac = 0.636364

    id = 446 flag = 801 Nhits = 6 Npos = 11 pt = 2.52832 frac = 0.545455

     

    daq #63 EventID = 3952 vpdVz = 0.664319 N PPV vertices = 10 at Vz = -8 -6 -4 -2 0 2 4 6 8 10

    Primaries that passed the cut:

    id = 177 flag = 301 Nhits = 35 Npos = 40 pt = 0.751434 frac = 0.875

    id = 433 flag = 801 Nhits = 7 Npos = 11 pt = 1.16509 frac = 0.636364

    id = 437 flag = 801 Nhits = 6 Npos = 10 pt = 0.277501 frac = 0.6

    id = 440 flag = 801 Nhits = 6 Npos = 11 pt = 6.03469 frac = 0.545455

     

    daq #70 EventID = 4447 vpdVz = 25.5813 N PPV vertices = 10 at Vz = 18 20 22 24 26 28 30 32 34 36

    Primaries that passed the cut:

    id = 324 flag = 311 Nhits = 9 Npos = 10 pt = 0.50257 frac = 0.9

    id = 391 flag = 801 Nhits = 6 Npos = 11 pt = 0.332764 frac = 0.545455

    Jan's notes

     -------- Loose notes, needs cleanup, Jan

    Lidia's most recent test production of 3 runs (89 files) also had
    CTB matching off

     Low Lumi (run9060086), Mid Lumi (run9069059), High Lumi (run 9068124)
     /star/data10/reco/ppProduction2008/ReversedFullField/P08ic_test/2008/*/*
     log files on /star/rcf/prodlog/P08ic_test/log/daq

    --------------

    Akio's event classification txt file from 3 files (with different luminosities) at the bottom of

    http://www.star.bnl.gov/protected/spin/akio/200806/index_5th.html

    from Lidia's test production. I checked that I get identical results
    with Jan's production and Lidia's for the low lumi run.

    -------------

    Just a reminder that there is an "accidental match" between the VPD and
    TPC vertex of up to ~25% (@ high lumi) under the peak.

    --------------

    * could you show Rosi how to print VPD & BBC vertex position in BFC?

    Once the MuDst is created, you can get the VPD vertex (from the TOF electronics) via
       StMuEvent->vpdVz()

    For BBC I have not yet implemented the calibrated vertex. Once done,
    one should be able to get it from
       StEvent->triggerDetectorCollection()->bbc()->zVertex()
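
    A minimal sketch of this access pattern, assuming the usual StMuDstMaker macro setup (the file name and the 8 cm cut are taken as illustrative; constructor arguments or return codes may differ slightly between library versions):

        // Sketch: compare the VPD vertex with the rank-0 PPV vertex in a MuDst.
        void vpdVsPpv(const char* file = "st_fmsslow.MuDst.root") {  // name illustrative
          StMuDstMaker maker(0, 0, "", file, "MuDst", 1);
          while (maker.Make() == kStOK) {                  // loop over events
            StMuDst* mu = maker.muDst();
            double vpdVz = mu->event()->vpdVz();           // VPD vertex (TOF electronics)
            if (mu->numberOfPrimaryVertices() < 1) continue;
            double ppvVz = mu->primaryVertex(0)->position().z();  // rank-0 PPV vertex
            printf("vpdVz=%7.2f ppvVz=%7.2f match8cm=%d\n",
                   vpdVz, ppvVz, fabs(ppvVz - vpdVz) < 8.);
          }
        }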

    --------- Rosi ---------

    While I was doing this, I spun a macro over the MuDst in the folder
    above and duplicated some of Akio's results, just to make sure I
    understand.

     


    So here are the z positions of the ranks 1 and 2 vertices and the vpd:

    Fig 1.

     

    Locations of Akio's pages in protected/spin

    August 13, 3 runs produced with PPV w/o using CTB

    ----- 

     

    PPV vs. VPD , by Xin

    Fig 1. VPD-Minuit, d-Au 2008 events, minB events: ZDC East+VPD

     


    Fig 2. VPD-PPV, p-p 2008 events, st_physics , no trig selection, mostly E & BEMC triggers

     

    Hi Jan, Akio,
    
    Xiaoping helped me to check the test 2008 pp production data Jan suggested. Please take a look at the attached plot. Basically what is plotted here is the vz difference between the vpd vertex and the tpc vertex for different vpd hit configurations, similar to that on Akio's web page, but using the TOF electronics for the vpd hit configuration selection.
    
    So firstly, we didn't see the ~30cm width gaussian component. The two narrow gaussian components are attributed to the VPD resolution. In the VPD timing resolution, we see a double gaussian structure, with about a factor of 2-3 difference in widths. This is consistent with what we see here.  The largest ~50-60 cm gaussian component should be due to the failure of the TPC VF, and it is related to the beam vz distribution.
    
    And secondly, the resolution in the (E,W)>(1,1) configuration is expected to be better than in the (E,W)=(1,1) configuration. The "dilution" of the vpd resolution with more hits does not seem true to us.
    
    Generally, we don't see an obvious issue on the VPD side. I am not sure how the result will change when you use the DSM for selection. Or maybe your statistics are not good enough? Or the data are from some bad TOF runs?
    
    Thanks and Best Regards
    
    /xin
    
    Jan Balewski wrote:
    Hi Xin,
    Those 2 analyses do not need to be contradictory.
    There is much less pileup in dAu than in pp. There may be beam background in pp.
    
    Can you investigate this effect in the 2008 pp data from the production requested by Matt ~2 weeks ago? It is done; the files are on data09,10.
    http://www.star.bnl.gov/HyperNews-star/get/starprod/249/4/1/1/1/1/2.html
    It is 3K daq files, 1M events w/ TPC; 1/4 of the events have a VPD vertex, and ~90% have a TPC vertex produced by the fixed PPV.
    
    Thanks
    Jan
    
    
    
    On Oct 1, 2008, at 11:40 PM, Xin Dong wrote:
    
    Hi Akio,
    
    Thanks for this message. Actually we always see that the resolution improves if we require more VPD hits. I don't quite understand the ~25cm gaussian distribution at this step. Xiaoping helped me check the dAu data (we don't have a TPC vertex in the pptoftpx triggered data); you can find the distribution in the attached plot. It shows that with more VPD hits required, the vertex resolution is better. No 25cm-width gaussian contribution appears.
    
    So  let me answer your questions directly, see them inline.
    Akio Ogawa wrote:
    Hello
    
    I posted this yesterday to the vertex mailing list. I'd like to make
    sure you know this since you may be more interested than us.
    
    In zVPD-zTPC distribution at pp, we see 3 structures. See Fig 2 (2 gaussian fit) and Fig6 (3 gaussian fit) of
    http://drupal.star.bnl.gov/STAR/blog-entry/rjreed/2008/sep/11/ppv-revision-1-29-high-luminosity-fmsslow-trigger-evaluation 
    
    First is a sigma ~3cm peak where the TPC and VPD vertices match. Quite
    reasonable given the resolutions of those two vertex finders.
    
    2nd is sigma ~80cm, which is understandable if the TPC and VPD picked
    up 2 different vertices. The vertex distribution is ~gaussian with sigma
    ~60cm. If we pick 2 randomly and take the difference, then the sigma should be sqrt(2)*60cm ~ 85cm.
    
    The 3rd one is the mystery. The sigma is around 25-30cm, so it's much
    narrower than random. It's hard for the TPC to "miss" the vertex by
    10-20cm, since every track's DCA_z is <3cm.
    
    Rosi changed the selection of the TPC vertex (more matched tracks) to
    make the TPC vertex better; she saw no difference in the structure.
    
    Now if we look at the plot at the bottom of http://www.star.bnl.gov/protected/spin/akio/200806/index_8th.html
    
    which is essentially the same plot but divided by the # of VPD hits.
    This 3rd structure with sigma~25cm is most evident when both VPD-E
    and VPD-W have 2 or more hits.
    
    This suggests (at least to me) that when you have more than one hit
    in the VPD and take the time average, sometimes you are diluting the VPD vertex resolution. This can be "real" (another collision in the same crossing
    or some beam halo hitting the VPD) or detector effects (hot pmt, too
    loose timing cut, etc).
    
    Have you seen this?
    ==>Xin
    No.
    Is there a way to get some more info from the mudst?
    ==>Xin
    The number of hits and the Tdiff information should be available from the MuDst. A Tdiff cut may help some in resolution, but shouldn't create a 20cm gaussian peak.
    (For example distance or rms of hits included in average?)
    ==>Xin
    I don't quite understand, distance or rms of hits to what?
    Is there some cut you tuned when accepting hits to form the average?
    ==> Xin
    Yes. We have already removed the hits with non-physical timing information (out of the trigger timing window, but for sure within the 25ns resolution). And we always take the earliest hit.
    (for example maximum time difference?)
    Is average weighted by ToT?
    ==> Xin
    No. Supposedly the ToT dependence is calibrated. We just do a simple average.
    Have you tried taking earliest hit only?
    ==> Xin
    Yes.
    
    We will try to see what the pp data look like. Thanks
    
    /xin
    
    

     

    Plots Jan

     plots

    2009 algo upgrade 1

     The following deficiencies of the CVS version of PPV were corrected in December 2008

    1.  PPV requires at least 2 matched tracks for a valid primary vertex saved in muDst with positive rank. 

      Problem: For W-events there may be fewer than 2 tracks in the eta range [-1,+1.4] to satisfy the above requirement. Although a recent change in PPV causes an additional 5 sub-prime vertices to be saved with negative rank, it does not guarantee that a vertex containing just a single 20+ GeV track will beat the other minBias vertices from the pileup and make it into this top 5.

      Remedy: Extend the criteria for a valid primary vertex and also save those vertices which contain at least one track with pT>15 GeV matched to a BTOW or ETOW tower with ADC>=MIP. I do not want to impose an ADC>1000 cut, because a reco high pT electron track may miss the hit-tower and point to the neighboring one. There will be very few events with such high pT tracks, so on average the # of vertices per event will not change.

       

      Implementation: PPV 'natural ranking' for pp events has a dynamic range of [0+eps ... 10,000]. The rank is then shifted by vertex category as follows (see the sketch after this list):

      • for vertices with 2 or more matched tracks, the PPV rank is increased by 1,000,000 (so the vertex will be at the top of the vertex list)
      • for vertices with only one high pT track (for W reco), the PPV rank is left as is (it will be in the middle of the vertex list)
      • for (the first five) sub-prime vertices (for Akio's study), 1,000,000 is subtracted from the 'natural' PPV rank, so it will be a negative number
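
      A minimal sketch of this rank shift (variable names are illustrative; the actual PPV code may differ):

        // Sketch of the rank shift described above (illustrative names).
        // naturalRank is the PPV 'natural' rank in (0, 10000].
        float shiftRank(float naturalRank, int nMatchedTracks, bool isSubPrime) {
          if (isSubPrime)          return naturalRank - 1000000.;  // negative: sub-prime
          if (nMatchedTracks >= 2) return naturalRank + 1000000.;  // top of the list
          return naturalRank;    // single high-pT-track vertex (W reco): left as is
        }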

       

      Consequences: A 1-track vertex will be listed after any 2+ track vertex. However, if no 2-track vertex is found, the 1-track vertex will be first on the list with positive rank. I maintain that people should not use the top rank PPV vertex blindly but should QA the prime vertices according to their needs.


    2. Modified PPV produces vertices with the following rank distribution:

      Fig 1. Example of the PPV vertex rank distribution for 200 W-events generated by Pythia. The top plot shows the vertex rank, X-axis range [-1.2e6, +1.2e6]. The 3 groups of vertices are those with 2 or more matched tracks (rightmost), 1-track vertices (12 events, in the middle), and sub-prime vertices (negative rank). The bottom plot shows the same events but Log10(rank) is used on the X-axis to make these 3 categories better visible.


      Fig 2. Example of PPV vertex rank distribution for 200 st_physics 2008 pp events. 

    3. The plot below shows the ADC of BTOW towers pointed to by TPC MIP tracks, for a subset of ~200 towers, from fmsslow pp 2008 events. The red band is the pedestal, the green band is the MIP peak position. The maximum of the MIP distribution is at ADC~17.

      No change needed: PPV is tagging a BTOW tower as fired if ADC>8. This item is here because I misremembered something about the PPV code; the plot is correct and nice so I leave it for the record.

    4. Fig 3.

       

       

     

    2009 evaluation

     04 : MC study on PPV with BTOF in Run 9 geometry (Xin blog)

    01 transverse vertex , pp 500 data & Pythia (Rosi, Jan)

     Study of vertex reconstruction in transverse X-Y plane, pp 500 data from 2009, high PT events from W-stream

    A large variance in the initial determination of the beam line constraint for pp 500 data has been observed. The concern was that the reconstruction accuracy for high pT TPC electron tracks from W decay may not be sufficient.

    At first, the simpleminded idea of increasing the minimal pT of the used tracks and imposing a high track multiplicity did not improve the accuracy of vertex determination in the transverse plane.

    Next we looked at individual events passing the following selection criteria, selecting the most likely primary track candidates from the pool of global tracks (a sketch of these cuts follows the list):

    • use only global tracks matched to BTOW,ETOW, or central membrane
    • require nFit/nPoss>51%, and Rxy @ DCA to X=Y=0 below 3 cm
    • require PPV finds valid vertex along Z direction, require delZ <3cm
    • require global pT in range [0.8, 8.0] GeV/c
    • require Sti extrapolation error in transverse plane is below 3 mm
    • require at least 5 tracks passed criteria listed above
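
    A schematic of these cuts as one predicate, a sketch only (the struct and field names are hypothetical):

        // Hypothetical sketch of the track preselection listed above.
        struct TrackQA {
          bool   towerOrCM;   // matched to BTOW, ETOW, or central membrane
          int    nFit, nPoss; // fit points / possible points
          double rxy;         // Rxy at DCA to X=Y=0, cm
          double delZ;        // |track z - PPV vertex z|, cm
          double pt;          // global pT, GeV/c
          double sigTrans;    // Sti extrapolation error in the transverse plane, cm
        };
        bool passBeamLineCuts(const TrackQA& t) {
          return t.towerOrCM
              && t.nFit > 0.51 * t.nPoss
              && t.rxy  < 3.0
              && t.delZ < 3.0
              && t.pt >= 0.8 && t.pt <= 8.0
              && t.sigTrans < 0.3;     // 3 mm
        }
        // An event is kept only if at least 5 tracks pass these cuts.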

      
    Fig 1. Typical spectra for some of the cut parameters for W-stream pp 500 events

    Tracks passing the selection are approximated by straight lines in the vicinity of the DCA to X=Y=0 and shown in Fig 2. The Z-axis range is always 6 cm, centered at the maximum likelihood position from PPV.
    The following encoding was added to the plots:
    * the head of the arrow indicates the direction of the momentum vector
    * the size of the arrow is proportional to the track pT; the max pT for a given set of tracks (event) is in the title of the left histograms
    * the thickness of the line is proportional to the weight of the track in the vertex (or beam line) determination; I used the formula:
           width = 3.*(0.15*0.15)/sig/sig; , where sig = sigYlocal from Sti.
    (The last 2 conditions sometimes interfere, since a thicker line also increases the arrow size, but the plots should still help us gain intuition.)


    Fig 2. Projections of global tracks at the most likely vertex location. One event per row, two projections: Y vs. X and Y vs. Z.
    Stray tracks are most likely from pileup or from decays matched to fired EMC towers.
    The width of an arrow is proportional to the likelihood that the vertex is below it (~1/track error^2).

    Attachments A,B show more real data events.

    Attachments C,D show M-C Pythia QCD events with partonic pT>10 & 20 GeV, respectively. C has fixed vertex offset, D has varied vertex offset.


    Conclusion:

    * Very often one sees 2 jets, which impedes the determination of the transverse vertex position on an event by event basis, in particular if the vertex finder is not returning the non-diagonal covariance matrix element covXY (see the last event in Fig 2.)

    * We will pursue an alternative method of beam line determination by fitting its equation directly to preselected tracks from multiple events, skipping event by event vertex determination.

     

    02 3D beam line fit to tracks, no tilt

    Stand alone 3D beam line fitter developed by Jan & Rosi in June 2009
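
    The idea, sketched here with a simple chi-square stand-in for the likelihood described in Attachment B (types and names are illustrative, not the actual fitter code): parameterize the beam line as x(z) = x0 + ux*z, y(z) = y0 + uy*z and minimize the weighted transverse residuals of all preselected tracks to that line.

        // Sketch of the beam-line fit objective (illustrative, not likeFuncBeamLine3D.cxx).
        #include <vector>
        struct TrackPoint { double x, y, z, sig; };  // track position near DCA + its error
        double chi2BeamLine(double x0, double y0, double ux, double uy,
                            const std::vector<TrackPoint>& pts) {
          double chi2 = 0.;
          for (const TrackPoint& p : pts) {
            double dx = p.x - (x0 + ux * p.z);       // transverse residuals to the line
            double dy = p.y - (y0 + uy * p.z);
            chi2 += (dx*dx + dy*dy) / (p.sig * p.sig);  // weight ~ 1/error^2
          }
          return chi2;   // minimize over (x0, y0, ux, uy), e.g. with TMinuit
        }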

     

    Fig 2. Example of X0,Y0 fit for pp 500 data F10415, more in att. A)


    Attachment A): slides vetting 3D beam-line  fitting algo

    Attachment B): document describing math used to compute 3D likelihood

    Attachment C: Source code for fitting contains:

     

    * Main program: mainFitBeamLine3D.cxx
    * Likelihood computation: likeFuncBeamLine3D.cxx
    * Utility class doing all I/O and controlling the likelihood computation and histos: UtilBeamLine3D.cxx, UtilBeamLine3D.h
    * Plotting macro: pl3DfitB.C
     

     

    03 2009 PPV upgrade #1

     The PPV vertex finder code has been upgraded and its defaults changed as follows.

    (July 9, 2009)

    1) The threshold on the pT of single-matched-track vertices was lowered from 15 GeV/c to 10 GeV/c.
    The purpose of this change is to not lose W-events in the case when the TPC calibration is approximate and the error on the reco pT of a TPC track is sizable for tracks with a true pT of 20 GeV/c.
    Those vertices will now be more likely pileup contaminated, since there is a fair chance of a random match of a global track to a fired BTOW tower. Users should use vertices with at least 2 matched tracks, which will have rank>1e6.

    2) Additional expert-only functionality was added to PPV, encapsulated in the new class Vertex3D.

    If BFC is run in the normal way, e.g. in production, no new action is taken.

    However, if the BFC option "VtxSeedCalG" is added, then for every event high quality, most likely primary track candidates are selected from the pool of global tracks:

    • use only global tracks matched to BTOW,ETOW, or central membrane
    • require nFit/nPoss>51%, and Rxy @ DCA to X=Y=0 below 3 cm
    • require PPV finds valid vertex along Z direction, require delZ <3cm
    • require global pT in range [0.8, 8.0] GeV/c

    and printed into the logfile in the format:

          printf("track4beamLine %f %f %f   %f %f %f   %f %f %f   %d %f  %.1f %d \n",x,y,z,px,py,pz,er->_cYY,er->_cZY,er->_cZZ , tr.nFitPoint,tr.gChi2,z0,eveID);

    and captured via the 'grep track4beamLine logFile' command.
     
    Furthermore, a local histogram file named 'ppv' is created with various PPV monitoring histos. Examples of the plots are shown below. For the first 50 events there are plots of the X_Y & Z_Y projections of the selected high quality tracks in the vicinity of the beam line (Fig 1.) The macro StRoot/StGenericVertexMaker/macros/plPPVtrack4beamLine.C can be helpful in displaying those; see Fig 2.
       

    Fig. 1. Primary track candidates in the vicinity of the beam line


    Fig. 2. QA plots for the primary track selection

    STAR Vertex Finder Performance Comparison

    Comparison grid (one plot per cell):
    Columns: Minuit VF, KFV, PPV, PPV + fitter
    Rows: Num Vertices, Num Tracks per event, Num Tracks per vertex, Vertex X, Vertex Y, Vertex Z, Vertex Error X, Vertex Error Y, Vertex Error Z

     


    Comparison of the total error magnitude of the reconstructed vertex position. The data are from a Run 13 W simulation without pile-up. The vertex is reconstructed without (left) and with (right) the proposed PPV fitter

    PPV as is / PPV w/ fitter

     


    PPV / KFV / Minuit

    Simulation

    Welcome to the Simulation Pages!

    Please note that most of the material posted before October 2006 is located at the older web site which we'll keep for reference, for the time being. See sections below for most recent additions and information.

    For making a new simulation production request, please consult the STAR Simulations Requests interface.

     

     

    Adding a New Detector to STAR

    The STAR Geometry Model

    The STAR Geometry is implemented in geant3, which provides the geometry description to STAR's Monte Carlo application, starsim.   The geant3 model is
    implemented using the Advanced Geometry Interface for GSTAR (AGI) language.   AGI provides a flexible and robust framework in which detector
    geometries can be quickly implemented.  STAR is currently migrating from the AGI language to a related framework called AgML.  AgML stands for
    "Another Geometry Modelling Language."  It is based on XML, and is the preferred language in which new geometries should be implemented.   AgML
    provides backwards compatibility with the AGI language, in order to continue supporting the starsim application as we transition to a new STAR virtual
    Monte Carlo application.

    Geometry Definition

    Users wishing to develop and integrate new detector models into the STAR framework will be interested in the following links:

    Tracking Interface (Stv)

    Exporting detector hits

    1. Implement a hit class based on StEvent/StHit
    2. Implement a hit collection
    3. Implement an iterator over your hit collection based on StEventUtilities/StHitIter
    4. Add your hit iterator to the StEventUtilities/StEventHitIter
       

     

    Implementing a custom seed finder

    ID Truth

    ID truth is an ID which enables us to determine which simulated particle was principally responsible for the creation of a hit in a detector, and eventually the physics objects (clusters, points, tracks) which are formed from them.  The StHit class has a member function which takes two arguments:

    • idTru -- the primary key of the simulated particle, i.e. "track_p" in the g2t hit structure
    • qaTru -- the quality of the truth value, defined as the dominant contributor to the hit, cluster, point or track.
    Implementation of ID truth begins in the slow simulator.  Here, every hit which is created should have an ID truth value assigned. 

    When hits are built up into clusters, the clustering algorithm should set the idtruth value for the cluster based on the dominant contributor of the hits which make up the cluster.

    When clusters are associated into space points, the point finding algorithm should set the idtruth value for the point.  In the event that two clusters are combined with two different idTruth values, you should set idTruth = 0.
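    To make the dominant-contributor rule concrete, here is a small self-contained sketch (not STAR code; the types are illustrative stand-ins) that computes idTruth and qaTruth for a cluster from its hits, with qaTruth expressed as the dominant contributor's percentage of the total signal:

    #include <map>
    #include <vector>
    #include <cstdio>

    struct TruthHit { int idTruth; float energy; };  // stand-in for one hit's contribution

    // Return the dominant contributor's id and its quality (percent of total signal).
    void clusterTruth(const std::vector<TruthHit>& hits, int& idTruth, int& qaTruth) {
        std::map<int, float> sum;   // summed signal per simulated-track id
        float total = 0;
        for (const TruthHit& h : hits) { sum[h.idTruth] += h.energy; total += h.energy; }
        idTruth = 0; qaTruth = 0;
        float best = 0;
        for (const auto& kv : sum)
            if (kv.second > best) { best = kv.second; idTruth = kv.first; }
        if (total > 0) qaTruth = int(100 * best / total);
    }

    int main() {
        std::vector<TruthHit> hits = { {7, 1.0f}, {7, 2.0f}, {9, 1.0f} };
        int id, qa;
        clusterTruth(hits, id, qa);
        std::printf("idTruth=%d qaTruth=%d%%\n", id, qa);  // prints idTruth=7 qaTruth=75%
        return 0;
    }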


    Interface to Starsim

    The interface between starsim and reconstruction is briefly outlined here


    Information about geometries used in production and which geometries to use in simulations may be found in the following links:

    • Existing Geometry Tags used in Production
    • The STAR Geometry in simulation & reconstruction contains useful information on the detector configurations associated with a unique geometry tag.  Production geometry tags state the year for which the tag is valid, and a letter indicating the revision level of the geometry.  For example, "y2009c" indicates the third revision of the 2009 configuration of the STAR detector.  Users wishing to run starsim in their private areas are encouraged to use the most recent revision for the year in which they want to compare to data.

    Comparisons between the original AgSTAR model and the new AgML model of the detector may be found here:

    AgML Project Overview and Readiness for 2012




    HOWTO Use Geometries defined in AgML in STARSIM
    AgML geometries are available for use in simulation using the "eval" libraries. 
    $ starver eval
    The geometries themselves are available in a special library, which is set up for backwards compatibility with starsim.  To use the geometries you load the "xgeometry.so" library in a starsim session, either interactively or in a macro:
    starsim> detp geom y2012

    starsim> gexe $STAR_LIB/xgeometry.so
    starsim> gclos all
     
    HOWTO Use Geometries defined in AgML in the Big Full Chain
    AgML geometries may also be used in reconstruction.  To access them, the "agml" flag should be provided in the chain being run, e.g.
     
    root [0] .L bfc.C
    root [1] bfc(nevents,"y2012 agml ...", inputFile);

     

    Geometry in Preparation: y2012

    Major changes:

    1. Support cone, FTPC, SSD, and PMD removed.
    2. Inner Detector Support Module (IDSM) added
    3. Forward GEM Tracker (FGTD) added
     
    Use of AgML geometries within starsim:
     
    $ starver eval
    $ starsim
    starsim> detp geom y2012
    starsim> gexe $STAR_LIB/xgeometry.so
    starsim> gclos all
     
    Use of AgML geometries within the big full chain:
    $ root4star
    root [0] .L bfc.C
    root [1] bfc(0,"y2012 agml ...",inputFile);
     

    Current (10/24/2011) configuration of the IDSM with FGT inside --

     

     

    Page maintained by Jason Webb <jwebb@bnl.gov>

     

    AgML Example: The Beam Beam Counters

    <Document  file="StarVMC/Geometry/BbcmGeo/BbcmGeo.xml">
    <!--
     Every AgML document begins with a Document tag, which takes a single "file"
     attribute as its argument.
     -->

    <Module name="BbcmGeo" comment=" is the Beam Beam Counter Modules GEOmetry "  >
    <!--
     The Module tag declares an AgML module.  The name should consist of a four
     letter acronym, followed by the word "geo" and possibly a version number.

     e.g. BbcmGeo, EcalGeo6, TpceGeo3a, etc...

     A mandatory comment attribute provides a short description of which detector
     is implemented by the module.
     -->

      <Created date="15 march 2002"   />
      <Author  name="Yiqun Wang"      />
      <!-- The Created and Author tags accept a free-form date and author, for the
           purposes of documentation. -->

      <CDE>AGECOM,GCONST,GCUNIT</CDE>
      <!-- The CDE tag provides some backwards compatibility features with starsim.
           AGECOM, GCONST and GCUNIT are fine for most modules. -->

      <Content>BBCM,BBCA,THXM,SHXT,BPOL,CLAD</Content>
      <!-- The Content tag should declare the names of all volumes which are
           declared in the detector module.  A comma-separated list.  -->

      <Structure name="BBCG"  >
        <var name="version"  />
        <var name="onoff(3)" />
        <var name="zdis(2)"  />
      </Structure>
      <!-- The Structure tag declares an AgML structure.  It is similar to a c-
           struct, but has some important differences which will be illustrated
           later.  The members of a Structure are declared using the var tag.  By
           default, the type of a var will be a float.

           Arrays are declared by enclosing the dimensions of the array in
           parentheses.  Only 1D and 2D arrays are supported.  e.g.

           <var name="x(3)"     />   allowed
           <var name="y(3,3)"   />   allowed
           <var name="z(4,4,4)" />   not allowed

           Types may be declared explicitly using the type parameter as below.
           Valid types are int, float and char.  char variables should be limited
           to four-character strings for backwards compatibility with starsim.
           Arrays of chars are allowed, in which case you may treat the variable
           as a string of length Nx4, where N is the dimension of the array.
           -->

      <Structure name="HEXG">
        <var name="type"    type="float"  />
        <var name="irad"    type="float"  />
        <var name="clad"    type="float"  />
        <var name="thick"   type="float"  />
        <var name="zoffset" type="float"  />
        <var name="xoffset" type="float"  />
        <var name="yoffset" type="float"  />
      </Structure>

      <varlist type="float">
         actr,srad,lrad,ztotal,x0,y0,theta0,phi0,xtrip,ytrip,rtrip,thetrip,rsing,thesing
      </varlist>
      <!-- The varlist tag allows you to declare a list of variables of a stated type.
           The variables will be in scope for all volumes declared in the module.

           Variables may be initialized using the syntax
                var1/value1/ , var2/value2/, var3, var4/value4/ ...

           Arrays of 1 or 2 dimensions may also be declared.  The Assign tag may
           be used to assign values to the arrays:

           <Assign var="ARRAY" value="{1,2,3,4}" />
           -->

      <varlist type="int">I_trip/0/,J_sing/0/</varlist>

      <Fill  name="BBCG"    comment="BBC geometry">
        <var name="Version" value="1.0"              comment=" Geometry version "  />
        <var name="Onoff"   value="{3,3,3}"          comment=" 0 off, 1 west on, 2 east on, 3 both on: for BBC,Small tiles,Large tiles "  />
        <var name="zdis"    value="{374.24,-374.24}" comment=" z-coord from center in STAR (715/2+6*2.54+1=373.8) "  />
      </Fill>
      <!-- The members of a structure are filled inside of a Fill block.  The Fill
           tag specifies the name of the structure being filled, and accepts a
           mandatory comment for documentation purposes.

           The var tag is used to fill the members of the structure.  In this
           context, it accepts three arguments: the name of the structure member,
           the value which should be filled, and a mandatory comment for
           documentation purposes.

           The names of variables, structures and structure members are case-
           insensitive.

           1D arrays are filled using a comma-separated list of values contained in
           curly brackets...

           e.g. value="{1,2,3,4,5}"

           2D arrays are filled using a comma- and semicolon-separated list of values

           e.g. value="{11,12,13,14,15;        This fills an array dimensioned
                        21,22,23,24,25;        as A(3,5)
                        31,32,33,34,35;}"

           -->

      <Fill name="HEXG" comment="hexagon tile geometry"  >
        <var name="Type"    value="1"     comment="1 for small hex tile, 2 for large tile "  />
        <var name="irad"    value="4.174" comment="inscribing circle radius =9.64/2*sin(60)=4.174 "  />
        <var name="clad"    value="0.1"   comment="cladding thickness "  />
        <var name="thick"   value="1.0"   comment="thickness of tile "  />
        <var name="zoffset" value="1.5"   comment="z-offset from center of BBCW (1), or BBCE (2) "  />
        <var name="xoffset" value="0.0"   comment="x-offset center from beam for BBCW (1), or BBCE (2) "  />
        <var name="yoffset" value="0.0"   comment="y-offset center from beam for BBCW (1), or BBCE (2) "  />
      </Fill>

      <Fill name="HEXG" comment="hexagon tile geometry"  >
        <var name="Type"    value="2"      comment="1 for small hex tile, 2 for large tile "  />
        <var name="irad"    value="16.697" comment="inscribing circle radius (4x that of small one) "  />
        <var name="clad"    value="0.1"    comment="cladding of tile "  />
        <var name="thick"   value="1.0"    comment="thickness of tile "  />
        <var name="zoffset" value="-1.5"   comment="z-offset from center of BBCW (1), or BBCE (2) "  />
        <var name="xoffset" value="0.0"    comment="x-offset center from beam for BBCW (1), or BBCE (2) "  />
        <var name="yoffset" value="0.0"    comment="y-offset center from beam for BBCW (1), or BBCE (2) "  />
      </Fill>

      <Use struct="BBCG"/>
      <!-- An important difference between AgML structures and c-structs is that
           only one instance of an AgML structure is allowed in a geometry module,
           and there is no need for the user to create it... it is automatically
           generated.  The Fill blocks store multiple versions of this structure
           in an external name space.  In order to access the different versions
           of a structure, the Use tag is invoked.

           Use takes one mandatory attribute: the name of the structure to use.
           By default, the first set of values declared in the Fill block will
           be loaded, as above.

           The Use tag may also be used to select the version of the structure
           which is loaded.

           Example:
              <Use struct="hexg" select="type" value="2" />

           The above example loads the second version of the HEXG structure
           declared above.

           NOTE: The behavior of a structure is not well defined before the
                 Use operator is applied.

           -->

      <Print level="1" fmt="'BBCMGEO version ', F4.2"  >
        bbcg_version
      </Print>
      <!-- The Print statement takes a print "level" and a format descriptor "fmt".
           The format descriptor follows the Fortran formatting convention.

           (n.b. Print statements have not been implemented in ROOT export
                 as they utilize Fortran format descriptors)
           -->

      <!-- small kludge x10000 because ROOT will cast these to (int) before computing properties -->
      <Mixture name="ALKAP" dens="1.432"  >
        <Component name="C5" a="12" z="6"  w="5      *10000"  />
        <Component name="H4" a="1"  z="1"  w="4      *10000"  />
        <Component name="O2" a="16" z="8"  w="2      *10000"  />
        <Component name="Al" a="27" z="13" w="0.2302 *10000"  />
      </Mixture>
      <!-- Mixtures and Materials may be declared within the module... this one is not
           a good example, as there is a workaround being used to avoid some issues
           with ROOT vs GEANT compatibility. -->

      <Use struct="HEXG" select="type" value="1 "  />
         srad   = hexg_irad*6.0;
         ztotal = hexg_thick+2*abs(hexg_zoffset);

      <Use struct="HEXG" select="type" value="2 "  />
         lrad   = hexg_irad*6.0;
         ztotal = ztotal+hexg_thick+2*abs(hexg_zoffset);  <!-- hexg_zoffset is negative for Large (type=2) -->

      <!-- AgML has limited support for expressions, in the sense that anything which
           is not an XML tag is passed (with minimal parsing) directly to the c++
           or mortran compiler.  A few things are notable in the above lines.

           (1) Lines may be optionally terminated by a ";", but...
           (2) There is no mechanism to break long lines across multiple lines.
           (3) The members of a structure are accessed using an "_", i.e.

               hexg_irad above refers to the IRAD member of the HEXG structure
               loaded by the Use tag.

           (4) Several intrinsic functions are available: abs, cos, sin, etc...
           -->

      <Create block="BBCM"  />
      <!-- The Create operator creates the volume specified in the "block"
           parameter.  When the Create operator is invoked, execution branches
           to the block of code for the specified volume.  In this case, the
           Volume named BBCM below. -->

      <If expr="bbcg_OnOff(1)==1|bbcg_OnOff(1)==3">

        <Placement block="BBCM" in="CAVE"
                  x="0"
                  y="0"
                  z="bbcg_zdis(1)"/>
        <!-- After the volume has been Created, it is positioned within another
             volume in the STAR detector.  The mother volume may be specified
             explicitly with the "in" attribute.

             The position of the volume is specified using x, y and z attributes.

             An additional attribute, konly, is used to indicate whether or
             not the volume is expected to overlap another volume at the same
             level in the geometry tree.  konly="ONLY" indicates no overlap and
             is the default value.  konly="MANY" indicates overlap is possible.

             For more info on ONLY vs MANY, consult the geant 3 manual.
             -->

      </If>

      <If expr="bbcg_OnOff(1)==2|bbcg_OnOff(1)==3"  >
        <Placement block="BBCM" in="CAVE"
                  x="0"
                  y="0"
                  z="bbcg_zdis(2)">
          <Rotation alphay="180"  />
        </Placement>
        <!-- Rotations are specified as additional tags contained within a
             Placement block of code.  The translation of the volume will
             be performed first, followed by any rotations, evaluated in
             the order given. -->

      </If>

      <Print level="1" fmt="'BBCMGEO finished'"></Print>

    <!--
     Volumes are the basic building blocks in AgML.  They represent the un-
     positioned elements of a detector setup.  They are characterized by
     a material, medium, a set of attributes, and a shape.
     -->

    <!--                      === V o l u m e  B B C M ===                      -->
    <Volume name="BBCM" comment="is one BBC East or West module">

      <Material  name="Air" />
      <Medium    name="standard"  />
      <Attribute for="BBCM" seen="0" colo="7"  />
      <!-- The material, medium and attributes should be specified first.  If
           omitted, the volume will inherit the properties of the volume which
           created it.

           NOTE: Be careful when you reorganize a detector module.  If you change
                 where a volume is created, you potentially change the properties
                 which that volume inherits.
           -->

      <Shape type="tube"
         rmin="0"
         rmax="lrad"
         dz="ztotal/2" />
      <!-- After specifying the material, medium and/or attributes of a volume,
           the shape is specified.  The Shape is the only property of a volume
           which *must* be declared.  Further, it must be declared *after* the
           material, medium and attributes.

           Shapes may be any one of the basic 16 shapes in geant 3.  A future
           release will add extrusions and composite shapes to AgML.

           The actual volume (geant3, geant4, TGeo, etc...) will be created at
           this point.
           -->

      <Use struct="HEXG" select="type" value="1 "  />

      <If expr="bbcg_OnOff(2)==1|bbcg_OnOff(2)==3"  >
        <Create    block="BBCA"  />
        <Placement block="BBCA" in="BBCM"
               x="hexg_xoffset"
               y="hexg_yoffset"
               z="hexg_zoffset"/>
      </If>

      <Use struct="HEXG" select="type" value="2 "  />

      <If expr="bbcg_OnOff(3)==1|bbcg_OnOff(3)==3"  >

        <Create block="BBCA"/>
        <Placement block="BBCA" in="BBCM"
               x="hexg_xoffset"
               y="hexg_yoffset"
               z="hexg_zoffset"/>

      </If>

    </Volume>

    <!--                      === V o l u m e  B B C A ===                      -->
    <Volume name="BBCA" comment="is one BBC Annulus module"  >
      <Material name="Air"  />
      <Medium name="standard"  />
      <Attribute for="BBCA" seen="0" colo="3"  />
      <Shape type="tube" dz="hexg_thick/2" rmin="hexg_irad" rmax="hexg_irad*6.0"  />

      x0=hexg_irad*tan(pi/6.0)
      y0=hexg_irad*3.0
      rtrip = sqrt(x0*x0+y0*y0)
      theta0 = atan(y0/x0)

      <Do var="I_trip" from="0" to="5"  >

        phi0 = I_trip*60
        thetrip = theta0+I_trip*pi/3.0
        xtrip = rtrip*cos(thetrip)
        ytrip = rtrip*sin(thetrip)

        <Create block="THXM"  />
        <Placement in="BBCA" y="ytrip" x="xtrip" z="0" konly="'MANY'" block="THXM"  >
          <Rotation thetaz="0" thetax="90" phiz="0" phiy="90+phi0" phix="phi0"  />
        </Placement>

      </Do>

    </Volume>

    <!--                      === V o l u m e  T H X M ===                      -->
    <Volume name="THXM" comment="is one Triple HeXagonal Module"  >
      <Material name="Air"  />
      <Medium name="standard"  />
      <Attribute for="THXM" seen="0" colo="2"  />
      <Shape type="tube" dz="hexg_thick/2" rmin="0" rmax="hexg_irad*2.0/sin(pi/3.0)"  />

      <Do var="J_sing" from="0" to="2"  >

        rsing=hexg_irad/sin(pi/3.0)
        thesing=J_sing*pi*2.0/3.0
        <Create block="SHXT"  />
        <Placement y="rsing*sin(thesing)" x="rsing*cos(thesing)" z="0" block="SHXT" in="THXM"  >
        </Placement>

      </Do>

    </Volume>

    <!--                      === V o l u m e  S H X T ===                      -->
    <Volume name="SHXT" comment="is one Single HeXagonal Tile"  >
      <Material name="Air"  />
      <Medium name="standard"  />
      <Attribute for="SHXT" seen="1" colo="6"  />
      <Shape type="PGON" phi1="0" rmn="{0,0}" rmx="{hexg_irad,hexg_irad}" nz="2" npdiv="6" dphi="360" zi="{-hexg_thick/2,hexg_thick/2}"  />

      actr = hexg_irad-hexg_clad

      <Create block="CLAD"  />
      <Placement y="0" x="0" z="0" block="CLAD" in="SHXT"  >
      </Placement>

      <Create block="BPOL"  />
      <Placement y="0" x="0" z="0" block="BPOL" in="SHXT"  >
      </Placement>

    </Volume>

    <!--                      === V o l u m e  C L A D ===                      -->
    <Volume name="CLAD" comment="is one CLADding of BPOL active region"  >
      <Material name="ALKAP"  />
      <Attribute for="CLAD" seen="1" colo="3"  />
      <Shape type="PGON" phi1="0" rmn="{actr,actr}" rmx="{hexg_irad,hexg_irad}" nz="2" npdiv="6" dphi="360" zi="{-hexg_thick/2,hexg_thick/2}"  />

    </Volume>

    <!--                      === V o l u m e  B P O L ===                      -->
    <Volume name="BPOL" comment="is one Bbc POLystyren active scintillator layer"  >

      <Material name="POLYSTYREN"  />
      <!-- Reference the predefined material polystyrene -->

      <Material name="Cpolystyren" isvol="1"  />
      <!-- By specifying isvol="1", polystyrene is copied into a new material
           named Cpolystyren.  A new material is introduced here in order to
           force the creation of a new medium, which we change with the parameters
           below. -->

      <Attribute for="BPOL" seen="1" colo="4"  />
      <Shape type="PGON" phi1="0" rmn="{0,0}" rmx="{actr,actr}" nz="2" npdiv="6" dphi="360" zi="{-hexg_thick/2,hexg_thick/2}"  />

      <Par name="CUTGAM" value="0.00008"  />
      <Par name="CUTELE" value="0.001"  />
      <Par name="BCUTE"  value="0.0001"  />
      <Par name="CUTNEU" value="0.001"  />
      <Par name="CUTHAD" value="0.001"  />
      <Par name="CUTMUO" value="0.001"  />
      <Par name="BIRK1"  value="1.000"  />
      <Par name="BIRK2"  value="0.013"  />
      <Par name="BIRK3"  value="9.6E-6"  />
      <!--
       Parameters are the Geant3 parameters which may be set via a call to
       GSTPar.
       -->

      <Instrument block="BPOL">
        <Hit meas="tof"  nbits="16" opts="C" min="0" max="1.0E-6" />
        <Hit meas="birk" nbits="0"  opts="C" min="0" max="10"     />
      </Instrument>
      <!-- The Instrument block indicates what information should be saved
           for this volume, and how the information should be packed. -->

    </Volume>

    </Module>
    </Document>

    AgML Tutorials

    Getting started developing geometries for the STAR experiment with AgML.

    Setting up your local environment

    You need to check out several directories and compile in this order:
    
    $ cvs co StarVMC/Geometry
    $ cvs co StarVMC/StarGeometry
    $ cvs co StarVMC/xgeometry
    $ cvs co pams/geometry
    $ cons +StarVMC/Geometry
    $ cons
    


    This will take a while to compile, during which time you can get a cup of coffee, or do your laundry, etc...

    If you only want to visualize the STAR detector, you can checkout:

    $ cvs co StarVMC/Geometry/macros

    Once this is done you can visualize STAR geometries using the viewStarGeometry.C macro in AgML 1, and the loadAgML.C macro in AgML 2.0.

    In AgML 1:

    $ root.exe
    root [0] .L StarVMC/Geometry/macros/viewStarGeometry.C
    root [1] nocache=true
    root [2] viewall=true
    root [3] viewStarGeometry("y2012")

    In AgML 2.0:

    $ root.exe
    root [0] .L StarVMC/Geometry/macros/loadAgML.C
    root [1] loadAgML("y2016")
    root [2] TGeoVolume *cave = gGeoManager->FindVolumeFast("CAVE");
    root [3] cave->Draw("ogl");              // ogl uses the OpenGL viewer
    
    

    Tutorial #1 -- Creating and Placing Volumes

    Start by firing up your favorite text editor... preferably something which does syntax highlighting and checking on XML documents.  Edit the first tutorial geometries located in StarVMC/Geometry/TutrGeo ...

    $ emacs StarVMC/Geometry/TutrGeo/TutrGeo1.xml
    

    This module illustrates how to create a new detector module, how to create and place a simple volume, and how to create and place multiple copies of that volume.  Next, we need to attach this module to a geometry model in order to visualize it.  Geometry models (or "tags") are defined in the StarGeo.xml file. 
     

    $ emacs StarVMC/Geometry/StarGeo.xml
    

    There is a simple geometry, which only defines the CAVE.  It's the first geometry tag called "black hole".  You can add your detector here...
     

    xxx


    $ root.exe
    root [0] .L StarVMC/Geometry/macros/viewStarGeometry.C
    root [1] nocache=true
    root [2] viewStarGeometry("test","TutrGeo1");
    

    The "test" geometry tag is a very simple geometry, implementing only the wide angle hall and the cave.  All detectors, beam pipes, magnets, etc... have been removed.  The second arguement to viewStarGeometry specifies which geometry module(s) are to be built and added to the test geometry.  In this case we add only TutrGeo1.  (A comma-separated list of geometry modules could be provided, if more than one geometry module was to be built).

    Now you can try modifying TutrGeo1.  Feel free to add as many boxes in as many positions as you would like.  Once you have done this, recompile in two steps

    $ cons +StarVMC/Geometry
    $ cons
    

    Tutorial #2 -- A few simple shapes, rotations and reflections

    The second tutorial geometry is in StarVMC/Geometry/TutrGeo/TutrGeo2.xml.  Again, view it using viewStarGeometry.C

    $ root.exe
    root [0] .L viewStarGeometry.C
    root [1] nocache=true
    root [2] viewStarGeometry("test","TutrGeo2")
    

    What does the nocache=true statement do?  It instructs viewStarGeometry.C to recreate the geometry, rather than load it from a root file created the last time you ran the geometry.  By default, if the macro finds a file named "test.root", it will load the geometry from that file to save time.  You don't want this, since you know that you've changed the geometry.

    The second tutorial illustrates a couple more simple shapes:  cones and tubes.  It also illustrates how to create reflections.  Play around with the code a bit, recompile in the normal manner, then try viewing the geometry again.

    Tutorial #3 -- Variables and Structures

    AgML provides variables and structures.  The third tutorial is in StarVMC/Geometry/TutrGeo/TutrGeo3.xml.  Open this up in a text editor and let's look at it.  We define three variables: boxDX, boxDY and boxDZ to hold the dimensions of the box we want to create.  AgML is case-insensitive, so you can write these as boxdx, BoxDY and BOXDZ if you so choose.  In general, choose what looks best and helps you keep track of the code you're writing.

    Next check out the volume "ABOX".  Note how the shape's dx, dy and dz arguments now reference the variables boxDX, boxDY and boxDZ.  This allows us to create multiple versions of the volume ABOX.  Let's view the geometry and see.

    $ root.exe
    root [0] .L StarVMC/Geometry/macros/viewStarGeometry.C
    root [1] nocache=true
    root [2] viewStarGeometry("test","TutrGeo3")
    

    Launch a new TBrowser and open the "test" geometry.  Double click test --> Master Volume --> CAVE --> TUTR.  You now see all of the concrete volumes which have been created by ROOT.  It should look like what you see at the right.  We have "ABOX", but we also have ABO1 and ABO2.  This demonstrates an important concept in AgML.  Each <Volume ...> block actually defines a volume "factory".  It allows you to create multiple versions of a volume, each differing by the shape of the volume.  When the shape is changed, a new volume is created with a nickname, where the last letter in the volume name is replaced by [1 2 3 ... 0 a b c ... z] (then the second to last letter, then the third...).  A quick way to inspect these volumes is shown below.
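    One way to see these nicknames programmatically is to loop over the volumes ROOT has registered; this sketch assumes a geometry has already been built in the session (e.g. by viewStarGeometry.C), so that gGeoManager is set:

    // List every concrete volume whose name starts with "ABO"
    TIter next( gGeoManager->GetListOfVolumes() );
    while ( TGeoVolume* vol = (TGeoVolume*) next() ) {
       TString name = vol->GetName();
       if ( name.BeginsWith("ABO") ) printf( "%s\n", name.Data() );
    }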

    Structures provide an alternate means to define variables.  In order to populate the members of a structure with values, you use the Fill statement.  Multiple fill statements for a given structure may be defined, providing multiple sets of values.  In order to select a given set of values, the <Use ...> operator is invoked.  In TutrGeo3, we create and place 5 different tubes, using the data stored in the Fill statements.

    However, you might notice in the browser that there are only two concrete instances of the tube being created.  What is going on here?  This is another feature of AgML.  When the shape is changed, AgML will look for another concrete volume with exactly the same shape.  If it finds it, it will use that volume.  If it doesn't, then a new volume is created.

    There's a lot going on in this tutorial, so play around with it a bit.

     

    Tutorial #4 -- Some more shapes

     

    AgML vs AgSTAR Comparison

    Abstract: We compare the AgML and AgSTAR descriptions of recent revisions of the STAR Y2005 through Y2011 geometry models.  We are specifically interested in the suitability of the AgML model for tracking.  We therefore plot the material contained in the TPC vs pseudorapidity for (a) all detectors, (b) the time projection chamber, and (c) the sensitive volumes of the time projection chamber.  We also plot (d) the material found in front of the TPC active volumes. 

    Issues with PhmdGeo.xml
    • Two cases where the AgSTAR parser truncates a line in the phmdgeo.g file.  These should be fixed, but this needs to be verified.

    Description of the Plots

    Below you will find four columns of plots, for the highest revision of each geometry from y2005 to the present.  The columns from left to right show comparisons of the material budget for STAR and its daughter volumes, the material budget for the TPC and its immediate daughter volumes, the material budget for the active volumes in the TPC, and the material in front of the active volume of the TPC.  In the context of tracking, the right-most column is the most important.

    Each column contains three plots.  The top plot shows the material budget in the AgML model.  The middle plot, the material budget in the AgSTAR model.  The bottom plot shows the difference divided by the AgSTAR model.  The y-axis on the difference plot extends between -2.5% and +2.5%.

     --------------------------------


    Attached you will find a much more comprehensive set of plots, contained in a TAR file.  PDF's for every subsystem in each of the following geometry tags are provided.  They show the material budget comparing AgML to AgSTAR for every volume in the subsystem.  They also show a difference plot, equal to the difference divided by the average.  There is a color-coding scheme.  The volume will be coded green if the largest difference is below 1%, and red if it exceeds 1% over an extended range.  Yellow indicates a missing (mis-named) volume in one or the other geometry, and orange indicates a 1% difference over a small area (likely the result of roundoff error in alignments of the geometries).


     

     

    STAR Y2011 Geometry Tag

    Issues with TpceGeo3a.xml
    • TPA1 (tpc inner padrow) differences caused by overlap between TPA1 and a thin layer of G10 in the TPC. 
    • The padrow overlap is in the "prompt hits" region.  We do not (yet) use prompt hits in tracking.
    Issues with PhmdGeo.xml
    • Two cases where the AgSTAR parser truncates a line in the phmdgeo.g file.  These should be fixed, but this needs to be verified.
    (a) Material in STAR Detector and daughters (b) Material in TPC and daughters (c) Material in TPC active volumes (d) Material in front of TPC active volumes

    STAR Y2010c Geometry Tag

    Issues with TpceGeo3a.xml
    • TPA1 (tpc inner padrow) differences caused by overlap between TPA1 and a thin layer of G10 in the TPC. 
    • The padrow overlap is in the "prompt hits" region.  We do not (yet) use prompt hits in tracking.
    Issues with PhmdGeo.xml
    • Two cases where the AgSTAR parser truncates a line in the phmdgeo.g file.  These should be fixed, but this needs to be verified.

     


    (a) Material in STAR Detector and daughters (b) Material in TPC and daughters (c) Material in TPC active volumes (d) Material in front of TPC active volumes

    STAR Y2009c Geometry Tag

    Issues with TpceGeo3a.xml
    • TPA1 (tpc inner padrow) differences caused by overlap between TPA1 and a thin layer of G10 in the TPC. 
    • The padrow overlap is in the "prompt hits" region.  We do not (yet) use prompt hits in tracking.
    Issues with PhmdGeo.xml
    • Two cases where the AgSTAR parser truncates a line in the phmdgeo.g file.  These should be fixed, but this needs to be verified.
    (a) Material in STAR Detector and daughters (b) Material in TPC and daughters (c) Material in TPC active volumes (d) Material in front of TPC active volumes

    STAR Y2008e Geometry Tag

    Global Issues
    • Upstream areas not included in AgML steering routine.
    Issues with TpceGeo3a.xml
    • TPA1 (tpc inner padrow) differences caused by overlap between TPA1 and a thin layer of G10 in the TPC. 
    • The padrow overlap is in the "prompt hits" region.  We do not (yet) use prompt hits in tracking.
    Issues with PhmdGeo.xml
    • Two cases where the AgSTAR parser truncates a line in the phmdgeo.g file.  These should be fixed, but this needs to be verified.
    (a) Material in STAR Detector and daughters (b) Material in TPC and daughters (c) Material in TPC active volumes (d) Material in front of TPC active volumes

    STAR Y2007h Geometry Tag

    Global Issues
    • Upstream areas not included in AgML steering routine.
    Issues with TpceGeo3a.xml
    • TPA1 (tpc inner padrow) differences caused by overlap between TPA1 and a thin layer of G10 in the TPC. 
    • The padrow overlap is in the "prompt hits" region.  We do not (yet) use prompt hits in tracking.
    Issues with PhmdGeo.xml
    • Two cases where the AgSTAR parser truncates a line in the phmdgeo.g file.  These should be fixed, but this needs to be verified.

    Issues with SVT.

    (a) Material in STAR Detector and daughters (b) Material in TPC and daughters (c) Material in TPC active volumes (d) Material in front of TPC active volumes

    STAR Y2006g Geometry Tag

    Global Issues
    • Upstream areas not included in AgML steering routine.

    Note: TpceGeo2.xml does not suffer from the overlap issue in TpceGeo3a.xml

    (a) Material in STAR Detector and daughters (b) Material in TPC and daughters (c) Material in TPC active volumes (d) Material in front of TPC active volumes

    STAR Y2005i Geometry Tag

    Global Issues
    • Upstream areas not included in AgML steering routine.
    Issues with TpceGeo3a.xml
    • TPA1 (tpc inner padrow) differences caused by overlap between TPA1 and a thin layer of G10 in the TPC. 
    • The padrow overlap is in the "prompt hits" region.  We do not (yet) use prompt hits in tracking.
    Issues with PhmdGeo.xml
    • Two cases where the AgSTAR parser truncates a line in the phmdgeo.g file.  These should be fixed, but this needs to be verified.

     

    (a) Material in STAR Detector and daughters (b) Material in TPC and daughters (c) Material in TPC active volumes (d) Material in front of TPC active volumes

    AgML vs AgSTAR tracking comparison

    Attached is a comparison of track reconstruction using the Sti tracker, with AGI and AgML geometries as input.

    Interfacing the New Detector with the STAR Big Full Chain

    As STAR gradually comes to the end of its AA heavy-ion program and more focus is put on polarized pp/pA physics and the future ep/eA project in the eRHIC era, many upgrades are foreseen to strengthen the detector capability in the forward region. These include both the near-term upgrades for the polarized pp program, e.g. the FMS/FSC/FHC calorimeters and FGT/VFGT tracking, and upgrades for eSTAR in about 5 to 10 years. Different detector concepts exist, and optimization is needed to fit them into STAR physics and into the current STAR detector system. To reach a proper solution, a lot of Monte Carlo (MC) work will be carried out, especially in the STAR simulation framework, chosen for its flexibility, robustness and proven performance during the last decade.

     
    During the last 9 months, I have worked with colleagues at BNL to develop a new detector concept for the eSTAR east-side endcap upgrade, to reliably identify the recoil electrons in ep/eA collisions. This detector design consists of several parts, and its functionality needs to be evaluated in the STAR simulation framework. This procedure, which includes implementing a new detector in the STAR system and then generating/analyzing the MC data, requires considerable effort and collaboration with software experts, at least for a "rookie" (like me, who usually analyzes data without much code development). To better organize my own understanding of this, and to provide a guide for people who will do similar jobs in the future, I wrote this note based on my experience. Many software experts helped me a great deal with all kinds of problems I met in this work. I hope this guide can also relieve their burden, to some extent, of being frequently interrupted by new code developers like me (remember they are already over-occupied maintaining the STAR software environment). Since I'm still not a veteran, this simple note will not contain all the pieces, only the necessary parts, so it is likely most suitable for beginners.
     
    - Ming Shao (USTC)
     
     
    Assume you will work on RCF (RACF), because this is the best-maintained place for computing work at STAR. Normally you should work in the evaluation version, since you'll add and tune new detector models which are not part of the current STAR experiment. So please remember to type 'star eval' before you start. You'd better also create a new directory for this.
     
    > mkdir [your work path]
    > cd [your work path]
    > star eval
     
    First you should get the STAR detector geometry description, and then add or modify your new detector. STAR has switched to a new language based on the Extensible Markup Language (XML), called AgML, to describe its detector system. This is done by
     
    cvs co StarVMC/Geometry
    cvs co StarVMC/StarGeometry
    cvs co StarVMC/xgeometry
     
    Then you should create a new directory in StarVMC/Geometry, like all the other detectors, with its name representing your new detector (usually 'XXXXGeo'). For example, I added a new detector geometry named 'EiddGeo' to this directory, which is intended to identify electrons (Electron ID Detector). In this directory (XXXXGeo), you can create and further modify your new detector geometry in the AgML language. You can find an example - StarVMC/Geometry/BbcmGeo/BbcmGeo.xml - which contains a very detailed in-line explanation of this language. You can copy and modify it for your new detector, just following the style. More details about AgML can be found at Jason's webpage (Jason Webb is the AgML expert).
     
    After you finish the modeling of your new detector, try to compile it by typing 'cons' TWICE in your base work directory (the directory containing the StarVMC directory).
     
    > cons
    > cons
     
    Debug your code if the compilation fails, until it succeeds all the way to creating the geometry libraries in the .sl53_gcc432 directory. Then you can check whether the geometry of the new detector satisfies your expectations by plotting it. This can be done by modifying the macros in StarVMC/Geometry and executing them in ROOT.
     
    > root.exe (in the base work directory)
    > .L StarVMC/Geometry/macros/viewStarGeometry.C
    > viewStarGeometry("XXXX")
     
    XXXX is a geometry tag. An example is shown on the right. The geometry of a new detector - EIDD - is plotted along with the STAR TPC. The other detectors and the magnet system are omitted.
     
    For related macro modifications Jason can provide help. Note: Jason must be made aware of your work so that he can modify the geometry tag or create a new one for you.
     
    Once the new detector geometry is fine, you may want to run some simulation events, based on GEANT. Before you can really do MC simulation via STARSIM (formerly known as GSTAR), you should make sure you have instrumented the sensitive detectors and hit digitization in your detector geometry. Then you can initialize STARSIM by typing 'starsim' in your work directory. In STARSIM you can execute your kumac macro to generate some MC events. Kumac is a PAW-based macro language, and for a simple test run (e.g. with file name 'testrun.kumac') it may look like this:
     
    MACRO testrun
      DETP geom devE                           | devE is a tag for eSTAR simulation
      GEXE .$STAR_HOST_SYS/lib/xgeometry.so    | use local geometry
      GCLO all
      MODE All SIMU 1    | 0: no secondary; 1: default secondary; 2: save all secondaries
      GKIN 1 6 2.0 2.0 -1.5 -1.5
      GFILE o test1.fzd
      TRIG 1
    RETURN
     
    This macro will generate one negative muon from the center of the STAR detector with a transverse momentum of 2 GeV/c, at pseudorapidity -1.5. The azimuthal angle is randomly chosen in the range from 0 to 360 degrees. The simulated event, with all hits created in the detector system, is then saved into the file 'test1.fzd'. Before you exit STARSIM, you can print out information about this simulation to check your new detector. For example, you can print all hits created in the detector system by
     
    > gprin hits
     
    The STARSIM manual can be found at this URL. A built-in help command in STARSIM can also help with some details. Just type 'help' in STARSIM and follow the help instructions. Sections 14 (GEANT-related commands), 15 (GSTAR user commands) and 16 (advanced GEANT user interface) of the help may be especially important to read.
     
    If you find the generated hits reasonable and want to go forward, you need to check out and modify the necessary code in the 'pams' directory, which contains the code that transfers GEANT hits from STARSIM into STAR-compatible hit types. In your work directory, do
     
    > cvs co pams
     
    The code contained in 'pams/sim' is especially useful for simulation purposes. Several files at different locations are then to be added or changed. You need to create your own hit type in the pams/sim/idl directory, where all kinds of hit types are defined. The file name is usually g2t_XXX_hit.idl (XXX is a three-character name representing your detector). If your hit type is similar to one that already exists, you can also just use that hit type (so no new hit type is needed). For example, a new hit type 'g2t_etr_hit.idl' was created when I added a new detector (EiddGeo) to STAR.
     
    struct g2t_etr_hit {           /* G2t_etr_hit */
           long      id;            /* primary key */
           long      next_tr_hit_p; /* Id of next hit on same track */
           long      track_p;       /* Id of parent track */
           long      volume_id;     /* STAR volume identification */
           float     de;            /* energy deposition at hit */
           float     ds;            /* path length within padrow */
           float     p[3];          /* local momentum */
           float     tof;           /* time of flight */
           float     x[3];          /* coordinate (Cartesian) */
           float     lgam;          /* ALOG10(GEKin/AMass) */
           float     length;        /* track length up to this hit */
           float     adc;           /* signal in ADC after digitization */
           float     pad;           /* hit pad position used in digitization */
           float     timebucket;    /* hit time position -"- */
    };
     
    This hit type is basically the same as the TPC hit type, since their functionality is similar. If you want this hit to be associated with a track, you also need to modify g2t_track.idl in the same directory. Just add two lines to the g2t_track struct.
     
    long      hit_XXX_p; /* Id of first XXX hit on track linked list */
    and
    long      n_XXX_hit; /* Nhits on XXX */
     
    Several other files that need to be changed are located in another directory, pams/sim/g2t. These files are g2t_XXX.idl, g2t_XXX.F and g2t_volume_id.g (XXX is a three-character name representing your detector). g2t_XXX.idl connects g2t_track and your hit type (g2t_XXX_hit.idl) to your detector, and g2t_XXX.F implements the actual function. You can refer to the code of detectors with similar functionality when writing your own. For example, g2t_tpc.F is for a tracking-type detector with energy loss along the track, g2t_tof.F is for timing, and g2t_emc.F deals with the properties of calorimeter-type detectors. You should basically follow the style in these example files and just make the necessary changes (mostly detector names) related to your detector. For g2t_XXX.idl, an example is
     
    #include "PAM.idl"
    #include "g2t_track.idl"
    #include "g2t_XXX_hit.idl"
    interface g2t_XXX : amiModule{ STAFCV_T call (inout g2t_track g2t_track,
                                                                                         out g2t_XXX_hit g2t_XXX_hit ); };
     
    For g2t_XXX.F, there is one important line: 'call G2R_GET_SYS ('XXXX','YYYY',Iprin,Idigi)', where XXXX is your detector name as it appears in your geometry description 'XXXXGeo.xml', and YYYY is the sensitive volume name in your geometry. For example, it's 'call G2R_GET_SYS ('EIDD','TABD',Iprin,Idigi)' in g2t_etr.F, since the corresponding sensitive volume is 'TABD' in 'EiddGeo.xml'.
     
    The g2t_volume_id.g file must also be modified to identify the new detector's sensitive volumes in an unambiguous way. You need to provide a unique volume_id for each sensitive volume, based on the hit volume id in GEANT, contained in an array numbv. In GEANT, a touchable volume can be uniquely found from its volume architecture, and this architecture is compactly stored in the array numbv: unnecessary parts of the architecture are omitted, provided the sensitive volume can still be located unambiguously. Assume a volume architecture of A containing B, B containing C, and C containing D1 and D2, both of which are sensitive. Then numbv only stores the volume id of D1 or D2, i.e. only one number, since A, B and C are the same for D1/D2. However, if B contains another sensitive volume D3 (parallel to C), numbv will contain two numbers for a hit so that D1/D2/D3 can be uniquely identified. You should use the numbers in numbv to form a final volume_id value, e.g.
     
    elseif (Csys=='etr') then
        sector  = MOD( (numbv(1)-1), 12 );  "Sectors count from 0 - 11"
        layer   =      (numbv(1)-1) / 12;   "Layers count from 0 - 2"
        section = numbv(2) - 1;             "Sections count from 0 - 29"
        volume_id = section + 100*layer + 10000*sector
     
    Please note that the numbv array elements start from 1, not 0. Refer to the other volume_id's in the g2t_volume_id.g file.
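    As a quick sanity check of the packing scheme above, this small standalone snippet (pure illustration, not STAR code) packs and then unpacks a volume_id:

    #include <cstdio>

    int main() {
        int sector = 7, layer = 2, section = 13;
        int volume_id = section + 100*layer + 10000*sector;   // pack as in the g2t_volume_id.g example
        std::printf("volume_id = %d\n", volume_id);           // prints 70213
        std::printf("sector=%d layer=%d section=%d\n",        // recover the fields
                    volume_id / 10000, (volume_id / 100) % 100, volume_id % 100);
        return 0;
    }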
     
    When these modifications are successfully done, you need to re-compile once again, by just typing
     
    > cons
     
    in your work directory. WARNING: when you compile pams with your new detector items for the first time, you're likely to get errors like ".sl53_gcc432/obj/pams/sim/g2t/St_g2t_XXX_Module.cxx:2:31: error: St_g2t_XXX_Module.h: No such file or directory" (XXX is your new detector). If you try "cons" again, the errors may disappear and the compilation may seem to finish fine. However, there might be hidden problems in the compiled code, which may sometimes cause problems at execution time. So here is a trick: when you meet such errors, clean your previous compilation first, then do the following, in this order.
     
    > cons +pams/sim
    > cons +pams
    > cons +StarVMC/Geometry
    > cons
     
    In this way you can get correctly compiled code.
      
    From now on, when you generate your simulation events in STARSIM, the correct type of hits in your new detector will be saved in the GEANT output file. However, the GEANT file is not plainly readable. The code in the STAR software framework that reads these data is the St_geant_Maker, which is contained in the StRoot module. You should check it out from the library,
     
    > cvs co StRoot/St_geant_Maker
     
    The major file you need to modify is St_geant_Maker.cxx. Its header file St_geant_Maker.h can often be left as it is. The following changes are necessary.
     
    Add a line ‘#include "g2t/St_g2t_XXX_Module.h"’ to the header part of the file (XXX represents your detector abbreviation, as defined in pams/sim/g2t). You needn’t worry about this header file since it is automatically generated when you compile pams.
     
    Add a block of code to read the hits from your detector into the STAR TDataSet. An example adding 'etr' hits is shown below.
     
    nhits = 0;
    geant3->Gfnhit("EIDH","TABD", nhits);
    if ( nhits > 0 )
    {
        St_g2t_etr_hit *g2t_etr_hit = new St_g2t_etr_hit("g2t_etr_hit",nhits);
        m_DataSet->Add(g2t_etr_hit);
        iRes = g2t_etr( g2t_track, g2t_etr_hit );
        if ( Debug() > 1 ) g2t_etr_hit->Print(0,10);
    }
     
    Just replace 'etr' with your hit type. Attention must be paid to the line 'geant3->Gfnhit("EIDH","TABD", nhits)'. Here EIDH and TABD represent the names of the detector and the sensitive volume. However, the actual detector name EIDD has been changed to EIDH. This is the protocol - replace the last character of the detector name with 'H' (this also means the first 3 characters of your detector name should differ from those of all other detectors, to avoid ambiguity).
     
    The data retrieved from the GEANT files should now be written out for further processing. As a first step, the simulation events are usually stored in StMcEvent, a STAR class dedicated to recording MC events. You should start by checking out this class to your work directory.
     
    > cvs co StRoot/StMcEvent
    > cvs co StRoot/StMcEventMaker
     
    The second class, 'StMcEventMaker', as its name suggests, is intended to write out all the necessary information from the GEANT simulation to StMcEvent.
     
    There are several classes in the directory StRoot/StMcEvent for you to add and modify. The first two classes are your detector hit definition class and hit collection class, usually with names like StMcXXXHit and StMcXXXHitCollection, where XXX represents your detector. You can refer to other classes in StRoot/StMcEvent with a similar function to your detector, or even just use an existing one if you feel it already contains all the information you need. For myself, I created four new classes, StMcEtrHit, StMcEtrHitCollection, StMcEtfHit and StMcEtfHitCollection, and at the same time used two existing classes, StMcCalorimeterHit and StMcEmcHitCollection, to implement my EIDD detector.
     
    Then you should add your hit collection to the StMcEvent class. In the StMcEvent.hh file, add a line
     
    class StMcXXXHitCollection;
     
    to the header part of this file. Then add your hit collection as a protected member of the StMcEvent class,
     
    StMcXXXHitCollection* mXXXHits;
     
    and the corresponding 'Get' and 'Set' methods as public member functions, respectively
     
    StMcXXXHitCollection* XXXHitCollection() { return mXXXHits; } 
    const StMcXXXHitCollection* XXXHitCollection() const { return mXXXHits; } 
    void setXXXHitCollection(StMcXXXHitCollection*);
     
    Again, here XXX stands for your new detector name. Next, in the StMcEvent.cc file, add the include files
     
    #include "StMcXXXHitCollection.hh"
    #include "StMcXXXHit.hh"
     
    to the header part. Add an initialization call
     
    mXXXHits = new StMcXXXHitCollection();
     
    in the function ‘void StMcEvent::makeColls()’. Implement the ‘Set’ method declared in the StMcEvent.hh file
     
    void StMcEvent::setXXXHitCollection(StMcXXXHitCollection* val) 
    {
        if (mXXXHits && mXXXHits!= val) delete mXXXHits;  
        mXXXHits = val;
    }
     
    You may also want to print out your hits in some cases to check if they are OK. So in
     
    void StMcEvent::Print(Option_t *option) const
     
    function, you need a line or more to do this job. Usually it looks like
     
    PrintHeader(Name,name);
    PrintHitCollection(Name,name);
     
    where PrintHeader and PrintHitCollection are C++ macros defined in the StMcEvent.cc file, and Name/name represent your detector name. There is more than one such macro, so you can choose the one that best suits your case.
    The detector hits caused by charged particles are usually related to a track class. For simulation events, this is StMcTrack. You may also want to modify this class to include your detector hits. Similar to the StMcEvent class, you need to add your hit member in the StMcTrack.hh file, as well as the 'Get', 'Set', 'Add' and 'Remove' methods.
     
    StPtrVecMcXXXHit  mXXXHits;
    StPtrVecMcXXXHit& XXXHits() { return mXXXHits; }
    const StPtrVecMcXXXHit& XXXHits() const { return mXXXHits; }
    void setXXXHits(StPtrVecMcXXXHit&);
    void addXXXHit(StMcXXXHit*);
    void removeXXXHit(StMcXXXHit*);
     
    Then implement them in the StMcTrack.cc file. This is quite straightforward - just refer to another detector hit type in this file to see how it is done; a sketch of the typical pattern follows below.
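    For orientation only, the implementations usually reduce to a few lines like the following sketch (patterned after the existing detector hit types in this file; check StMcTrack.cc for the real code):

    void StMcTrack::setXXXHits(StPtrVecMcXXXHit& val) { mXXXHits = val; }

    void StMcTrack::addXXXHit(StMcXXXHit* hit) { mXXXHits.push_back(hit); }

    void StMcTrack::removeXXXHit(StMcXXXHit* hit)
    {
        // erase the first occurrence of this hit (std::find, from <algorithm>)
        StMcXXXHitIterator iter = std::find(mXXXHits.begin(), mXXXHits.end(), hit);
        if (iter != mXXXHits.end()) mXXXHits.erase(iter);
    }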
    Besides, don't forget to clear the hits in the destructor StMcTrack::~StMcTrack() by adding a line
     
    mXXXHits.clear();
     
    Other methods you may want to implement are:
     
    ostream& operator<<(ostream& os, const StMcTrack& t)
    void StMcTrack::Print(Option_t *option) const
    const StPtrVecMcHit *StMcTrack::Hits(StDetectorId Id) const
    const StPtrVecMcCalorimeterHit *StMcTrack::CalorimeterHits(StDetectorId Id) const
     
    None of them is difficult.
    You may notice that a vector class type, StPtrVecMcXXXHit, is used in the code above without having been declared. This is done in another header, StMcContainer.hh. You should add several lines to this file, in the following order.
     
    class StMcXXXHit;
    typedef vector<StMcXXXHit*> StSPtrVecMcXXXHit;
    typedef vector<StMcXXXHit*> StPtrVecMcXXXHit;
    typedef StPtrVecMcXXXHit::iterator StMcXXXHitIterator;
    typedef StPtrVecMcXXXHit::const_iterator StMcXXXHitConstIterator;
     
    Two other files containing necessary definitions and links that you should modify are StMcEventTypes.hh and StMcEventLinkDef.h. In StMcEventTypes.hh add two lines:
     
    #include "StMcXXXHit.hh"
    #include "StMcXXXHitCollection.hh"
     
    In StMcEventLinkDef.h, the following lines are to be added.
     
    #pragma link C++ function operator<<(ostream&, const StMcXXXHit&);
    #pragma link C++ typedef StSPtrVecMcXXXHit;
    #pragma link C++ typedef StPtrVecMcXXXHit;
    #pragma link C++ typedef StMcXXXHitIterator;
    #pragma link C++ typedef StMcXXXHitConstIterator;
    #pragma link C++ class vector<StMcXXXHit*>+;
     
    Now the basic structure for adding your new detector hits into the StMcEvent class is accomplished. It's time to modify the StMcEventMaker class for your new detector. In the header file you need to add a Boolean member
     
    Bool_t doUseXXX;              //!  
     
    If you are adding a new detector of calorimeter type, you’re likely to add a method
     
    void fillXXX(St_g2t_emc_hit*);  (if you just use emc hit type)
    or   
    void fillXXX(St_g2t_XXX_hit*);  (if you use your own calorimeter hit type)
     
    Then in StMcEventMaker.cxx the following places need to be changed.
    Initialize doUseXXX to kTRUE in the class constructor.
    In the StMcEventMaker::Make() function, add
     
    St_g2t_YYY_hit *g2t_XXX_hitTablePointer = (St_g2t_YYY_hit *) geantDstI("g2t_XXX_hit");
     
    One should pay attention to the type St_g2t_YYY_hit. Here YYY is the hit type (g2t_YYY_hit.idl) you used in pams/sim/g2t_XXX.idl. YYY is not necessarily the same as XXX, since you can use an existing hit type (YYY) for your new detector (XXX).
     
    Then retrieve the hits by
     
    // XXX Hit Tables
    g2t_YYY_hit_st *XXXHitTable = 0;
    if (g2t_XXX_hitTablePointer) {
        XXXHitTable = g2t_XXX_hitTablePointer->GetTable();
        if (Debug()) cerr << "Table g2t_XXX_hit found in Dataset " << geantDstI.Pwd()->GetName() << endl;
    }
    else {
        if (Debug()) cerr << "Table g2t_XXX_hit Not found in Dataset " << geantDstI.Pwd()->GetName() << endl;
    }
     
    Then fill the hits, either with the AddHits(XXX,XXX,XXX) macro or with your own method fillXXX(St_g2t_emc_hit*) or fillXXX(St_g2t_XXX_hit*). Read the existing methods carefully if you use the existing StMcCalorimeterHit hit type; a sketch of a fill method is given below.
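    A minimal sketch of a fillXXX method, assuming a dedicated StMcXXXHit type with a constructor taking a table row, an event-level hit collection, and a track lookup (the collection accessor and the ttemp lookup are illustrative; follow the existing fill methods in StMcEventMaker.cxx for the real patterns):
     
    void StMcEventMaker::fillXXX(St_g2t_XXX_hit* hitTablePointer)
    {
        g2t_xxx_hit_st* table = hitTablePointer->GetTable();
        for (int i = 0; i < hitTablePointer->GetNRows(); ++i) {
            StMcXXXHit* hit = new StMcXXXHit(&table[i]);       // hit built from the g2t table row
            mCurrentMcEvent->xxxHitCollection()->addHit(hit);  // event-level collection (assumed accessor)
            // Associate the hit with its parent track via the track_p index in the g2t table
            // (ttemp: the track lookup used elsewhere in Make(); FORtran-style indices start at 1).
            StMcTrack* track = ttemp[ table[i].track_p - 1 ];
            hit->setParentTrack(track);
            track->addXXXHit(hit);
        }
    }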
    Now you can try to compile StRoot by
     
    > cons +StRoot
     
    After all the code above compiles successfully, you can proceed to run your GEANT data through the BFC chain to generate STAR data, such as a .McEvent.root file. The BFC options you choose depend on which detectors you want to include. A simple example option string is
     
    Debug,devE,agml,McEvent,NoSvtIt,NoSsdIt,Idst,Tree,logger,genvtx,tags,IdTruth,geantout,big,fzin,McEvOut
     
    However, if you need a full simulation of the STAR TPC, you need to add many more options, such as tpcrs and the related database options.
     
    Further processing of the McEvent should be similar to any other MC-data-based analysis, but with your own detector's features. One example you may refer to is StMiniMcMaker; you can check it out from the STAR class library and modify it to suit your work. As an example, I plot the hit points generated by the TRD (in the EIDD) and the TPC for an MC negative muon track at fixed momentum and direction; it is shown on the right.
     
    All the work described above is based on simulation. If you want to further implement the “real” data type StEvent, you need to add or change classes in StRoot/StEvent. StEvent data should generally be based on a real experiment, such as a beam test of your new detector prototype. However, there may be a need for it even in simulation, since the functionality of some STAR classes relies on StEvent rather than StMcEvent.
     
    Similar to what you have done for StMcEvent, you need to add StXXXHit and StXXXHitCollection classes and attach them to StEvent and other relevant classes. Auxiliary classes such as StContainers, StEnumerations and StDetectorDefinitions also need modification. All these classes are contained in the StRoot/StEvent directory.
     
    You also need to add your own detector maker under the StRoot directory. A recent example is StEtrFastSimMaker, a simple maker that adds the endcap TRD hits to StEvent for further processing. It is a fast simulation maker, since more realistic makers should be based on experimental data. Victor added this maker just to test Stv track finding with the endcap TRD, a more complicated task for experts only.
     

     

    List of Default AgML Materials

    List of default AgML materials and mixtures.  To get a complete list of all materials defined in a geometry, execute AgMaterial::List() in ROOT, once the geometry has been created.
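    For example, something like the following in a root4star session should print the list, once a geometry has been built (the geometry-loading macro name and tag are assumptions; any method that instantiates the AgML geometry first will do):
     
    $ root4star
    root [0] .x loadStarGeometry.C("y2012")   // build a geometry first (assumed macro name)
    root [1] AgMaterial::List()               // print all defined materials and mixtures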

    [-]             Hydrogen:  a=     1.01 z=        1 dens=    0.071 radl=      865 absl=      790 isvol= <unset>  nelem=        1
    [-]            Deuterium:  a=     2.01 z=        1 dens=    0.162 radl=      757 absl=      342 isvol= <unset>  nelem=        1
    [-]               Helium:  a=        4 z=        2 dens=    0.125 radl=      755 absl=      478 isvol= <unset>  nelem=        1
    [-]              Lithium:  a=     6.94 z=        3 dens=    0.534 radl=      155 absl=      121 isvol= <unset>  nelem=        1
    [-]            Berillium:  a=     9.01 z=        4 dens=    1.848 radl=     35.3 absl=     36.7 isvol= <unset>  nelem=        1
    [-]               Carbon:  a=    12.01 z=        6 dens=    2.265 radl=     18.8 absl=     49.9 isvol= <unset>  nelem=        1
    [-]             Nitrogen:  a=    14.01 z=        7 dens=    0.808 radl=     44.5 absl=     99.4 isvol= <unset>  nelem=        1
    [-]                 Neon:  a=    20.18 z=       10 dens=    1.207 radl=       24 absl=     74.9 isvol= <unset>  nelem=        1
    [-]            Aluminium:  a=    26.98 z=       13 dens=      2.7 radl=      8.9 absl=     37.2 isvol= <unset>  nelem=        1
    [-]                 Iron:  a=    55.85 z=       26 dens=     7.87 radl=     1.76 absl=     17.1 isvol= <unset>  nelem=        1
    [-]               Copper:  a=    63.54 z=       29 dens=     8.96 radl=     1.43 absl=     14.8 isvol= <unset>  nelem=        1
    [-]             Tungsten:  a=   183.85 z=       74 dens=     19.3 radl=     0.35 absl=     10.3 isvol= <unset>  nelem=        1
    [-]                 Lead:  a=   207.19 z=       82 dens=    11.35 radl=     0.56 absl=     18.5 isvol= <unset>  nelem=        1
    [-]              Uranium:  a=   238.03 z=       92 dens=    18.95 radl=     0.32 absl=       12 isvol= <unset>  nelem=        1
    [-]                  Air:  a=    14.61 z=      7.3 dens= 0.001205 radl=    30400 absl=    67500 isvol= <unset>  nelem=        1
    [-]               Vacuum:  a=    14.61 z=      7.3 dens=    1e-06 radl= 3.04e+07 absl= 6.75e+07 isvol= <unset>  nelem=        1
    [-]              Silicon:  a=    28.09 z=       14 dens=     2.33 radl=     9.36 absl=     45.5 isvol= <unset>  nelem=        1
    [-]            Argon_gas:  a=    39.95 z=       18 dens=    0.002 radl=    11800 absl=    70700 isvol= <unset>  nelem=        1
    [-]         Nitrogen_gas:  a=    14.01 z=        7 dens=    0.001 radl=    32600 absl=    75400 isvol= <unset>  nelem=        1
    [-]           Oxygen_gas:  a=       16 z=        8 dens=    0.001 radl=    23900 absl=    67500 isvol= <unset>  nelem=        1
    [-]           Polystyren:  a=   11.153 z=    5.615 dens=    1.032 radl= <unset>  absl= <unset>  isvol= <unset>  nelem=        2
                                                                                               A           Z         W
                                                                                        C   12.000      6.000     0.923
                                                                                        H    1.000      1.000     0.077
    [-]         Polyethylene:  a=   10.427 z=    5.285 dens=     0.93 radl= <unset>  absl= <unset>  isvol= <unset>  nelem=        2
                                                                                               A           Z         W
                                                                                        C   12.000      6.000     0.857
                                                                                        H    1.000      1.000     0.143
    [-]                Mylar:  a=    12.87 z=    6.456 dens=     1.39 radl= <unset>  absl= <unset>  isvol= <unset>  nelem=        3
                                                                                               A           Z         W
                                                                                        C   12.000      6.000     0.625
                                                                                        H    1.000      1.000     0.042
                                                                                        O   16.000      8.000     0.333
    
    

    Production Geometry Tags

    This page was merged with STAR Geometry in simulation & reconstruction and is maintained by STAR's librarian.

     

     

    Attic

    Retired Simulation Pages kept here.

    Action Items

    Immediate action items:

    •  Y2008 tag
      • find out about the status of the FTPC (can't locate the relevant e-mail now)
      • find out about the status of PMD in 2008 (open/closed)
      • ask Akio about possible updates of the FMS code, get the final version
      • based on Dave's records, add a small amount of material to the beampipe
      • review the tech drawings from Bill and Will and others and start coding the support structure
      • extract information from TOF people about the likely configuration
      • when ready, produce the material profile plots for Y2008 in slices in Z
    • TUP tags
      • work with Jim Thomas, Gerrit and primarily Spiros on the definition of geometry for the next TUP wave
      • coordinate with Spiros, Jim and Yuri a possible repass of the recent TUP MC data without the IST
    • Older tags
      • check the more recent correction to the SVT code (carbon instead of Be used in the water channels)
      • provide code for the correction for 3 layers of mylar on the beampipe as referred to above in Y2008
      • check with Dave about the dimensions of the water channels (likely incorrect in GEANT)
      • determine which years we will choose to retrofit with improved SVT (ask STAR members)
    • MTD
      • Establish a new UPGRXX tag for the MTD simulation
      • supervise and help Lijuan in extending the field map
      • provide facility for reading a separate map in starsim and root4star (with Yuri)
    • Misc
      • collect feedback on possible simulation plans for the fall'07
      • revisit the codes for event pre-selection ("hooks")
      • revisit the event mixing scripts
    • Development
      • create a schema to store MC run catalog data with a view to automate job definition (Michael has promised help)

     

    Beampipe support geometry and other news

    Documentation for the beampipe support geometry description development

    After the completion of the 2007 run, the SVT and the SSD were removed from the STAR detector along with their utility lines. The support structure for the beampipe remained, however.

    The following drawings describe the structure of the beampipe support as it existed in late 2007 and probably throughout 2008.

    Further corrections to the SVT geometry model
     
    In the course of a recent discussion of the beampipe support and shield material, Dave Lynn has found that even though, according to the plans, the material of the cooling water channels in the SVT was specified as Be, in reality a carbon composite material was used for that purpose. Below are material vs pseudorapidity plots for the "old" and "new" codes.
     
     
     
    It can be seen that the difference is of the order of 0.4% rad. length on top of the existing (roughly) 6%. This is sufficient grounds for cutting a new version of the geometry, and we shall create a geometry tag Y2007A reflecting this change.
     

    Datasets

    Here we present information about our datasets.

    2005

    Dataset  Description                                   Statistics(k)  Status    HPSS  Comment
    rcf1259  Herwig 6.507, Y2004Y                          225            Finished  Yes   7<Pt<9 GeV
    rcf1258  Herwig 6.507, Y2004Y                          248            Finished  Yes   5<Pt<7 GeV
    rcf1257  Herwig 6.507, Y2004Y                          367            Finished  Yes   4<Pt<5 GeV
    rcf1256  Herwig 6.507, Y2004Y                          424            Finished  Yes   3<Pt<4 GeV
    rcf1255  Herwig 6.507, Y2004Y                          407            Finished  Yes   2<Pt<3 GeV
    rcf1254  Herwig 6.507, Y2004Y                          225            Finished  Yes   35<Pt<100 GeV
    rcf1253  Herwig 6.507, Y2004Y                          263            Finished  Yes   25<Pt<35 GeV
    rcf1252  Herwig 6.507, Y2004Y                          263            Finished  Yes   15<Pt<25 GeV
    rcf1251  Herwig 6.507, Y2004Y                          225            Finished  Yes   11<Pt<15 GeV
    rcf1250  Herwig 6.507, Y2004Y                          300            Finished  Yes   9<Pt<11 GeV
    rcf1249  Hijing 1.382 AuAu 200 GeV minbias, 0<b<20fm   24             Finished  Yes   Tracking, new SVT geo, diamond: 60, +-30cm, Y2005D
    rcf1248  Herwig 6.507, Y2004Y                          15             Finished  Yes   35<Pt<45 GeV
    rcf1247  Herwig 6.507, Y2004Y                          25             Finished  Yes   25<Pt<35 GeV
    rcf1246  Herwig 6.507, Y2004Y                          50             Finished  Yes   15<Pt<25 GeV
    rcf1245  Herwig 6.507, Y2004Y                          100            Finished  Yes   11<Pt<15 GeV
    rcf1244  Herwig 6.507, Y2004Y                          200            Finished  Yes   9<Pt<11 GeV
    rcf1243  CuCu 62.4 GeV, Y2005C                         5              Finished  No    same as rcf1242 + keep Low Energy Tracks
    rcf1242  CuCu 62.4 GeV, Y2005C                         5              Finished  No    SVT tracking test, 10 keV e/m process cut (cf. rcf1237)
    rcf1241  10 J/Psi, Y2005X, SVT out                     30             Finished  No    Study of the SVT material effect
    rcf1240  10 J/Psi, Y2005X, SVT in                      30             Finished  No    Study of the SVT material effect
    rcf1239  100 pi0, Y2005X, SVT out                      18             Finished  No    Study of the SVT material effect
    rcf1238  100 pi0, Y2005X, SVT in                       20             Finished  No    Study of the SVT material effect
    rcf1237  CuCu 62.4 GeV, Y2005C                         5              Finished  No    SVT tracking test, pilot run
    rcf1236  Herwig 6.507, Y2004Y                          8              Finished  No    Test run for initial comparison with Pythia, 5<Pt<7 GeV
    rcf1235  Pythia, Y2004Y                                100            Finished  No    MSEL=2, min bias
    rcf1234  Pythia, Y2004Y                                90             Finished  No    MSEL=0, CKIN(3)=0, MSUB=91,92,93,94,95
    rcf1233  Pythia, Y2004Y, sp.2 (CDF tune A)             308            Finished  Yes   4<Pt<5, MSEL=1, GHEISHA
    rcf1232  Pythia, Y2004Y, sp.2 (CDF tune A)             400            Finished  Yes   3<Pt<4, MSEL=1, GHEISHA
    rcf1231  Pythia, Y2004Y, sp.2 (CDF tune A)             504            Finished  Yes   2<Pt<3, MSEL=1, GHEISHA
    rcf1230  Pythia, Y2004Y, sp.2 (CDF tune A)             104            Finished  Yes   35<Pt, MSEL=1, GHEISHA
    rcf1229  Pythia, Y2004Y, sp.2 (CDF tune A)             208            Finished  Yes   25<Pt<35, MSEL=1, GHEISHA
    rcf1228  Pythia, Y2004Y, sp.2 (CDF tune A)             216            Finished  Yes   15<Pt<25, MSEL=1, GHEISHA
    rcf1227  Pythia, Y2004Y, sp.2 (CDF tune A)             216            Finished  Yes   11<Pt<15, MSEL=1, GHEISHA
    rcf1226  Pythia, Y2004Y, sp.2 (CDF tune A)             216            Finished  Yes   9<Pt<11, MSEL=1, GHEISHA
    rcf1225  Pythia, Y2004Y, sp.2 (CDF tune A)             216            Finished  Yes   7<Pt<9, MSEL=1, GHEISHA
    rcf1224  Pythia, Y2004Y, sp.2 (CDF tune A)             216            Finished  Yes   5<Pt<7, MSEL=1, GHEISHA
    rcf1223  Pythia special tune2, Y2004Y, GCALOR          100            Finished  Yes   4<Pt<5, GCALOR
    rcf1222  Pythia special tune2, Y2004Y, GHEISHA         100            Finished  Yes   4<Pt<5, GHEISHA
    rcf1221  Pythia special run 3, Y2004C                  100            Finished  Yes   ENER 200.0, MSEL 2, MSTP(51)=7, MSTP(81)=1, MSTP(82)=1, PARP(82)=1.9, PARP(83)=0.5, PARP(84)=0.2, PARP(85)=0.33, PARP(86)=0.66, PARP(89)=1000, PARP(90)=0.16, PARP(91)=1.0, PARP(67)=1.0
    rcf1220  Pythia special run 2, Y2004C (CDF tune A)     100            Finished  Yes   ENER 200.0, MSEL 2, MSTP(51)=7, MSTP(81)=1, MSTP(82)=4, PARP(82)=2.0, PARP(83)=0.5, PARP(84)=0.4, PARP(85)=0.9, PARP(86)=0.95, PARP(89)=1800, PARP(90)=0.25, PARP(91)=1.0, PARP(67)=4.0
    rcf1219  Pythia special run 1, Y2004C                  100            Finished  Yes   ENER 200.0, MSEL 2, MSTP(51)=7, MSTP(81)=1, MSTP(82)=1, PARP(82)=1.9, PARP(83)=0.5, PARP(84)=0.2, PARP(85)=0.33, PARP(86)=0.66, PARP(89)=1000, PARP(90)=0.16, PARP(91)=1.5, PARP(67)=1.0
    rcf1218  Hijing 1.382 AuAu 200 GeV central, 0<b<3fm    50             Finished  Yes   Statistics enhancement of rcf1209 with a smaller diamond: 60, +-30cm, Y2004a
    rcf1216  Hijing 1.382 CuCu 200 GeV minbias, 0<b<14fm   52             Finished  Yes   Geometry: Y2005x
    rcf1215  Hijing 1.382 AuAu 200 GeV minbias, 0<b<20fm   100            Finished  Yes   Geometry: Y2004a, Special D decays

    2006

    Dataset  Description                          Statistics(k)  Status    HPSS  Comment
    rcf1289  AuAu 200 GeV central                 1              Finished  No    upgr06: Hijing, D0 and superposition
    rcf1288  AuAu 200 GeV central                 0.8            Finished  No    upgr11: Hijing, D0 and superposition
    rcf1287  AuAu 200 GeV min bias                5              Finished  No    upgr11: Hijing, D0 and superposition
    rcf1286  AuAu 200 GeV central                 1              Finished  No    upgr10: Hijing, D0 and superposition
    rcf1285  AuAu 200 GeV min bias                6              Finished  No    upgr10: Hijing, D0 and superposition
    rcf1284  AuAu 200 GeV central                 1              Finished  No    upgr09: Hijing, D0 and superposition
    rcf1283  AuAu 200 GeV min bias                6              Finished  No    upgr09: Hijing, D0 and superposition
    rcf1282  AuAu 200 GeV min bias                38             Finished  No    upgr06: Hijing, D0 and superposition
    rcf1281  AuAu 200 GeV min bias                38             Finished  Yes   upgr08: Hijing, D0 and superposition
    rcf1280  AuAu 200 GeV min bias                38             Finished  Yes   upgr01: Hijing, D0 and superposition
    rcf1279  AuAu 200 GeV min bias                38             Finished  Yes   upgr07: Hijing, D0 and superposition
    rcf1278  Extension of 1276: D0 superposition  5              Finished  No    upgr07: Z cut=+-300cm
    rcf1277  AuAu 200 GeV min bias                -              Finished  No    upgr05: Z cut=+-300cm
    rcf1276  AuAu 200 GeV min bias                35             Finished  No    upgr05: Hijing, D0 and superposition
    rcf1275  Pythia 200 GeV + HF                  23*4           Finished  No    J/Psi and Upsilon(1S,2S,3S) mix for embedding
    rcf1274  AuAu 200 GeV min bias                10             Finished  No    upgr02 geo tag, |eta|<1.5 (tracking upgrade request)
    rcf1273  Pythia 200 GeV                       600            Finished  Yes   Pt<2 (completing the rcf1224-1233 series)
    rcf1272  CuCu 200 GeV min bias + D0 mix       50+2*50*8      Finished  Yes   Combinatorial boost of rcf1261, sigma: 60, +-30
    rcf1233  Pythia 200 GeV                       300            Finished  Yes   4<Pt<5 (rcf1233 extension)
    pds1232  Pythia 200 GeV                       200            Finished  Yes   3<Pt<4 (rcf1232 clone)
    pds1231  Pythia 200 GeV                       240            Finished  Yes   2<Pt<3 (rcf1231 clone)
    rcf1229  Pythia 200 GeV                       200            Finished  Yes   25<Pt<35 (rcf1229 extension)
    rcf1228  Pythia 200 GeV                       200            Finished  Yes   15<Pt<25 (rcf1228 extension)
    rcf1227  Pythia 200 GeV                       208            Finished  Yes   11<Pt<15 (rcf1227 extension)
    rcf1226  Pythia 200 GeV                       200            Finished  Yes   9<Pt<11 (rcf1226 extension)
    rcf1225  Pythia 200 GeV                       200            Finished  Yes   7<Pt<9 (rcf1225 extension)
    rcf1224  Pythia 200 GeV                       212            Finished  Yes   5<Pt<7 (rcf1224 extension)
    rcf1271  Pythia 200 GeV Y2004Y CDF_A          120            Finished  Yes   55<Pt<65
    rcf1270  Pythia 200 GeV Y2004A CDF_A          120            Finished  Yes   45<Pt<55
    rcf1266  CuCu 200 GeV min bias                10             Finished  Yes   SVT study: clams and two ladders
    rcf1265  CuCu 200 GeV min bias                10             Finished  Yes   SVT study: clams displaced
    rcf1264  CuCu 200 GeV min bias                10             Finished  Yes   SVT study: rotation of the barrel
    rcf1262  CuCu 62.4 GeV min bias + D0 mix      50*3           Finished  Yes   3 subsets: Hijing, single D0, and the mix
    rcf1261  CuCu 200 GeV min bias + D0 mix       50*3           Finished  No    3 subsets: Hijing, single D0, and the mix
    rcf1260  1 J/Psi over 200 GeV minbias AuAu    10             Finished  No    J/Psi mixed with 200 GeV AuAu Hijing, Y2004Y, 60/35 vertex

    2007

    Unless stated otherwise, all pp collisions are modeled with Pythia, and all AA collisions with Hijing. Statistics are listed in thousands of events. The multiplication factor in some of the records reflects the fact that event mixing was done for a few types of particles on the same base of original event files.

    Name System/Energy Statistics Status HPSS Comment Site
    rcf1290 AuAu200 0<b<3fm, Zcut=5cm 32*5 Done Yes Hijing+D0+Lac2+D0_mix+Lac2_mix rcas
    rcf1291 pp200/UPGR07/Zcut=10cm 10 Done Yes ISUB = 11, 12, 13, 28, 53, 68 rcas
    rcf1292 pp500/UPGR07/Zcut=10cm 10 Done Yes ISUB = 11, 12, 13, 28, 53, 68 rcas
    rcf1293 pp200/UPGR07/Zcut=30cm 205 Done Yes ISUB = 11, 12, 13, 28, 53, 68 rcas
    rcf1294 pp500/UPGR07/Zcut=30cm 10 Done Yes ISUB = 11, 12, 13, 28, 53, 68 rcas
    rcf1295 AuAu200 0<b<20fm, Zcut=30cm 20 Done Yes QA run for the Y2007 tag rcas
    rcf1296 AuAu200 0<b<3fm, Zcut=10cm 100*5 Done Yes Hijing,B0,B+,B0_mix,B+_mix, Y2007 rcas
    rcf1297 AuAu200 0<b<20fm, Zcut=300cm 40 Done Yes Pile-up simulation in the TUP studies, UPGR13 rcas
    rcf1298 AuAu200 0<b<3fm, Zcut=15cm 100*5 Done Part Hijing,D0,Lac2,D0_mix,Lac2_mix, UPGR13 rcas
    rcf1299 pp200/Y2005/Zcut=50cm 800 Done Yes Pythia, photon mix, pi0 mix rcas
    rcf1300 pp200/UPGR13/Zcut=15cm 100 Done No Pythia, MSEL=4 (charm) rcas
    rcf1301 pp200/UPGR13/Zcut=300cm 84 Done No Pythia, MSEL=1, wide vertex rcas
    rcf1302 pp200 Y2006C 120 Done No Pythia for Spin PWG, Pt(45,55)GeV rcas
    rcf1303 pp200 Y2006C 120 Done No Pythia for Spin PWG, Pt(35,45)GeV rcas
    rcf1304 pp200 Y2006C 120 Done No Pythia for Spin PWG, Pt(55,65)GeV rcas
    rcf1296 Upsilon S1,S2,S3 + Hijing 15*3 Done No Muon Telescope Detector, ext.of 1296 rcas
    rcf1306 pp200 Y2006C 400 Done Yes Pythia for Spin PWG, Pt(25,35)GeV rcas
    rcf1307 pp200 Y2006C 400 Done Yes Pythia for Spin PWG, Pt(15,25)GeV rcas
    rcf1308 pp200 Y2006C 420 Done Yes Pythia for Spin PWG, Pt(11,15)GeV rcas
    rcf1309 pp200 Y2006C 420 Done Yes Pythia for Spin PWG, Pt(9,11)GeV rcas
    rcf1310 pp200 Y2006C 420 Done Yes Pythia for Spin PWG, Pt(7,9)GeV rcas
    rcf1311 pp200 Y2006C 400 Done Yes Pythia for Spin PWG, Pt(5,7)GeV rcas
    rcf1312 pp200 Y2004Y 544 Done No Di-jet CKIN(3,4,7,8,27,28)=7,9,0.0,1.0,-0.4,0.4 rcas
    rcf1313 pp200 Y2004Y 760 Done No Di-jet CKIN(3,4,7,8,27,28)=9,11,-0.4,1.4,-0.5,0.6 rcas
    rcf1314 pp200 Y2004Y 112 Done No Di-jet CKIN(3,4,7,8,27,28)=11,15,-0.2,1.2,-0.6,-0.3 Grid
    rcf1315 pp200 Y2004Y 396 Done No Di-jet CKIN(3,4,7,8,27,28)=11,15,-0.5,1.5,-0.3,0.4 Grid
    rcf1316 pp200 Y2004Y 132 Done No Di-jet CKIN(3,4,7,8,27,28)=11,15,0.0,1.0,0.4,0.7 Grid
    rcf1317 pp200 Y2006C 600 Done Yes Pythia for Spin PWG, Pt(4,5)GeV Grid
    rcf1318 pp200 Y2006C 690 Done Yes Pythia for Spin PWG, Pt(3,4)GeV Grid
    rcf1319 pp200 Y2006C 690 Done Yes Pythia for Spin PWG, Minbias Grid
    rcf1320 pp62.4 Y2006C 400 Done No Pythia for Spin PWG, Pt(4,5)GeV Grid
    rcf1321 pp62.4 Y2006C 250 Done No Pythia for Spin PWG, Pt(3,4)GeV Grid
    rcf1322 pp62.4 Y2006C 220 Done No Pythia for Spin PWG, Pt(5,7)GeV Grid
    rcf1323 pp62.4 Y2006C 220 Done No Pythia for Spin PWG, Pt(7,9)GeV Grid
    rcf1324 pp62.4 Y2006C 220 Done No Pythia for Spin PWG, Pt(9,11)GeV Grid
    rcf1325 pp62.4 Y2006C 220 Done No Pythia for Spin PWG, Pt(11,15)GeV Grid
    rcf1326 pp62.4 Y2006C 200 Running No Pythia for Spin PWG, Pt(15,25)GeV Grid
    rcf1327 pp62.4 Y2006C 200 Running No Pythia for Spin PWG, Pt(25,35)GeV Grid
    rcf1328 pp62.4 Y2006C 50 Running No Pythia for Spin PWG, Pt(35,45)GeV Grid

    2009

     

    Name     System/Energy   Range     Statistics  Comment
    rcf9001  pp200, y2007g   03_04gev  690k        Jet Study AuAu200(PP200) JLC PWG
    rcf9002                  04_05gev  686k
    rcf9003                  05_07gev  398k
    rcf9004                  07_09gev  420k
    rcf9005                  09_11gev  412k
    rcf9006                  11_15gev  420k
    rcf9007                  15_25gev  397k
    rcf9008                  25_35gev  400k
    rcf9009                  35_45gev  120k
    rcf9010                  45_55gev  118k
    rcf9011                  55_65gev  120k

     

    Name     System/Energy   Range      Statistics  Comment
    rcf9021  pp200, y2008    03_04 GeV  690k        Jet Study AuAu200(PP200) JLC PWG
    rcf9022                  04_05 GeV  686k
    rcf9023                  05_07 GeV  398k
    rcf9024                  07_09 GeV  420k
    rcf9025                  09_11 GeV  412k
    rcf9026                  11_15 GeV  420k
    rcf9027                  15_25 GeV  397k
    rcf9028                  25_35 GeV  400k
    rcf9029                  35_45 GeV  120k
    rcf9030                  45_55 GeV  118k
    rcf9031                  55_99 GeV  120k

     

    Name     System/Energy    Range     Statistics  Comment
    rcf9041  PP500, Y2009     03_04gev  500k        Spin Study PP500, Spin group (Matt, Jim, Jan), 2.3M evts
    rcf9042                   04_05gev  500k
    rcf9043                   05_07gev  300k
    rcf9044                   07_09gev  250k
    rcf9045                   09_11gev  200k
    rcf9046                   11_15gev  100k
    rcf9047                   15_25gev  100k
    rcf9048                   25_35gev  100k
    rcf9049                   35_45gev  100k
    rcf9050                   45_55gev  25k
    rcf9051                   55_99gev  25k

    rcf9061  CuCu200, y2005h  B0_14     200k        CuCu200 radiation length budget, Y.Fisyak, KyungEon Choi
    rcf9062  AuAu200, y2007h  B0_14     150k        AuAu200 radiation length budget, Y.Fisyak, KyungEon Choi

     

    2010

    Information on Monte Carlo Data Samples

     
     
    Geometry y2009a
    Library SL09g
    Generator Pythia 6.4.22
    Tune 320
    Field -5.0
    ETA -10 < η < +10
    PHI -π < φ < +π
    vertex 0, 0, -2
    width 0.015, 0.015, 42.0
      
    Sample    Channel                              Events
    rcf10000  W+ → e+ nu                           10k
    rcf10001  W- → e- nu                           6k
    rcf10002  W+ → tau+ nu / W- → tau- nu          10k
    rcf10003  pp → W+/- + jet                      10k
    rcf10004  Z → e+e-, no Z/gamma interference    4k
    rcf10005  Z → all but e+e-                     10k
    rcf10006  QCD w/ partonic pT > 35 GeV          100k

     

    Geometry Tag Options

     This page documents the options in geometry.g which define each of the production tags.


    Geometry Tag Options II

    The attached spreadsheets document the production tags in STARSIM on 11/30/2009.  At that time the y2006h and y2010 tags were in development and not ready for production.

    Material Balance Histograms

    Material (radiation length) histograms, full detector and TPC only, for several geometry tags; the histogram images are attached to the corresponding pages:

    y2008a: y2008aStar (full), y2008aTpce (TPC only)
    y2005g: y2005gStar (full), y2005gTpce (TPC only)
    y2008yf: y2008yfStar (full), y2008yfTpce (TPC only)
    y2009: y2009Star (full), y2009Tpce (TPC only)
     

    STAR AgML Geometry Comparison with STARSIM/AgSTAR

    STAR Geometry Comparison: AgML vs AgSTAR

    At the left is a general status for each geometry tag which compiles in AgML.  All volumes are tested recursively except for the "IBEM" and similar support structures for the VPD, and the Endcap SMD strips.  (The ESMD planes are tested as a unit, rather than testing all 2*12*288 SMD strips.)

    Color codes:

    Green: No differences larger than 1%
    
    Yellow: The volume did not appear in AgSTAR geometry
    
    Orange: The difference was larger than 1%, but the absolute difference is negligible.
    
    Red: A difference larger than 1% was detected for a significant amount of material; or a negligible but widespread difference was detected. 
    

    At the right is a PDF file for each geometry tag. For each volume we show two plots. The top plot shows the absolute number of radiation lengths which a geantino encounters traversing the geometry, starting at the origin and following a straight line at the given pseudorapidity; we average over all phi. The left (right) hashes show the AgML (AgSTAR) geometry. The difference (expressed as a fractional value) of the two histograms is shown in the lower plot. Frequently the differences are small, e.g. 10^-6, and ROOT rescales the plots accordingly. Since it is difficult to read the scales of so many plots at once, we have color coded the plots (the coding seems to fail in the generation of some histograms). The meaning of the color coding is summarized above.

    <?php
    /********************************************************************** START OF PHP */

    /* =======================================================
       Helper function to show the status_yXXXX.png
       ======================================================= */
    function showImage( $tag, $dir ) {
       echo "<img src=\"$dir/status_$tag.png\" />";
    }

    /* =======================================================
       Helper function to show the PDF file
       ======================================================= */
    function showGoogle( $tag, $dir ) {
       /* An earlier variant used url= instead of src= on the iframe:
          echo "<iframe border=\"0\" url=\"http://docs.google.com/gview?url=$dir$tag.pdf&amp;embedded=true\" style=\"width: 562px; height: 705px;\"> </iframe>";
        */
       echo "<iframe frameborder=\"0\" style=\"width: 562px; height: 705px;\" src=\"http://docs.google.com/gview?url=$dir/$tag.pdf&amp;embedded=true\"></iframe>";
    }

    /* =======================================================
       First some PHP input... find the date of the comparison
       ======================================================= */
    $YEAR="2011";
    $DATE="06-15-2011";
    $DIR="http://www.star.bnl.gov/~jwebb/".$YEAR."/".$DATE."/AgML-Comparison/";
    $TAGS=$DIR."TAGS";

    /* =======================================================
       Output header for this page
       ======================================================= */
    echo "<h3>STAR AgML vs AgSTAR Comparison on ".$DATE."</h3>";

    /* =======================================================
       Read in each line in the TAGS file
       ======================================================= */
    $handle = @fopen("$TAGS", "r");
    if ($handle) {
        while (($buffer = fgets($handle, 4096)) !== false) {
            /* Trim the whitespace out of the string */
            $buffer = trim($buffer);
            /* Draw an HRULE and specify which geometry tag we are using */
            echo "<hr><p>STAR Geometry Tag $buffer</p>";
            /* Now build a 2-entry display with the status PNG on the left
               and the summary PDF ala Google Docs on the right */
            showImage( $buffer, $DIR );
            showGoogle( $buffer, $DIR );
        }
        if (!feof($handle)) {
            echo "Error: unexpected fgets() fail\n";
        }
        fclose($handle);
    }

    /************************************************************************ END OF PHP */
    ?>

    STAR AgML Language Reference

    STAR Geometry Page

    R&D Tags

    The R&D conducted for the inner tracking upgrade required that a few specialized geometry tags be created. For a complete set of geometry tags, please visit the STAR Geometry in simulation & reconstruction page. The material below serves as additional documentation and details.

    Taxonomy:

    • SSD: Silicon strip detector
    • IST: Inner Silicon Tracker
    • HFT: Heavy Flavor Tracker
    • IGT: Inner GEM Tracker
    • HPD: Hybrid Pixel Detector

    The TPC is present in all configurations listed below, and the SVT is in none.

    Tag     SSD  IST  HFT  IGT  HPD  Contact Person  Comment
    UPGR01   +         +
    UPGR02        +    +
    UPGR03        +    +    +
    UPGR04   +                   +   Sevil           retired
    UPGR05   +    +    +    +    +   Everybody       retired
    UPGR06   +         +         +   Sevil           retired
    UPGR07   +    +    +    +        Maxim
    UPGR08        +    +    +    +   Maxim
    UPGR09        +    +         +   Gerrit          retired; outer IST layer only
    UPGR10   +    +    +             Gerrit          inner IST @ 9.5 cm
    UPGR11   +    +    +             Gerrit          IST @ 9.5 and @ 17.0 cm
    UPGR12   +    +    +    +    +   Ross Corliss    retired; UPGR05 with different IGT radii
    UPGR13   +    +    +    +        Gerrit          UPGR07 * (new 6-disk FGT) * corrected SSD * (no West Cone)
    UPGR14   +         +    +        Gerrit          UPGR13 - IST
    UPGR15   +    +    +             Gerrit          simple geometry for testing: single IST @ 14 cm, hermetic/polygon Pixel/IST geometry; only inner beam pipe, 0.5 mm Be; Pixel 300 um Si, IST 1236 um Si
    UPGR20   +                       Lijuan          Y2007 + one TOF
    UPGR21   +                       Lijuan          UPGR20 + full TOF

    Eta coverage of the SSD and HFT at different vertex spreads:

    Z cut, cm   eta SSD   eta HFT
     5          1.63      2.00
    10          1.72      2.10
    20          1.87      2.30
    30          2.00      2.55
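    These limits follow from simple barrel-acceptance geometry. As a sketch of the argument (R and L are symbolic here, not the actual SSD or HFT dimensions): a track from a vertex displaced by the Z cut along the beam line reaches the far edge of a barrel of radius R and half-length L/2 at a polar angle theta, and the corresponding pseudorapidity limit is

    \theta_{\min} = \arctan\!\left(\frac{R}{L/2 + Z_{\mathrm{cut}}}\right), \qquad \eta_{\max} = -\ln\tan\!\left(\frac{\theta_{\min}}{2}\right)

    so a wider vertex spread (larger Z cut) pushes the required coverage to larger pseudorapidity, as the table shows.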


    Material balance studies for the upgrade: presented here are the usual radiation length plots (as a function of rapidity), first for the full UPGR05 configuration, then for the forward region (the FST and the IGT only), and then for each individual detector (SSD, IST, HPD, HFT), excluding the forward region to reduce ambiguity. The plots are attached to the original page.

    Event Filtering

    The attached PDF describes event filtering in the STAR framework.

    Event Generators

    Event Generator Framework
    Example macros for running event generators + starsim in ROOT:
    $ cvs co StRoot/StarGenerator/macros
    • starsim.pythia6.C
    • starsim.pythia8.C
    • starsim.hijing.C
    • starsim.herwig.C
    • starsim.pepsi.C
    • starsim.starlight.C
    • starsim.kinematics.C
    To run an example macro and generate 100 events:
    $ ln -s StRoot/StarGenerator/macros/starsim.pythia8.C starsim.C
    $ root4star -q -b starsim.C\(100\)
    
    This will generate two files: a standard "fzd" file, which can be reconstructed using the big "full" chain (bfc.C), and a ROOT file containing a TTree expressing the event record for the generated events. A sketch of the reconstruction step is given below.
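    As a sketch, the fzd file can be fed to bfc.C through the fzin option; the chain options below are the example string quoted earlier on this page, and the input file name assumes the pythia8 macro was used (adjust both for your setup):
    $ root4star -q -b 'bfc.C(100,"Debug,devE,agml,McEvent,NoSvtIt,NoSsdIt,Idst,Tree,logger,genvtx,tags,IdTruth,geantout,big,fzin,McEvOut","pythia8.starsim.fzd")'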

    The new Event Record

    The event-wise and particle-wise information from event generators is saved in a ROOT TTree.  The TTree can be read in sync with the MuDst when you perform your analysis.  The ID truth values of the reconstructed tracks in the MuDst can be compared to the primary keys of the tracks in the event record to identify generator tracks which were reconstructed by the tracker.

    The event record can be browsed using the standard ROOT TTree viewer.  An example for Pythia 8:
    root [0] TFile::Open("pythia8.starsim.root")
    root [1] genevents->StartViewer()
    root [2] genevents->Draw("mMass","mStatus>0")
    
    The event record contains both particle-wise and event-wise information.  For the definitions of different quantities, see the documentation provided in the StarGenEvent links above.



    Adding a new Generator

    Event generators are responsible for creating the final state particles which are fed out to GEANT for simulation.  They can be as simple as particle guns, shooting individual particles along well defined trajectories, or complex hydrodynamical models of heavy-ion collisions.  Regardless of the complexities of the underlying physical model, the job of an event generator in the STAR framework is to add particles to an event record.  In this document we will describe the steps needed to add a new event generator to the STAR framework.  The document will be divided into three sections: (1)  An overview, presenting the general steps which are required; (2) A FORtran-specific HOWTO, providing guidance specific to the problem of interfacing FORtran with a C++ application; and (3) A document describing the STAR event record.

    Contents:
    1.0 Integrating Event Generators
    2.0 Integrating FORtran Event Generators
    3.0 The STAR Event Record


    1.0 Integrating Event Generators

    The STAR Event Generator Framework implements several C++ classes which facilitate the integration of FORtran and C++ event generators with the STAR simulation code.  The code is available in the CVS repository and can be checked out as
    $ cvs co StRoot/StarGenerator
    After checking out the generator area you will note that the code is organized into several directories, containing both CORE packages and concrete event generators.  Specifically:

    StarGenerator/BASE  -- contains the classes implementing the STAR interface to event generators
    StarGenerator/EVENT -- contains the classes implementing the STAR event record
    StarGenerator/UTIL  -- contains random number generator base class and particle data
    StarGenerator/TEST  -- contains test makers used for validating the event generators


    The concrete event generators (at the time this document was assembled) include

    StarGenerator/Hijing1_383
    StarGenerator/Pepsi
    StarGenerator/Pythia6_4_23
    StarGenerator/Pythia8_1_62


    1.1 Compiling your Generator

    Your first task in integrating a new event generator is to create a directory for it under StarGenerator, and get your code to compile.   You should select a name for your directory which includes the name and version of your event generator.  It should also be CamelCased...  MyGenerator1_2_3, for example.  (Do not select a name which is ALL CAPS, as this has a special meaning for compilation).  Once you have your directory, you can begin moving your source files into the build area.  In general, we would like to minimize the number of edits to the source code to make it compile.  But you may find that you need to reorganize the directory structure of your code to get it to compile under cons.  (For certain, if your FORtran source ends in ".f" you will need to rename the file to ".F", and if your C++ files end in ".cpp" or ".cc", you may need to rename to ".cxx".)

    1.2 Creating your Interface

    Ok.  So the code compiles.  Now we need to interface the event generation machinery with the STAR framework.  This entails several things.  First, we need to expose the configuration of the event generator so that the end user can generate the desired event sample.  We must then initialize the concrete event generator at the start of the run, and then exercise the event generation machinery on each and every event.  Finally, we need to loop over all of the particles which were created by the event generator and push them onto the event record so that they are persistent (i.e. the full event can be analyzed at a later date) and so that the particles are made available to the Monte Carlo application for simulation.

    The base class for all event generator interfaces is  StarGenerator

    Taking a quick look at the code, we see that there are several "standard" methods defined for configuring an event generator (the collision frame, beam species, energy, and so on).
    These methods have been defined in order to establish a common interface amongst all event generators in STAR.  They set variables defined within the class, from which you will initialize your concrete event generator.
    You may need to implement additional methods in order to expose the configuration of your event generator.  You should, of course, do this.

    The two methods which StarGenerator requires you to implement are Init() and Generate().  These methods will respectively be called at the start of each run, and during each event.

    Init() is responsible for initializing the event generator.  In this method, you should pass any of the configuration information on to your concrete event generator.  This may be through calls to subroutines in your event generator, or by setting values in common blocks.  However this is done, this is the place to do it.

    Generate() will be called on every single event during the run.   This is where you should exercise the generation machinery of your event generator.  Every event generator handles this differently, so you will need to consult your manual to figure out the details.

    Once Generate() has been called, you are ready to fill the event record.  The event record consists of two parts: (1) the particle record, and (2) the event-wise information describing the physical interaction being simulated.  At a minimum, you will need to fill the particle-wise information.  For more details, see The STAR Event Record  below.
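    As a rough skeleton of such an interface class (the class name, return conventions, and comments are illustrative; follow an existing generator such as StarGenerator/Pythia8_1_62 for the real patterns):

    // StarMyGenerator.h -- a sketch, not a drop-in implementation
    #include "StarGenerator/BASE/StarGenerator.h"

    class StarMyGenerator : public StarGenerator
    {
    public:
      StarMyGenerator(const char* name = "MyGenerator") : StarGenerator(name) { }

      Int_t Init()
      {
        // Pass the stored configuration (frame, beam species, energy, ...)
        // on to the concrete event generator, e.g. through its
        // initialization subroutine or by setting values in common blocks.
        return kStOK;
      }

      Int_t Generate()
      {
        // Exercise the generation machinery for one event, then loop over
        // the particles it produced and push each one onto the event
        // record (see "The STAR Event Record" below).
        return kStOK;
      }
    };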


    2.0 Integrating FORtran Event Generators

    Interfacing a FORtran event generator with ROOT involves three steps:

    1. Interface the event generator's common blocks (at least the ones which we need to use) to C++
    2. Map those common blocks onto C++ structures
    3. Expose the C++ structures representing the common blocks to ROOT so that users may modify / access their contents

    Let's look at the pythia event generator for a concrete example. 

    If you examine the code in StRoot/StarGenerator/Pythia6_4_23/ there is a FORtran file named address.F.  Open that up in your favorite editor and have a look... You'll see several functions defined.  The first one is address_of_pyjets.  In it we declare the PYJETS common block, essentially just cutting and pasting the declaration from the pythia source code in the same directory.
    We use the intrinsic function LOC to return the address (i.e. pointer) to the first variable in the common block.  We have just created a FORtran function which returns a pointer to the data stored in this common block.  The remaining functions in address.F simply expose the remaining common blocks in pythia which we want access to.  By calling this function on the C++ side, we will obtain a pointer to the memory address where the common block resides.

    Next we need to describe the memory layout to C++.  This is done in the file Pythia6.h.  Each common block setup in address.F has a corresponding structure defined in this header file.  So, let's take a look at the setup for the PyJets common block:
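    As a sketch, based on the PYJETS common block in Pythia 6 (N, NPAD, K(4000,5), P(4000,5), V(4000,5)), the setup looks roughly like this (the actual Pythia6.h in the repository is the authoritative version):

    #define address_of_pyjets F77_NAME( address_of_pyjets, ADDRESS_OF_PYJETS )

    struct PyJets_t {
      Int_t    n;            // INTEGER N
      Int_t    npad;         // INTEGER NPAD
      Int_t    _k[5][4000];  // INTEGER K(4000,5) -- indices reversed in C
      Double_t _p[5][4000];  // DOUBLE PRECISION P(4000,5)
      Double_t _v[5][4000];  // DOUBLE PRECISION V(4000,5)
    };

    extern "C" PyJets_t *address_of_pyjets();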




    First, notice the first line where we call the c-preprocessor macro "F77_NAME".  This line handles, in a portable way, the different conventions between FORtran and C++ compilers, when linking together object codes. 

    Next, let's discuss "memory layout".  In this section of the code we map the FORtran memory onto a C-structure.   Every variable in the common block should be declared in the same order in the C struct as it was declared in FORtran, and with the corresponding C data type.  These are:
    INTEGER          --> Int_t
    REAL             --> Float_t
    REAL *4          --> Float_t 
    REAL *8          --> Double_t 
    DOUBLE PRECISION --> Double_t
    You probably noticed that there are two differences in the way we have declared the arrays.  First, the arrays were all declared with an "_" in front of their names.  This was a choice on my part, which I will explain in a moment.  The important thing to notice right now is that the indices on the arrays are reversed, compared to their declaration in FORtran.  "INTEGER K(4000,5)" in FORtran becomes "Int_t _k[5][4000]" in C++.  The reason for this is that C++ and FORtran represent arrays differently in memory.  It is important to keep these differences in mind when mapping the memory of a FORtran common block --

    1) The indices in the arrays will always be reversed between FORtran and C --   A(10,20,30) in FORtran becomes A[30][20][10] in C.
    2) FORtran indices (by default) start from 1, C++ (always) from 0 --  i.e. B(1) in FORtran would be B[0] in C.
    3) FORtran indices may start from any value.  An array declared as D(-10:10) would be declared in C as D[21], and D(-10) in FORtran is D[0] in C.

    What about the underscore?

    We need to make some design choices at this point.  Specifically, how do we expose the common blocks to the end user?  Do we want the end user to deal with the differences in C++ and FORtran, or do we want to provide a mechanism by which the FORtran behavior (i.e. count from 1, preserve the order of indices) can be emulated.

    My preference is to do the latter -- provide the end user with functions which emulate the behavior of the FORtran arrays, because these arrays are what is documented in the event generator's manual.  This will minimize the likelihood that the end user will make mistakes in configuring the event generator.
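    For instance (a sketch; the accessors in the real Pythia6.h follow the same idea), member functions inside PyJets_t can restore the FORtran conventions, so the user writes k(i,j) exactly as the manual documents K(I,J):

    // Emulate FORtran's K(I,J): count from 1, keep the original index order
    Int_t&    k( int i, int j ){ return _k[j-1][i-1]; }
    Double_t& p( int i, int j ){ return _p[j-1][i-1]; }
    Double_t& v( int i, int j ){ return _v[j-1][i-1]; }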


    So we have created a c-struct which describes how the common block's memory is laid out, and we have defined a function in the FORtran library which returns the location of the memory address of the common block.  Now we need to expose that function to C++.  To do that, we need to declare a prototype of the function.  There are two things we need to do.

    First, we need to define the name of the subroutine in a portable way.  This is done using a macro defined in #include "StarCallf77.h" --

    #define address_of_pyjets F77_NAME( address_of_pyjets, ADDRESS_OF_PYJETS )

    Next we need to declare to C++ that address_of_pyjets can be found in an external library, and will return a pointer to the PyJets_t structure

    extern "C" PyJets_t *address_of_pyjets();

    Now we are almost done.  We need to add a function in our generator class which returns a pointer (or reference) to the common blocks, and we need to add PyJets_t to the ROOT dictionary... In MyGeneratorLinkDef.h, add the line

    #pragma link C++ struct PyJets_t+;

    Finally, you need to expose the FORtran subroutines to C++.  Again, take a look at the code in Pythia6.h and Pythia6.cxx.  In the header we declare wrapper functions around the FORtran subroutines, and in the implementation file we expose the FORtran subroutines to C++.

    Our first step is declaring the prototypes of the subroutines and implementing the C++ interface.  Consider the SUBROUTINE PYINIT in pythia, which initializes the event generator.  In FORtran it is declared as

          SUBROUTINE PYINIT(FRAME,BEAM,TARGET,WIN)
          IMPLICIT DOUBLE PRECISION(A-H, O-Z)
          IMPLICIT INTEGER(I-N)
    ...
          CHARACTER*(*) FRAME,BEAM,TARGET


    So the variables FRAME, BEAM and TARGET are declared as character variables, and WIN is implicitly a double precision variable.

    There are several webpages which show how to interface fortran and c++, e.g. http://www.yolinux.com/TUTORIALS/LinuxTutorialMixingFortranAndC.html

    It is really a system-dependent thing.  We're keeping it simple and only supporting Linux. 

    #define pyinit F77_NAME(pyinit,PYINIT) /* pythia initialization */
    extern "C" void   type_of_call  pyinit( const char *frame,
                                            const char *beam,
                                            const char *targ,
                                            double *ener,
                                            int nframe,
                                            int nbeam,
                                            int ntarg );

    So there are three character variables declared: frame, beam and targ.  These correspond to the character variables on the FORtran side.  There is also a double precision variable ener... this is WIN.  But then there are three integer variables nframe, nbeam and ntarg.  FORtran expects to get the sizes of the character variables when the subroutine is called, and it gets them after the last arguments in the list.

    Again, we would like to hide this from the end user... so I like to define wrapper functions such as

    #include <string>

    void PyInit( string frame, string blue, string yellow, double energy )
    {
       pyinit( frame.c_str(), blue.c_str(), yellow.c_str(), &energy,
               frame.size(),
               blue.size(),
               yellow.size() );
    }


    3.0 The STAR Event Record

    List of Event Generators

    Event generators currently integrated into starsim using the root4star framework (11/29/12):

    • Pythia 6.4.23
    • Pythia 8.1.62
    • Hijing 1.383
    • Herwig 6.5.20
    • StarLight
    • Pepsi

    To run, checkout StRoot/StarGenerator/macros and modify the appropriate example ROOT macro for your purposes.

    Event generators currently implemented in the starsim framework (11/29/12):

    • Hijing 1.381
    • Hijing 1.382
    • Pythia 6.2.05
    • Pythia 6.2.20
    • Pythia 6.4.10
    • Pythia 6.4.22
    • Pythia 6.4.26
    • StarLight (fortran)

    The STAR Event Record

    Geometry Tags



    Geometry Tag y2000
       CaveGeo   PipeGeo   UpstGeo   SvttGeo   tpcegeo
      BtofGeo2   CalbGeo   ZcalGeo   MagpGeo


    Geometry Tag y2001
       CaveGeo   PipeGeo   UpstGeo  SvttGeo1   tpcegeo
       FtpcGeo  SupoGeo1  BtofGeo2   VpddGeo   CalbGeo
       richgeo   EcalGeo   ZcalGeo   MagpGeo


    Geometry Tag y2002
       CaveGeo   PipeGeo   UpstGeo   SvttGeo   tpcegeo
       FtpcGeo   SupoGeo  BtofGeo2   VpddGeo   CalbGeo
       richgeo   BbcmGeo   fpdmgeo   ZcalGeo   MagpGeo


    Geometry Tag y2003
       CaveGeo   PipeGeo   UpstGeo   SvttGeo   tpcegeo
       FtpcGeo   SupoGeo  BtofGeo2   VpddGeo   CalbGeo
       EcalGeo   BbcmGeo   fpdmgeo   ZcalGeo   MagpGeo


    Geometry Tag y2003x
       CaveGeo   PipeGeo   UpstGeo  SvttGeo2   tpcegeo
       FtpcGeo  SupoGeo1  BtofGeo2   VpddGeo   CalbGeo
       EcalGeo   BbcmGeo   fpdmgeo   ZcalGeo   MagpGeo
       PhmdGeo


    Geometry Tag y2004a
       CaveGeo   PipeGeo   UpstGeo  SvttGeo3   tpcegeo
       FtpcGeo  SupoGeo1  BtofGeo2   VpddGeo  CalbGeo1
       EcalGeo   BbcmGeo  FpdmGeo1   ZcalGeo   MagpGeo
       SisdGeo   PhmdGeo


    Geometry Tag y2004c
       CaveGeo   PipeGeo   UpstGeo  SvttGeo4  TpceGeo1
       FtpcGeo  SupoGeo1  BtofGeo2   VpddGeo  CalbGeo1
       EcalGeo   BbcmGeo  FpdmGeo1   ZcalGeo   MagpGeo
      SisdGeo1   PhmdGeo


    Geometry Tag y2004y
       CaveGeo   PipeGeo   UpstGeo  SvttGeo4  TpceGeo1
      FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo2   VpddGeo
       CalbGeo   EcalGeo   BbcmGeo  FpdmGeo1   ZcalGeo
       MagpGeo  SisdGeo2   PhmdGeo


    Geometry Tag y2005
       CaveGeo   PipeGeo   UpstGeo  SvttGeo3   tpcegeo
      FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo2   VpddGeo
      CalbGeo1   EcalGeo   BbcmGeo  FpdmGeo1   ZcalGeo
       MagpGeo  SisdGeo2   PhmdGeo


    Geometry Tag y2005b
       CaveGeo   PipeGeo   UpstGeo  SvttGeo4  TpceGeo1
      FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo2   VpddGeo
      CalbGeo1   EcalGeo   BbcmGeo  FpdmGeo1   ZcalGeo
       MagpGeo  SisdGeo2   PhmdGeo


    Geometry Tag y2005f
       CaveGeo   PipeGeo   UpstGeo  SvttGeo6  TpceGeo1
      FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo4   VpddGeo
      CalbGeo2   EcalGeo   BbcmGeo  FpdmGeo1   ZcalGeo
       MagpGeo  SisdGeo6   PhmdGeo


    Geometry Tag y2005g
       CaveGeo   PipeGeo   UpstGeo SvttGeo11  TpceGeo1
      FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo4   VpddGeo
      CalbGeo2   EcalGeo   BbcmGeo  FpdmGeo1   ZcalGeo
       MagpGeo  SisdGeo6   PhmdGeo


    Geometry Tag y2005h
       CaveGeo   PipeGeo   UpstGeo SvttGeo11  tpcegeo3
      FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo4   VpddGeo
      CalbGeo2   EcalGeo   BbcmGeo  FpdmGeo1   ZcalGeo
       MagpGeo  SisdGeo6   PhmdGeo


    Geometry Tag y2005i
       CaveGeo   PipeGeo   UpstGeo SvttGeo11  tpcegeo3
      FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo4   VpddGeo
      CalbGeo2  EcalGeo6   BbcmGeo  FpdmGeo1   ZcalGeo
       MagpGeo  SisdGeo6   PhmdGeo


    Geometry Tag y2006
       CaveGeo   PipeGeo   UpstGeo  SvttGeo6  TpceGeo2
      FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo4   VpddGeo
      CalbGeo1   EcalGeo   BbcmGeo  FpdmGeo1   ZcalGeo
       MagpGeo  SisdGeo3   MutdGeo   PhmdGeo


    Geometry Tag y2006c
       CaveGeo   PipeGeo   UpstGeo  SvttGeo6  TpceGeo2
      FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo4   VpddGeo
      CalbGeo2   EcalGeo   BbcmGeo  FpdmGeo2   ZcalGeo
       MagpGeo  SisdGeo6   MutdGeo


    Geometry Tag y2006g
       CaveGeo   PipeGeo   UpstGeo SvttGeo11  TpceGeo2
      FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo4   VpddGeo
      CalbGeo2   EcalGeo   BbcmGeo  FpdmGeo2   ZcalGeo
       MagpGeo  SisdGeo6   MutdGeo


    Geometry Tag y2006h
       CaveGeo   PipeGeo   UpstGeo SvttGeo11  tpcegeo3
      FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo4   VpddGeo
      CalbGeo2  EcalGeo6   BbcmGeo  FpdmGeo2   ZcalGeo
       MagpGeo  SisdGeo6   MutdGeo


    Geometry Tag y2007
       CaveGeo   PipeGeo   UpstGeo  SvttGeo6  TpceGeo2
      FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo5  VpddGeo2
      CalbGeo2   EcalGeo   BbcmGeo  FpdmGeo3   ZcalGeo
       MagpGeo  SisdGeo6   MutdGeo   PhmdGeo


    Geometry Tag y2007g
       CaveGeo   PipeGeo   UpstGeo SvttGeo11  TpceGeo2
      FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo5  VpddGeo2
      CalbGeo2   EcalGeo   BbcmGeo  FpdmGeo3   ZcalGeo
       MagpGeo  SisdGeo6   MutdGeo   PhmdGeo


    Geometry Tag y2007h
       CaveGeo   PipeGeo   UpstGeo SvttGeo11  tpcegeo3
      FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo5  VpddGeo2
      CalbGeo2   EcalGeo   BbcmGeo  FpdmGeo3   ZcalGeo
       MagpGeo  SisdGeo6   MutdGeo   PhmdGeo


    Geometry Tag y2008
       CaveGeo   PipeGeo   UpstGeo  TpceGeo2  FtpcGeo1
      SupoGeo1   FtroGeo  BtofGeo6  VpddGeo2  CalbGeo2
       EcalGeo   BbcmGeo  FpdmGeo3   ZcalGeo   MagpGeo
      MutdGeo3


    Geometry Tag y2008a
       CaveGeo   PipeGeo   UpstGeo   SconGeo  TpceGeo2
      FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo6  VpddGeo2
      CalbGeo2   EcalGeo   BbcmGeo  FpdmGeo3   ZcalGeo
       MagpGeo  MutdGeo3


    Geometry Tag y2008b
       CaveGeo   PipeGeo   UpstGeo   SconGeo  tpcegeo3
      FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo6  VpddGeo2
      CalbGeo2  EcalGeo6   BbcmGeo  FpdmGeo3   ZcalGeo
       MagpGeo  MutdGeo3


    Geometry Tag y2008c
       CaveGeo   PipeGeo   UpstGeo   SconGeo  tpcegeo3
      FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo7  VpddGeo2
      CalbGeo2  EcalGeo6   BbcmGeo  FpdmGeo3   ZcalGeo
       MagpGeo  MutdGeo3


    Geometry Tag y2008d
       CaveGeo   PipeGeo   UpstGeo   SconGeo  tpcegeo3
      FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo7  VpddGeo2
      CalbGeo2  EcalGeo6   BbcmGeo  FpdmGeo3   ZcalGeo
       MagpGeo  MutdGeo3


    Geometry Tag y2008e
       CaveGeo   PipeGeo   UpstGeo   SconGeo  tpcegeo3
      FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo7  VpddGeo2
      CalbGeo2  EcalGeo6   BbcmGeo  FpdmGeo3   ZcalGeo
       MagpGeo  MutdGeo3


    Geometry Tag y2009
       EcalGeo   PipeGeo  FpdmGeo3  BtofGeo6  VpddGeo2
      FtpcGeo1   FtroGeo  MutdGeo3 TpceGeo3a  CalbGeo2
       SconGeo   UpstGeo   ZcalGeo   CaveGeo   MagpGeo
       BbcmGeo


    Geometry Tag y2009a
      EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo6  VpddGeo2
      FtpcGeo1   FtroGeo  MutdGeo3 TpceGeo3a  CalbGeo2
       SconGeo   UpstGeo   ZcalGeo   CaveGeo   MagpGeo
       BbcmGeo


    Geometry Tag y2009b
      EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo6  VpddGeo2
      FtpcGeo1   FtroGeo  MutdGeo3 TpceGeo3a  CalbGeo2
       SconGeo   UpstGeo   ZcalGeo   CaveGeo   MagpGeo
       BbcmGeo


    Geometry Tag y2009c
      EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
      FtpcGeo1   FtroGeo  MutdGeo3 TpceGeo3a  CalbGeo2
       SconGeo   UpstGeo   ZcalGeo   CaveGeo   MagpGeo
       BbcmGeo


    Geometry Tag y2009d
      EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
      FtpcGeo1   FtroGeo  MutdGeo3 TpceGeo3a  CalbGeo2
       SconGeo   UpstGeo   ZcalGeo   CaveGeo   MagpGeo
       BbcmGeo


    Geometry Tag y2010
      EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo6  VpddGeo2
      FtpcGeo1   FtroGeo  MutdGeo3 TpceGeo3a  CalbGeo2
       SconGeo   PhmdGeo   UpstGeo   ZcalGeo   CaveGeo
       MagpGeo   BbcmGeo


    Geometry Tag y2010a
      EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo6  VpddGeo2
      FtpcGeo1   FtroGeo  MutdGeo3 TpceGeo3a  CalbGeo2
       SconGeo   PhmdGeo   UpstGeo   ZcalGeo   CaveGeo
       MagpGeo   BbcmGeo


    Geometry Tag y2010b
      EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
      FtpcGeo1   FtroGeo  MutdGeo3 TpceGeo3a  CalbGeo2
       SconGeo   PhmdGeo   UpstGeo   ZcalGeo   CaveGeo
       MagpGeo   BbcmGeo


    Geometry Tag y2010c
      EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
      FtpcGeo1   FtroGeo  MutdGeo3 TpceGeo3a  CalbGeo2
       SconGeo   PhmdGeo   UpstGeo   ZcalGeo   CaveGeo
       MagpGeo   BbcmGeo


    Geometry Tag y2011
      EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
      FtpcGeo1   FtroGeo  MutdGeo4 TpceGeo3a  CalbGeo2
       SconGeo   PhmdGeo   UpstGeo   ZcalGeo   CaveGeo
       MagpGeo   BbcmGeo


    Geometry Tag y2011a
      EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
      FtpcGeo1   FtroGeo  MutdGeo4 TpceGeo3a  CalbGeo2
       SconGeo   PhmdGeo   UpstGeo   ZcalGeo   CaveGeo
       MagpGeo   BbcmGeo


    Geometry Tag y2012
      EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
      MutdGeo4 TpceGeo3a  CalbGeo2   UpstGeo   ZcalGeo
       CaveGeo   MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1


    Geometry Tag y2012a
      EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
      MutdGeo4 TpceGeo3a  CalbGeo2   UpstGeo   ZcalGeo
       CaveGeo   MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1


    Geometry Tag y2012b
      EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
      MutdGeo4 TpceGeo3a  CalbGeo2   UpstGeo   ZcalGeo
       CaveGeo   MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1


    Geometry Tag y2013
      EcalGeo6  PipeGeo2  FpdmGeo3  BtofGeo8  VpddGeo2
      MutdGeo4 TpceGeo3a  CalbGeo2   UpstGeo   ZcalGeo
       CaveGeo   MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1
      PixlGeo5  PxstGeo1  DtubGeo1


    Geometry Tag y2013_1
      EcalGeo6  PipeGeo2  FpdmGeo3  BtofGeo8  VpddGeo2
      MutdGeo4 TpceGeo3a  CalbGeo2   UpstGeo   ZcalGeo
       CaveGeo   MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1
      PixlGeo5  PxstGeo1  DtubGeo1


    Geometry Tag y2013_2
      EcalGeo6  PipeGeo2  FpdmGeo3  BtofGeo8  VpddGeo2
      MutdGeo4 TpceGeo3a  CalbGeo2   UpstGeo   ZcalGeo
       CaveGeo   MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1
      PxstGeo1


    Geometry Tag y2013_1x
      EcalGeo6  PipeGeo2  FpdmGeo3  BtofGeo8  VpddGeo2
      MutdGeo4 TpceGeo3a  CalbGeo2   UpstGeo   ZcalGeo
      CaveGeo2   MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1
      PixlGeo5  PxstGeo1  DtubGeo1


    Geometry Tag y2013x
      EcalGeo6  PipeGeo2  FpdmGeo3  BtofGeo8  VpddGeo2
      MutdGeo4 TpceGeo3a  CalbGeo2   UpstGeo   ZcalGeo
      CaveGeo2   MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1
      PixlGeo5  PxstGeo1  DtubGeo1


    Geometry Tag y2013_2x
      EcalGeo6  PipeGeo2  FpdmGeo3  BtofGeo8  VpddGeo2
      MutdGeo4 TpceGeo3a  CalbGeo2   UpstGeo   ZcalGeo
      CaveGeo2   MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1
      PxstGeo1


    Geometry Tag dev14
      EcalGeo6  PipeGeo1  FpdmGeo3  BtofGeo7  VpddGeo2
      MutdGeo4 TpceGeo3a  CalbGeo2   CaveGeo   BbcmGeo
      SisdGeo7  FgtdGeo3  IdsmGeo1  PixlGeo4  IstdGeo0
      PxstGeo1


    Geometry Tag complete
      EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
      MutdGeo4 TpceGeo3a  CalbGeo2   PhmdGeo   UpstGeo
       ZcalGeo   CaveGeo   MagpGeo   BbcmGeo  FgtdGeo3
      IdsmGeo1


    Geometry Tag devT
      EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
      MutdGeo4  CalbGeo2   UpstGeo   ZcalGeo   CaveGeo
       MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1   FsceGeo
       EiddGeo  TpcxGeo1


    Geometry Tag eStar2
      EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
      MutdGeo4  CalbGeo2   UpstGeo   ZcalGeo   CaveGeo
       MagpGeo   BbcmGeo  FgtdGeoV  IdsmGeo1   FsceGeo
       EiddGeo  TpcxGeo2

    Material Budget Y2013

Material budget in the Y2013 (X) geometry.  The top left plot shows the number of radiation lengths encountered by a straight track at the given eta and phi.  The top right (bottom left) plot compares the ROOT and STARSIM geometries generated by AgML, plotted vs phi (eta); each is averaged over the other variable.  The ROOT geometry is shown in black, STARSIM in red.  The bottom right plot shows the difference between the ROOT and STARSIM geometries vs phi and eta.  Integrated over the entire cave, the difference is less than 0.01 radiation lengths.

Attached are material budget plots and differences for the major subsystems.  Each PDF contains material budget plots displaying the number of radiation lengths, averaged over all phi, for the ROOT (left) and STARSIM (right) geometries created by AgML.  The material difference plot is as described above.

    Miscellaneous production scripts

This page was created to systematize the various scripts currently used in Monte Carlo production and testing. The contents will be updated as needed; the codes are presumed to be correct and working at any given time.

    Jobs catalog

When running on rcas, we typically use a legacy csh script named "alljobs". It parses the job configuration file named "catalog" and dispatches a single job on the target node, which can be an interactive node if run interactively, or a batch node if submitted to a queue. The alljobs script expects the following directory structure: a writeable directory with the name of the dataset being produced and, directly under it, a writeable "log" directory, in which it deposits the so-called token files. These serve two purposes:

    • help in sequential numbering of the output files
• in the case of multiple input files (such as Hijing event files), allow N input files to be mapped to M jobs, thus managing the workload

The catalog file is effectively a table in white-space-separated format. Each line begins with the dataset name, which is a three-letter acronym of the site name (thus either rcf or pds) followed by a 4-digit serial number of the set. The alljobs script expects to find a directory named identically to the dataset under the "job directory", which in the current version of the script is hardcoded as /star/simu/simu/gstardata. This, of course, can be improved or changed.

The last field in each entry is used to construct the so-called tag, which plays an important role: it effectively defines the location of the Monte Carlo data in HPSS when the data are archived there (this is done by a separate script). In addition, it defines keys for the entries in the FileCatalog (reconstructed data). The alljobs script creates a file of zero length whose name is a period-separated concatenation of the word "tag" and the contents of the last column in the line.
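For example, taking the first entry in the listing below (dataset rcf0101), alljobs would expect the dataset directory under the hardcoded job directory and would create a zero-length tag file named after the last column. The layout sketched here is illustrative; the exact placement of the token and tag files follows the script's own conventions:

    /star/simu/simu/gstardata/
        rcf0101/
            log/
                tag.auau200.nexus.default.b0_3.year_1h.hadronic_on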

Here are the contents of the catalog file as it existed from the late 1990s to the end of 2006:

    rcf0101 auau200/nexus/default/central evgen.*.nt auau200.nexus.default.b0_3.year_1h.hadronic_on
    rcf0105 auau200/nexus/default/minbias evgen.*.nt auau200.nexus.default.minbias.year_1h.hadronic_on
    pds0101 auau200/vni/default/central evgen.*.nt auau200.vni.default.b0_3.year_1h.hadronic_on
    pds0102 auau200/vni/default/central evgen.*.nt auau200.vni.default.b0_3.year_1h.hadronic_on
    pds0103 auau200/vni/default/central evgen.*.nt auau200.vni.default.b0_3.year_1h.hadronic_on
    pds0104 auau200/vni/default/central evgen.*.nt auau200.vni.default.b0_3.year_1h.hadronic_on
    rcf0096 auau200/vni/default/central evgen.*.nt auau200.vni.default.b0_3.year_1h.hadronic_on

    pds0105 auau200/mevsim/vanilla_trigger/central evgen.*.nt auau200.mevsim.vanilla.trigger.year_1h.hadronic_on
    rcf0097 auau200/mevsim/vanilla_resonance/central evgen.*.nt auau200.mevsim.vanilla.resonance.year_1h.hadronic_on
    rcf0098 auau200/mevsim/vanilla_trigger/central evgen.*.nt auau200.mevsim.vanilla.trigger.year_1h.hadronic_on
    rcf0095 auau200/mevsim/vanilla_flow/central evgen.*.nt auau200.mevsim.vanilla.flow.year_1h.hadronic_on
    rcf0099 auau200/mevsim/vanilla_fluct/central evgen.*.nt auau200.mevsim.vanilla.fluct.year_1h.hadronic_on
    rcf0102 auau200/mevsim/vanilla/central evgen.*.nt auau200.mevsim.vanilla.central.year_1h.hadronic_on
    rcf0103 auau200/mevsim/vanilla/central evgen.*.nt auau200.mevsim.vanilla.central.year_1h.hadronic_on
    rcf0104 auau200/mevsim/vanilla_flow/central evgen.*.nt auau200.mevsim.vanilla.flow.year_1h.hadronic_on
    rcf0100 auau200/mevsim/cascade/central evgen.*.nt auau200.mevsim.cascade.central.year_1h.hadronic_on

    rcf0106 auau200/hbt/default/peripheral evgen.*.nt auau200.hbt.default.peripheral.year_1h.hadronic_on
    rcf0107 auau200/hbt/default/midperipheral evgen.*.nt auau200.hbt.default.midperipheral.year_1h.hadronic_on
    rcf0108 auau200/hbt/default/middle evgen.*.nt auau200.hbt.default.middle.year_1h.hadronic_on
    rcf0109 auau200/hbt/default/midcentral evgen.*.nt auau200.hbt.default.midcentral.year_1h.hadronic_on
    rcf0110 auau200/hbt/default/central evgen.*.nt auau200.hbt.default.central.year_1h.hadronic_on

    rcf0111 none hijing.*.xdf auau200.hijing.b0_3_jetq_on.jet05.year_1h.hadronic_on
    rcf0112 none hijing.*.xdf auau200.hijing.b0_3_jetq_off.jet05.year_1h.hadronic_on
    rcf0113 none hijing.*.xdf auau200.hijing.b8_15_jetq_on.jet05.year_1h.hadronic_on
    rcf0114 none hijing.*.xdf auau200.hijing.b8_15_jetq_off.jet05.year_1h.hadronic_on
    rcf0115 none hijing.*.xdf auau200.hijing.b0_3_jetq_on.jet05.year_1h.hadronic_on
    rcf0116 none hijing.*.xdf auau200.hijing.b0_3_jetq_off.jet05.year_1h.hadronic_on
    rcf0117 none hijing.*.xdf auau200.hijing.b8_15_jetq_on.jet05.year_1h.hadronic_on
    rcf0118 none hijing.*.xdf auau200.hijing.b8_15_jetq_off.jet05.year_1h.hadronic_on
    rcf0119 none hijing.*.xdf pau200_hijing_b0_7_jet15_year_1h.hadronic_on
    rcf0120 none hijing.*.xdf pau200_hijing_b0_7_gam15_year_1h_hadronic_on

    rcf0121 pec/dtunuc dtu*.xdr auau200.dtunuc.two_photon.none.year_1h.hadronic_on
    rcf0122 pec/starlight starlight_vmeson_*.t auau200.starlight.vmeson.none.year_1h.hadronic_on
    rcf0123 pec/starlight starlight_2gamma_*.nt auau200.starlight.2gamma.none.year_1h.hadronic_on
    rcf0124 pec/hemicosm events.txt auau200.hemicosm.default.none.year_1h.hadronic_on

    rcf0125 pec/dtunuc dtu*.xdr auau200.dtunuc.two_photon.halffield.year_1h.hadronic_on
    rcf0126 pec/starlight starlight_vmeson_*.t auau200.starlight.vmeson.halffield.year_1h.hadronic_on
    rcf0127 pec/starlight starlight_2gamma_*.t auau200.starlight.2gamma.halffield.year_1h.hadronic_on

    rcf0131 pec/beamgas venus.h.*.nt auau200.hijing.beamgas.hydrogen.year_1h.hadronic_on
    rcf0132 pec/beamgas venus.n.*.nt auau200.hijing.beamgas.nitrogen.year_1h.hadronic_on

    rcf0139 none hijev.inp auau128.hijing.b0_12.halffield.year_1e.hadronic_on
    rcf0140 none hijev.inp auau128.hijing.b0_3.halffield.year_1e.hadronic_on

    rcf0141 auau200/strongcp/broken/eb_400_90 evgen.*.nt auau200.strongcp.broken.eb-400_90.year_1h.hadronic_on
    rcf0142 auau200/strongcp/broken/eb_400_00 evgen.*.nt auau200.strongcp.broken.eb-400_00.year_1h.hadronic_on
    rcf0143 auau200/strongcp/broken/lr_eb_400_90 evgen.*.nt auau200.strongcp.broken.lr_eb_400_90.year_1h.hadronic_on

    rcf0145 none hijev.inp auau130.hijing.b0_3.jet05.year_1h.halffield.hadronic_on
    rcf0146 none hijev.inp auau130.hijing.b0_15.default.year_1h.halffield.hadronic_on
    rcf0147 none hijev.inp auau130.hijing.b0_3.default.year_1e.halffield.hadronic_on
    rcf0148 none hijev.inp auau130.hijing.b3_6.default.year_1e.halffield.hadronic_on

    rcf0151 auau130/mevsim/vanilla_flow/central evgen.*.nt auau130.mevsim.vanilla_flow.central.year_1e.hadronic_on
    rcf0152 auau130/mevsim/vanilla_trigger/central evgen.*.nt auau130.mevsim.vanilla_trigger.central.year_1e.hadronic_on
    rcf0153 auau130/mevsim/vanilla_dynamic/central evgen.*.nt auau130.mevsim.vanilla_dynamic.central.year_1e.hadronic_on
    rcf0154 auau130/mevsim/vanilla_omega/central evgen.*.nt auau130.mevsim.vanilla_omega.central.year_1e.hadronic_on
    rcf0155 auau130/mevsim/vanilla/central evgen.*.nt auau130.mevsim.vanilla.central.year_1e.hadronic_on
    rcf0156 auau130/nexus/default/central evgen.*.nt auau130.nexus.default.b0_3.year_1e.hadronic_on

    rcf0159 rqmd auau_b0-14.*.cwn auau200.rqmd.default.b0_14.year_1h.hadronic_on
    rcf0160 rqmd auau_b0-15.*.cwn auau200.rqmd.default.b0_15.year_1h.hadronic_on

    rcf0161 auau130/mevsim/vanilla_flow/central evgen.*.nt auau130.mevsim.vanilla_flow.central.year_1h.hadronic_on
    rcf0162 auau130/mevsim/vanilla_trigger/central evgen.*.nt auau130.mevsim.vanilla_trigger.central.year_1h.hadronic_on
    rcf0163 auau130/mevsim/vanilla_dynamic/central evgen.*.nt auau130.mevsim.vanilla_dynamic.central.year_1h.hadronic_on
    rcf0164 auau130/mevsim/vanilla_omega/central evgen.*.nt auau130.mevsim.vanilla_omega.central.year_1h.hadronic_on
    rcf0165 auau130/mevsim/vanilla/central evgen.*.nt auau130.mevsim.vanilla.central.year_1h.hadronic_on
    rcf0166 auau130/mevsim/vanilla_resonance/central evgen.*.nt auau130.mevsim.vanilla_resonance.central.year_1h.hadronic_on
    pds0167 auau130/mevsim/vanilla_cocktail/central evgen.*.nt auau130.mevsim.vanilla_cocktail.central.year_1h.hadronic_on
    rcf0168 auau130/mevsim/vanilla_flow/mbias evgen.*.nt auau130.mevsim.vanilla_flow.minbias.year_1h.hadronic_on
    rcf0169 auau130/mevsim/vanilla_flowb/central evgen.*.nt auau130.mevsim.vanilla_flowb.central.year_1h.hadronic_on

    rcf0171 auau130/mevsim/vanilla_lambda_antilambda/central evgen.*.nt auau130.mevsim.vanilla_both_lambda.central.year_1h.hadronic_on
    rcf0172 auau130/mevsim/vanilla_lambda/central evgen.*.nt auau130.mevsim.vanilla_lambda.central.year_1h.hadronic_on
    rcf0173 auau130/mevsim/vanilla_antilambda/central evgen.*.nt auau130.mevsim.vanilla_antilambda.central.year_1h.hadronic_on

    rcf0181 auau200/mevsim/mdc4/central evgen.*.nt auau200.mevsim.mdc4_cocktail.central.year2001.hadronic_on
    pds0182 auau200/mevsim/mdc4/central evgen.*.nt auau200.mevsim.mdc4_cocktail.central.year2001.hadronic_on

    rcf0183 none hijev.inp auau200.hijing.b0_20.standard.year2001.hadronic_on
    rcf0184 none hijev.inp auau200.hijing.b0_3.standard.year2001.hadronic_on

    rcf0190 auau200/mevsim/mdc4_electrons evgen.*.nt auau200.mevsim.mdc4_electrons.year2001.hadronic_on

    rcf0191 none hijev.inp auau200.hijing.b0_20.inverse.year2001.hadronic_on
    rcf0192 none hijev.inp auau200.hijing.b0_3.inverse.year2001.hadronic_on
    rcf0193 none hijev.inp dau200.hijing.b0_20.standard.year_2a.hadronic_on

    # Maxim has arrived:
    # the following two runs had the 1 6 setting for the hard scattering and energy
    rcf0194 none hijev.inp dau200.hijing.b0_20.jet06.year2003.hadronic_on
    pds0195 none hijev.inp dau200.hijing.b0_20.jet06.year2003.hadronic_on
    # this one had 1 3
    rcf0196 none hijev.inp dau200.hijing.b0_20.jet03.year2003.hadronic_on
    # standard 0 2 setting
    rcf0197 none hijev.inp dau200.hijing.b0_20.jet02.year2003.hadronic_on
    # new numbering
    rcf1197 none hijev.inp dau200.hijing.b0_20.minbias.year2003.hadronic_on
    rcf1198 dau200/hijing_382/b0_20/minbias evgen.*.nt dau200.hijing_382.b0_20.minbias.year2003.gheisha_on
    # dedicated wide Z run
    rcf1199 dau200/hijing_382/b0_20/minbias evgen.*.nt dau200.hijing_382.b0_20.minbias_wideZ.year2003.hadronic_on
    # Pythia
    rcf1200 none pyth.dat pp200.pythia6_203.default.minbias.year2003.hadronic_on
    # Heavy flavor embedding with full calorimeter
    rcf1201 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.y2003x.gheisha_on
    # Pythia hi Pt>5
    rcf1202 none pyth.dat pp200.pythia6_203.default.pt5.year2003.gheisha_on
    # Mevsim fitted to 200GeV AuAu
    rcf1203 auau200/mevsim/v2/central_6 evgen.*.nt auau200.mevsim.v2.b0_6.year_1e.gheisha_on
    # Mevsim fitted to 200GeV AuAu, different geo
    rcf1204 auau200/mevsim/v2/central_6 evgen.*.nt auau200.mevsim.v2.b0_6.year2001.gheisha_on
    # Pythia hi Pt>15
    rcf1205 none pyth.dat pp200.pythia6_203.default.pt15.year2003.gheisha_on
    # Starsim maiden voyage, with y2004, 62.4 GeV
    rcf1206 auau62/hijing_382/b0_20/minbias evgen.*.nt auau62.hijing_382.b0_20.minbias.y2004.gheisha_on
    # 62.4 GeV central
    rcf1207 auau62/hijing_382/b0_3/central evgen.*.nt auau62.hijing_382.b0_3.central.y2004a.gheisha_on
    pds1207 auau62/hijing_382/b0_3/central evgen.*.nt auau62.hijing_382.b0_3.central.y2004a.gheisha_on
    # 200 GeV minbias
    rcf1208 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.y2004a.gheisha_on
    # 200 GeV central
    rcf1209 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.y2004a.gheisha_on
    # Pythia
    rcf1210 none pyth.dat pp200.pythia6_203.default.minbias.y2004a.gheisha_on
    # Pythia Spin group
    rcf1211 none pyth.dat pp200.pythia6_203.default.minbias.y2004x.gheisha_on
    # Pythia Spin group
    pds1212 none pyth.dat pp200.pythia6_203.default.pt3.y2004x.gheisha_on
    # Pythia Spin group
    rcf1213 none pyth.dat pp200.pythia6_205.default.pt7.y2004x.gheisha_on
    # Pythia Spin group
    pds1214 none pyth.dat pp200.pythia6_203.default.pt15.y2004x.gheisha_on
    # 200 GeV minbias special D decays
    rcf1215 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.speciald.y2004a.gheisha_on
    # 200 GeV minbias copper
    rcf1216 cucu200/hijing_382/b0_14/minbias evgen.*.nt cucu200.hijing_382.b0_14.minbias.y2005x.gheisha_on
    # 200 GeV minbias copper test
    rcf1217 cucu200/hijing_382/b0_14/minbias evgen.*.nt cucu200.hijing_382.b0_14.minbias.y2004a.gheisha_on
    # 200 GeV central reprise of 1209, smaller diamond
    rcf1218 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.y2004a.gheisha_on
    # Pythia Special 1
    rcf1219 none pyth.dat pp200.pythia6_203.default.special1.y2004c.gheisha_on
    # Pythia Special 2 (CDF A)
    rcf1220 none pyth.dat pp200.pythia6_203.default.special2.y2004c.gheisha_on
    # Pythia Special 3
    rcf1221 none pyth.dat pp200.pythia6_203.default.special3.y2004c.gheisha_on
    # Pythia Special 2 4<Pt<5 Gheisha
    rcf1222 none pyth.dat pp200.pythia6_203.default.special2.y2004y.gheisha_on
    # Pythia Special 2 4<Pt<5 GCALOR
    rcf1223 none pyth.dat pp200.pythia6_203.default.special2.y2004y.gcalor_on
    # Pythia Special 2 (CDF A) 5-7 GeV 6/28/05
    rcf1224 none pyth.dat pp200.pythia6_205.5_7gev.cdf_a.y2004y.gheisha_on
    # Pythia Special 2 (CDF A) 7-9 GeV 6/28/05
    rcf1225 none pyth.dat pp200.pythia6_205.7_9gev.cdf_a.y2004y.gheisha_on
    # Pythia Special 2 (CDF A) 9-11 GeV 6/28/05
    rcf1226 none pyth.dat pp200.pythia6_205.9_11gev.cdf_a.y2004y.gheisha_on
    # Pythia Special 2 (CDF A) 11-15 GeV 6/28/05
    rcf1227 none pyth.dat pp200.pythia6_205.11_15gev.cdf_a.y2004y.gheisha_on
    # Pythia Special 2 (CDF A) 15-25 GeV 6/29/05
    rcf1228 none pyth.dat pp200.pythia6_205.15_25gev.cdf_a.y2004y.gheisha_on
    # Pythia Special 2 (CDF A) 25-35 GeV 6/29/05
    rcf1229 none pyth.dat pp200.pythia6_205.25_35gev.cdf_a.y2004y.gheisha_on
    # Pythia Special 2 (CDF A) > 35 GeV 6/29/05
    rcf1230 none pyth.dat pp200.pythia6_205.above_35gev.cdf_a.y2004y.gheisha_on
    # Pythia Special 2 (CDF A) 2-3 GeV 6/29/05
    rcf1231 none pyth.dat pp200.pythia6_205.2_3gev.cdf_a.y2004y.gheisha_on
    # Pythia Special 2 (CDF A) 3-4 GeV 6/30/05
    rcf1232 none pyth.dat pp200.pythia6_205.3_4gev.cdf_a.y2004y.gheisha_on
    # Pythia Special 2 (CDF A) 4-5 GeV 6/30/05
    rcf1233 none pyth.dat pp200.pythia6_205.4_5gev.cdf_a.y2004y.gheisha_on
    # Pythia Special 6/30/05
    rcf1234 none pyth.dat pp200.pythia6_205.low_energy.cdf_a.y2004y.gheisha_on
    # Pythia min bias 9/06/05
    rcf1235 none pyth.dat pp200.pythia6_205.min_bias.cdf_a.y2004y.gheisha_on
    # Herwig 5-7 GeV 9/07/05
    rcf1236 pp200/herwig6507/pt_5_7 evgen.*.nt pp200.herwig6507.5_7gev.special1.y2004y.gheisha_on
    # 62.4 GeV minbias copper
    rcf1237 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.minbias.y2005c.gheisha_on
    # 100 pi0 per event SVT in
    rcf1238 none run1238.kumac pi0.100per_event.200mev_15gev.svtt_on.y2005x.gheisha_on
    # 100 pi0 per event SVT out
    rcf1239 none run1239.kumac pi0.100per_event.200mev_15gev.svtt_off.y2005x.gheisha_on
    # 10 J/psi per event SVT in
    rcf1240 none run1240.kumac jpsi.10per_event.500mev_3gev.svtt_on.y2005x.gheisha_on
    # 10 J/psi per event SVT out
    rcf1241 none run1241.kumac jpsi.10per_event.500mev_3gev.svtt_off.y2005x.gheisha_on
    # 62.4 GeV minbias copper low EM cut
    rcf1242 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.low_em.y2005c.gheisha_on
    # 62.4 GeV minbias copper low EM and keep tracks
    rcf1243 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.low_em_keep.y2005c.gheisha_on
    # Herwig 9-11 GeV 10/13/05
    rcf1244 pp200/herwig6507/pt_9_11 evgen.*.nt pp200.herwig6507.9_11gev.special1.y2004y.gheisha_on
    # Herwig 11-15 GeV 10/13/05
    rcf1245 pp200/herwig6507/pt_11_15 evgen.*.nt pp200.herwig6507.11_15gev.special1.y2004y.gheisha_on
    # Herwig 15-25 GeV 10/13/05
    rcf1246 pp200/herwig6507/pt_15_25 evgen.*.nt pp200.herwig6507.15_25gev.special1.y2004y.gheisha_on
    # Herwig 25-35 GeV 10/13/05
    rcf1247 pp200/herwig6507/pt_25_35 evgen.*.nt pp200.herwig6507.25_35gev.special1.y2004y.gheisha_on
    # Herwig 35-45 GeV 10/13/05
    rcf1248 pp200/herwig6507/pt_35_45 evgen.*.nt pp200.herwig6507.35_45gev.special1.y2004y.gheisha_on
    # 200 GeV minbias
    rcf1249 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.y2005d.gheisha_on
    #
    # New Herwig Wave
    #
    # Herwig 9-11 GeV new header 12/14/05
    rcf1250 pp200/herwig6507/pt_9_11 evgen.*.nt pp200.herwig6507.9_11gev.special3.y2004y.gheisha_on
    # Herwig 11-15 GeV new header 11/10/05
    rcf1251 pp200/herwig6507/pt_11_15 evgen.*.nt pp200.herwig6507.11_15gev.special3.y2004y.gheisha_on
    # Herwig 15-25 GeV new header 12/19/05
    rcf1252 pp200/herwig6507/pt_15_25 evgen.*.nt pp200.herwig6507.15_25gev.special3.y2004y.gheisha_on
    # Herwig 25-35 GeV new header 12/19/05
    rcf1253 pp200/herwig6507/pt_25_35 evgen.*.nt pp200.herwig6507.25_35gev.special3.y2004y.gheisha_on
    # Herwig 35-100 GeV new header 12/19/05
    rcf1254 pp200/herwig6507/pt_35_100 evgen.*.nt pp200.herwig6507.35_100gev.special3.y2004y.gheisha_on
    # Herwig 2-3 GeV new header 12/14/05
    rcf1255 pp200/herwig6507/pt_2_3 evgen.*.nt pp200.herwig6507.2_3gev.special3.y2004y.gheisha_on
    # Herwig 3-4 GeV new header 12/14/05
    rcf1256 pp200/herwig6507/pt_3_4 evgen.*.nt pp200.herwig6507.3_4gev.special3.y2004y.gheisha_on
    # Herwig 4-5 GeV new header 12/21/05
    rcf1257 pp200/herwig6507/pt_4_5 evgen.*.nt pp200.herwig6507.4_5gev.special3.y2004y.gheisha_on
    # Herwig 5-7 GeV new header 12/21/05
    rcf1258 pp200/herwig6507/pt_5_7 evgen.*.nt pp200.herwig6507.5_7gev.special3.y2004y.gheisha_on
    # Herwig 7-9 GeV new header 12/21/05
    rcf1259 pp200/herwig6507/pt_7_9 evgen.*.nt pp200.herwig6507.7_9gev.special3.y2004y.gheisha_on
    #
    # Heavy flavor embedding with full calorimeter
    rcf1260 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.y2004y.gheisha_on
    # 200 GeV minbias copper
    rcf1261 cucu200/hijing_382/b0_14/minbias evgen.*.nt cucu200.hijing_382.b0_14.minbias.y2006.gheisha_on
    # 62.4 GeV minbias copper
    rcf1262 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.minbias.y2006.gheisha_on
    #
    #

    # Specialized tracking studies
    #
    # 62.4 GeV minbias copper low EM and keep tracks
    rcf1263 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.low_em_keep.y2005d.gheisha_on
    # Same as prev, distortion
    rcf1264 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.distort.y2005d.gheisha_on
    # Same as prev, distortion with clams
    rcf1265 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.clamdist.y2005d.gheisha_on
    # Same as prev, clams and two ladders offset
    rcf1266 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.clamlad.y2005d.gheisha_on
    # Individual ladder offsets
    rcf1267 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.indilad.dev2005.gheisha_on
    # Global ladder tilts
    rcf1268 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.ladtilt.dev2005.gheisha_on
    # Individual ladder tilts
    rcf1269 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.indtilt.dev2005.gheisha_on
    #
    #
    # Spin PWG requests:
    # Pythia Special 2 (CDF A) 45-55 GeV 5/09/06
    rcf1270 none pyth.dat pp200.pythia6_205.45_55gev.cdf_a.y2004y.gheisha_on
    # Pythia Special 2 (CDF A) 55-65 GeV 5/10/06
    rcf1271 none pyth.dat pp200.pythia6_205.55_65gev.cdf_a.y2004y.gheisha_on
    #
    rcf1272 cucu200/hijing_382/b0_14/minbias evgen.*.nt cucu200.hijing_382.b0_14.D0minbias.y2006.gheisha_on
    #
    # Pythia Special 2 (CDF A) 0-2 GeV 7/20/06
    rcf1273 none pyth.dat pp200.pythia6_205.0_2gev.cdf_a.y2004y.gheisha_on
    # UPGR02 eta+-1.5
    rcf1274 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr02.gheisha_on
    #
    # Pythia min bias 7/27/06
    rcf1275 none pyth.dat pp200.pythia6_205.minbias.cdf_a.y2006.gheisha_on
    #
    # UPGR05
    rcf1276 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr05.gheisha_on
    #
    # UPGR05 wide diamond (60,300)
    rcf1277 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.wide.upgr05.gheisha_on
    # UPGR07 wide diamond (60,300)
    rcf1278 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.wide.upgr07.gheisha_on
    # UPGR07
    rcf1279 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr07.gheisha_on
    # UPGR01
    rcf1280 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr01.gheisha_on
    # UPGR08
    rcf1281 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr08.gheisha_on
    # UPGR06
    rcf1282 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr06.gheisha_on
    # UPGR09
    rcf1283 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr09.gheisha_on
    # UPGR09 central
    rcf1284 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.upgr09.gheisha_on
    # UPGR10
    rcf1285 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr10.gheisha_on
    # UPGR10 central
    rcf1286 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.upgr10.gheisha_on
    # UPGR11
    rcf1287 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr11.gheisha_on
    # UPGR11 central
    rcf1288 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.upgr11.gheisha_on
    # UPGR06 central
    rcf1289 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.upgr06.gheisha_on
    # UPGR07
    rcf1290 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.upgr07.gheisha_on

    Here is the actual version of the file used in the 2007 runs:

    e w en b jq geom
    # UPGR07 01/17/2007 ISUB = 11, 12, 13, 28, 53, 68
    rcf1291 none pyth.dat pp200.pythia6_205.special.diamond10.upgr07.gheisha_on
    # UPGR07 01/17/2007 ISUB = 11, 12, 13, 28, 53, 68
    rcf1292 none pyth.dat pp500.pythia6_205.special.diamond10.upgr07.gheisha_on
    # UPGR07 01/17/2007 ISUB = 11, 12, 13, 28, 53, 68
    rcf1293 none pyth.dat pp200.pythia6_205.special.diamond30.upgr07.gheisha_on
    # UPGR07 01/17/2007 ISUB = 11, 12, 13, 28, 53, 68
    rcf1294 none pyth.dat pp500.pythia6_205.special.diamond30.upgr07.gheisha_on
    # Min bias gold, pilot run for 2007
    rcf1295 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.y2007.gheisha_on
    # Central auau200 + B-mixing Central auau200 + Upsilon (S1,S2,S3) mixing
    rcf1296 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.y2007.gheisha_on
    # Minbias for TUP (wide vertex)
    rcf1297 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr13.gheisha_on
    #
    #
    # Central auau200 + D0-mixing, UPGR13
    rcf1298 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.upgr13.gheisha_on
    # Min bias Pythia
    rcf1299 none pyth.dat pp200.pythia6_205.minbias.cdf_a.y2005.gheisha_on
    # Pythia, UPGR13
    rcf1300 none pyth.dat pp200.pythia6_205.charm.cdf_a.upgr13.gheisha_on
    # Pythia wide diamond
    rcf1301 none pyth.dat pp200.pythia6_205.minbias.wide.upgr13.gheisha_on
    # Pythia
    rcf1302 none pyth.dat pp200.pythia6_410.45_55gev.cdf_a.y2006c.gheisha_on
    # Pythia
    rcf1303 none pyth.dat pp200.pythia6_410.35_45gev.cdf_a.y2006c.gheisha_on
    # Pythia
    rcf1304 none pyth.dat pp200.pythia6_410.55_65gev.cdf_a.y2006c.gheisha_on
    # Placeholder XXXXXXXXXXX
    rcf1305 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.y2007.gheisha_on
    # Pythia
    rcf1306 none pyth.dat pp200.pythia6_410.25_35gev.cdf_a.y2006c.gheisha_on
    # Pythia
    rcf1307 none pyth.dat pp200.pythia6_410.15_25gev.cdf_a.y2006c.gheisha_on
    # Pythia
    rcf1308 none pyth.dat pp200.pythia6_410.11_15gev.cdf_a.y2006c.gheisha_on
    # Pythia
    rcf1309 none pyth.dat pp200.pythia6_410.9_11gev.cdf_a.y2006c.gheisha_on
    # Pythia
    rcf1310 none pyth.dat pp200.pythia6_410.7_9gev.cdf_a.y2006c.gheisha_on
    # Pythia
    rcf1311 none pyth.dat pp200.pythia6_410.5_7gev.cdf_a.y2006c.gheisha_on
    # Pythia CKIN(3)=7, CKIN(4)=9, CKIN(7)=0.0, CKIN(8)=1.0, CKIN(27)=-0.4, CKIN(28)=0.4
    rcf1312 none pyth.dat pp200.pythia6_410.7_9gev.bin1.y2004y.gheisha_on
    # Pythia CKIN(3)=9, CKIN(4)=11, CKIN(7)=-0.4, CKIN(8)=1.4, CKIN(27)=-0.5, CKIN(28)=0.6
    rcf1313 none pyth.dat pp200.pythia6_410.9_11gev.bin2.y2004y.gheisha_on
    # Pythia CKIN(3)=11, CKIN(4)=15, CKIN(7)=-0.2, CKIN(8)=1.2, CKIN(27)=-0.6, CKIN(28)=-0.3
    rcf1314 none pyth.dat pp200.pythia6_410.11_15gev.bin3.y2004y.gheisha_on
    # Pythia CKIN(3)=11, CKIN(4)=15, CKIN(7)=-0.5, CKIN(8)=1.5, CKIN(27)=-0.3, CKIN(28)=0.4
    rcf1315 none pyth.dat pp200.pythia6_410.11_15gev.bin4.y2004y.gheisha_on
    # Pythia CKIN(3)=11, CKIN(4)=15, CKIN(7)=0.0, CKIN(8)=1.0, CKIN(27)=0.4, CKIN(28)=0.7
    rcf1316 none pyth.dat pp200.pythia6_410.11_15gev.bin5.y2004y.gheisha_on
    # Pythia
    rcf1317 none pyth.dat pp200.pythia6_410.4_5gev.cdf_a.y2006c.gheisha_on
    # Pythia
    rcf1318 none pyth.dat pp200.pythia6_410.3_4gev.cdf_a.y2006c.gheisha_on
    # Pythia
    rcf1319 none pyth.dat pp200.pythia6_410.minbias.cdf_a.y2006c.gheisha_on
    # Pythia
    rcf1320 none pyth.dat pp62.pythia6_410.4_5gev.cdf_a.y2006c.gheisha_on
    # Pythia
    rcf1321 none pyth.dat pp62.pythia6_410.3_4gev.cdf_a.y2006c.gheisha_on
    # Pythia
    rcf1322 none pyth.dat pp62.pythia6_410.5_7gev.cdf_a.y2006c.gheisha_on
    # Pythia
    rcf1323 none pyth.dat pp62.pythia6_410.7_9gev.cdf_a.y2006c.gheisha_on
    # Pythia
    rcf1324 none pyth.dat pp62.pythia6_410.9_11gev.cdf_a.y2006c.gheisha_on
    # Pythia
    rcf1325 none pyth.dat pp62.pythia6_410.11_15gev.cdf_a.y2006c.gheisha_on
    # Pythia Special 2 (CDF A) 2-3 GeV 6/29/05
    pds1231 none pyth.dat pp200.pythia6_205.2_3gev.cdf_a.y2004y.gheisha_on

    When submitting jobs on the Grid, most of the functionality in alljobs is redundant. The simplified scripts can be found in the "Grid-Friendly" section of this site.

    Merger/filtering script

Typically, a Starsim run results in an output which is a file, or a series of files, with names like gstar.1.fz, gstar.2.fz, etc. Regardless of whether we run locally or on the Grid, there is a small chance that the file(s) will be truncated. To guard against feeding incorrect data to the reconstruction stage, and/or to perform a split or merger of a few files, a KUMAC script has been developed. It will, among other things, discard incomplete events and produce serially numbered files with names like rcf1319_01_100evts.fzd, which encode the name of the dataset, the serial number of the file (distinct from the numbering of the input files), and the number of events contained therein, all of which is helpful in setting up or debugging the production. It has recently been simplified (although it is still not easily readable) and wrapped into a utility shell script, which does preparation work as well as cleanup. The resulting script, named "filter.tcsh", takes a single argument which is assumed to be the name of the dataset (and which is then used in naming the output files).

    #! /usr/local/bin/tcsh -f
    #
    # remove the old list of files
    if( -e process.list ) then
    rm process.list
    endif
    #
    if( -e filter.kumac ) then
    rm filter.kumac
    endif
    ls gstar.*.fz | sed -e 's/[gstar.|.fz]//g' | sort -n > process.list
    #
    # clean the trash bin before the next run, re-create
    rm -fr trash
    mkdir trash
    echo `du --block-size=1000K -s | cut -f1` MB in the current directory
    echo `df --block-size=1000K . | tail -1 | sed -e 's/\ *[0-9]*\ *[0-9]*\ *//' | sed -e 's/\ .*//g'` MB available on disk
    cat<<EOF>>filter.kumac
    macro filter name
    input='gstar'
    mess Start with filenames [input].*.fz, converting to [name]
    ag/version batch
    option stat
    option date
    option nbox
    filecase keep
    pwd =\$shell('pwd');
    nfiles=\$shell('cat process.list | wc -l | sed -e "s/[\ ]*//g"');

    message Starting to process [nfiles]
    * trace on
    ve/cr runs([nfiles]) I
    ve/read runs process.list
    ve/pri runs

    if (\$Len([name]).eq.0) then
    message cannot define current directory in [pwd]
    exit
    endif
    namz=[name]
    out =\$env('OUTDIR')
    if ([out].ne.'') then
    namz = [out]/[name]/[name]
    endif

    lenb = 1000
    message reading
    ve/cr id(3) I
    * ve/read id N
    message reading complete
    nt=[nfiles] | total number of files to process
    n1=runs(1) | first input file
    n2=runs([nfiles]) | last input file
    mm = 0 | number of output files
    nn = 0 | number of processed files
    cnt = 0 | total number of events in this job
    cno = 0 | number of events when output has been opened
    nev = 0 | number of events in this output
    ii = 0 | input active flag
    io = 0 | output active flag
    len0= 1200 | minimum output file len
    len1= [len0]+200 | average output file len - stop at end-of-file
    len2= [len1]+200 | maximum output file len - stop always
    ni = [n1] | first input file
    no = 0 | skip up to this file
    nd = [n1] | file to delete
    ntrig = 10
    *
    if (\$fexist(nn).gt.0) then
    ve/read id nn
    na=id(1); message [na] input files already done
    no=id(2); message first input files up to gstar.[no]
    mm=id(3); message first output files up to [name].[mm]
    mm=[mm]-1;
    endif
    *
    hist = [name].his
    if (\$fexist([hist]).gt.0) then
    shell mv [hist] old.his
    * call HRGET(0,\$quote([hist]),' ')
    endif
    ghist [hist]
    cdir //pawc
    mdir cont
    if (\$fexist(old.his).gt.0) then
    call HRGET(0,\$quote(old.his),' ')
    endif

    gfile p gstar.[n1].fz
    mode control prin 1 hist 0 | simu 2
    gexec ../.lib/control.sl
    gexec ../.lib/index.sl

    message loaded libs

    title=merging runs [n1]-[n2] in [name]
    fort/file 66 [name].ps; meta 66 -111
    next; dcut cave x .1 10 10 .03 .03
    Set DMOD 1; Igset TXFP -60; Igset CHHE .35
    ITX 5 19.5 \$quote([title])
    ITX .5 .1 \$quote([pwd])
    *
    * do ni = [ni],[n2]
    frst=1
    ag/version interactive
    do iev=1,1000000000000
    * new input file ?
    if ([ii].eq.0) then
    do nfoo=[frst],[nfiles]
    ni = runs([nfoo])

    file = [input].[ni].fz
    filz = [input].[ni].fz.gz
    hist = [input].[ni].his
    message processing index [nfoo] out of [nfiles]
    ve/print runs([nfoo])
    *
    if (\$fexist([file]).gt.0) then
    message loop with [file]
    gfile p [file]
    if (\$iquest(1).eq.0) then
    ii = 1
    nn = [nn]+1
    if (\$fexist([hist]).gt.0) then
    if (\$hexist(-1).eq.0) then
    call HRGET(0,\$quote([hist]),' ')
    else
    call HRGET(0,\$quote([hist]),'A')
    endif
    endif
    call indmes(\$quote([file]))
    goto nextf
    * iquest:
    endif
    * fexist:
    endif
    enddo
    goto nexto
    endif

    nextf:
    * new output file ?
    if ([io].eq.0) then
    mm = [mm]+1
    if ([mm].lt.10) then
    output=[namz]_0[mm]
    else
    output=[namz]_[mm]
    endif
    io = 1
    cno = [cnt]
    gfile o [output].fzt
    iname = [name]_[mm].fzt
    call indmes(\$quote([iname]))
    endif

    * processing next event
    call rzcdir('//SLUGRZ',' ')
    trig [ntrig]
    evt = \$iquest(99)

    if (\$iquest(1).ne.0) then
    ni = [ni]+1
    frst=[frst]+1
    ii = 0
    endif
    if ([ii].eq.0) goto nexto
    * get output file length in MB:
    cmd = ls -s [output].fzt
    len = \$word(\$shell([cmd]))
    len = [len]/[lenb]
    * mess wrquest len=[len] ii=[ii] evt=[evt]
    if ([len].lt.[len0]) goto nextev
    if ([len].lt.[len1] .and. [ii].gt.0) goto nextev
    if ([len].lt.[len2] .and. [ii].gt.0 .and. [evt].eq.0) goto nextev
    * output file done
    nexto:
    cnt = \$iquest(100)
    if ([cnt]<0) then
    cnt = 0
    endif
    nev = [cnt]-[cno]
    io = 0
    *
    if ([nev].gt.0) then
    if ([nev].lt.199999) then
    * terminate last event, clear memory
    call guout
    call gtrigc
    gfile o
    * rename temp file into the final one:
    cmv = mv [output].fzt [output]_[nev]evts.fzd
    i = \$shell([cmv])
    endif
    endif
    message files inp = [ni] out = [mm] cnt = [cnt] done
    *
    if ([ii].eq.0) then
    nj = [ni] - 1 | this file was finished, ni is NEXT to read
    mj = [mm] + 1 | this is next to start write after the BP
    message writing breakpoint [nn] [ni] [mj]
    ve/inp id [nn] [ni] [mj]
    ve/write id nn i6
    ntrig = 10
    ************************************
    * moving files to TRASH
    while ([nd].lt.[ni]) do
    filed = [input].[nd].fz
    alrun = *.[nd].*
    if (\$fexist([filed]).gt.0) then
    shell mv [alrun] trash/
    endif
    nd = [nd] + 1
    endwhile
    ************************************
    else
    ntrig = [ntrig] + 1
    endif
    if ([ni].gt.[n2]) goto alldone
    nextev:
    enddo

    * control histogram
    alldone:
    if ([nn].eq.[nt]) then
    shell touch filter.done
    endif
    cdir //pawc
    tit = files [n1] - [n2] in set [name]
    title_global \$quote([tit])
    next; size 20.5 26; zone 2 4;
    hi/pl 11; hi/pl 12; hi/pl 13; hi/pl 14
    if (\$hexist(1).gt.1) then
    n/pl 1.ntrack; n/pl 1.Nvertx; n/pl 1.NtpcHit; n/pl 1.Ntr10
    endif
    swn 111 0 20 0 20; selnt 111
    ITX 2.0 0.1 \$quote([pwd])
    close 66; meta 0
    physi
    exit
    return
    EOF
    echo ------------------------------------------------------------------
    echo Activating starsim for dataset $1
    $STAR_BIN/starsim -w 1 -g 40 -b ./filter.kumac $1
    # cleanup
    rm ZEBRA.O process.list nn index paw.metafile *.his *.ps filter.done filter.kumac

    Running STARSIM within root4star

New event generators are now being implemented in a C++ framework, enabling us to run simulations within the standard ROOT-based STAR production chain.  Running these generators requires us to migrate away from the familiar starsim interface and begin running simulations in root4star.  Several example macros have been implemented, showing how to run various event generators and how to produce samples of specific particles.  This HOWTO guide illustrates one of these examples.

    First, obtain the example macro by checking it out from cvs:
    $ cvs co StRoot/StarGenerator/macros
    $ cp StRoot/StarGenerator/macros/starsim.kinematics.C .
    Running the macro is straightforward.  To generate 100 events, simply do...
    $ root4star
    root [0] .L starsim.kinematics.C
    root [1] int nevents = 100
    root [2] starsim(nevents)
    
    This will create an "fzd" file, which can be analyzed with the bfc.C macro as you normally would.
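For instance, the reconstruction step could look like the sketch below.  The chain options are the ones discussed in the Quick Start section later on this page; the event count and the file name are just placeholders for whatever your generator job produced, and the geometry tag should match the one you simulated:

    $ root4star
    root [0] .L bfc.C
    root [1] bfc( 100, "ry2012 agml usexgeom fzin TpcFastSim sti ittf cmudst", "pythia6.starsim.fzd" )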

If you're happy with 9 muons per event, thrown with a funky pT and eta distribution, run with the y2012 geometry... then you can use the macro as is.  Otherwise, you'll want to modify things to suit your needs.  Open the macro in your favorite editor (e.g. emacs or vi).  In the "starsim" function, somewhere around line 108, you should see the following lines:
      geometry("y2012");
      command("gkine -4 0");
      command("gfile o pythia6.starsim.fzd");


If you're familiar with the starsim interface, you probably recognize the arguments to the command function.  These are KUIP commands used to steer the starsim application.  You can use the gfile command to set the name of the output file, for example.  The "gkine -4 0" command tells starsim how it should get the particles from the event generator (this shouldn't be changed).  Finally, the geometry function defined in the macro allows you to set the geometry tag you wish to simulate.  It is the equivalent of the "DETP geom" command in starsim, so you may also pass the magnetic field setting, switch hadronic interactions on or off, etc.  Any command which can be executed in starsim can be executed using the "command" function, giving you full control of the physics model, the ability to print out hits, materials, and so on.
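As a concrete sketch (the output file name here is arbitrary, and "rndm" is the same random-seed command used in the KUMAC examples elsewhere on this page):

      geometry("y2012");                   // geometry tag to simulate (equivalent to DETP geom y2012)
      command("gfile o single.muon.fzd");  // rename the output zebra file
      command("rndm 13 17");               // seed the random number generator
      command("gkine -4 0");               // take particles from the event generator (leave as is)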

Let's take a quick look at the "KINE" event generator and how to configure it.  StarKinematics is a special event generator, allowing us to inject particles into the simulation on an event-by-event basis during a simulation run.  The "trig" function in this macro loops over a requested number of events and pushes particles.  Let's take a look at this function.
     

    void trig( Int_t n=1 )
    {
      for ( Int_t i=0; i<n; i++ ) {
    
        // Clear the chain from the previous event
        chain->Clear();
    
        // Generate 1 muon in the FGT range
        kinematics->Kine( 1, "mu-", 10.0, 50.0, 1.0, 2.0 );
    
        // Generate 4 muons flat in pT and eta 
        kinematics->Kine(4, "mu+", 0., 5., -0.8, +0.8 );
    
        // Generate 4 muons according to a PT and ETA distribution
        kinematics->Dist(4, "mu-", ptDist, etaDist );
    
        // Generate the event
        chain->Make();
    
        // Print the event
        primary->event()->Print();
      }
    }

Here "kinematics" is a pointer to a StarKinematics object. There are three functions of interest to us:

    • Kine -- Throws N particles of specified type flat in pT, eta and phi
    • Dist -- Throws N particles of specified type according to a pT and eta distribution (and optionally phi distribution) defined in a TF1.
• AddParticle -- Creates a new particle and returns a pointer to it.  You're responsible for setting the identity and kinematics (px, py, pz, etc...) of the particle.
In the example macro, we generate a single muon thrown flat in pT from 10 to 50 GeV and 1 < eta < 2.  We add to that 4 muons thrown flat in 0 < pT < 5 GeV and |eta| < 0.8, and 4 more muons according to pT and eta distributions defined elsewhere in the code (see the sketch below).  After calling "Make" on the big full chain, we print out the resulting event.
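The ptDist and etaDist arguments passed to Dist are ROOT TF1 objects.  A minimal sketch follows; the functional forms here are ad hoc illustrations, not the ones used in the actual macro:

      // Hypothetical pT shape: rises, then falls exponentially, for 0 < pT < 10 GeV
      TF1 *ptDist  = new TF1("ptDist",  "x*exp(-x/0.5)", 0.0, 10.0);
      // Flat pseudorapidity distribution over |eta| < 1
      TF1 *etaDist = new TF1("etaDist", "1.0", -1.0, 1.0);
      // Throw 4 mu- sampled from these distributions
      kinematics->Dist( 4, "mu-", ptDist, etaDist );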

    Example event record --
    [   0|   0|  -1] id=         0    Rootino stat=-201 p=(   0.000,   0.000,   0.000,   0.000;  510.000) v=(  0.0000,  0.0000,   0.000) [0 0] [0 0]
    [   1|   1|   1] id=        13        mu- stat=  01 p=(  36.421,  -7.940,  53.950,  65.576;    0.106) v=(  0.0181, -0.0905,  26.381) [0 0] [0 0]
    [   2|   2|   2] id=       -13        mu+ stat=  01 p=(  -2.836,   3.258,   0.225,   4.326;    0.106) v=(  0.0181, -0.0905,  26.381) [0 0] [0 0]
    [   3|   3|   3] id=       -13        mu+ stat=  01 p=(  -1.159,  -4.437,  -2.044,   5.022;    0.106) v=(  0.0181, -0.0905,  26.381) [0 0] [0 0]
    [   4|   4|   4] id=       -13        mu+ stat=  01 p=(  -0.091,   1.695,  -0.131,   1.706;    0.106) v=(  0.0181, -0.0905,  26.381) [0 0] [0 0]
    [   5|   5|   5] id=       -13        mu+ stat=  01 p=(   1.844,  -0.444,   0.345,   1.931;    0.106) v=(  0.0181, -0.0905,  26.381) [0 0] [0 0]
    [   6|   6|   6] id=        13        mu- stat=  01 p=(   4.228,  -4.467,  -3.474,   7.065;    0.106) v=(  0.0181, -0.0905,  26.381) [0 0] [0 0]
    [   7|   7|   7] id=        13        mu- stat=  01 p=(  -0.432,  -0.657,   0.611,   1.002;    0.106) v=(  0.0181, -0.0905,  26.381) [0 0] [0 0]
    [   8|   8|   8] id=        13        mu- stat=  01 p=(  -0.633,  -0.295,  -0.017,   0.706;    0.106) v=(  0.0181, -0.0905,  26.381) [0 0] [0 0]
    [   9|   9|   9] id=        13        mu- stat=  01 p=(   2.767,   0.517,   1.126,   3.034;    0.106) v=(  0.0181, -0.0905,  26.381) [0 0] [0 0]

The printout above illustrates the STAR event record.  Each row denotes a particle in the simulation.  The 0th entry (and any entry with a status code of -201) is used to carry summary information about the configuration of the event generator.  Multiple event generators can be run in the same simulation, and a Rootino is introduced into the event record to summarize the configuration of each.  The three columns at the left hold the primary event id, the generator event id, and the idtruth id.  The next column shows the PDG id of the particle, followed by the particle's name.  The particle's status code is next, followed by the 4-momentum and mass, and the particle's start vertex.  Finally, the last four columns denote the primary ids of the 1st and last mother particle and the 1st and last daughter particle.

The STAR event record is saved in a ROOT file at the end of the simulation run, allowing you to read back both particle-wise and event-wise information stored from the event generator and compare with reconstructed events.  Here, the idtruth ID of the particle is useful, as it allows you to compare reconstructed tracks and hits with the particle which generated them.

    Simulation HOWTOS

STARSIM is the legacy simulation package in STAR, implemented in FORtran and MORtran and utilizing GEANT3 as the concrete MC package.  For documentation on how to run simulations using STARSIM, see the legacy documentation pages.

The simulation group is evolving the framework towards using the Virtual Monte Carlo.  As a first step, we have implemented a new event generator framework which will be compatible with the future VMC application.  The new framework allows us to run jobs within root4star.  To run simulations in the new framework, see:

    • Running STARSIM within root4star

    StMc package


    The STAR Simulation Framework

    Outline

    1. Introduction
    2. Quick Start
    3. Primary Event Generation
    4. Geometry Description
    5. Particle Transport and Detector Simulation
    6. Digitization / Slow Simulation
    7. The Truth about Event Records
    8. Reconstruction

    Introduction


    STAR's simulation infrastructure is built around the GEANT 3 simulation package.  It is implemented in FORtran, augmented by the MORtran-based AgSTAR preprocessor language, with a ROOT and C++ interface which allows it to be integrated with our standard "big full chain" software.  The legacy documentation pages describe how to run starsim as a standalone application.  The purpose of this document is to describe how to run simulations using the modern C++ interface, and to serve as a starting point for extending the simulation infrastructure with additional geometry modules, event generators, and slow simulators.

    This document assumes familiarity with:
    1. C++
    2. ROOT
    3. The STAR Framework

    Quick Start

    1. Your First Simulation Job
    2. Reconstructing the Events
    3. idTruth and qaTruth
    4. Taking it Further
Running your analysis code over Monte Carlo data samples is generally a three-step process.  First, you'll need to generate your Monte Carlo events by running the simulation package.  Then you'll pass these events through the reconstruction software, which converts MC hits to "real" ones, performs track reconstruction, matches tracks to TOF/MTD/etc., and creates calorimeter hits.  Finally, you will want to run your analysis code on the output, possibly even looking at the Monte Carlo truth tables in order to better understand the efficiency, purity and resolution of your reconstruction code.

    A. Your First Simulation Job


We're going to start with a simple example: starsim.kinematics.C.  This is an example macro which runs under root4star and generates a sample of muons and neutral pions.  Start by checking the code out from CVS.
    $ cvs co StRoot/StarGenerator/macros/starsim.kinematics.C                 # check out the macro
    $ ln -s StRoot/StarGenerator/macros/starsim.kinematics.C starsim.C        # create a link named starsim.C
    $ root4star starsim.C                                                     # run the code using STAR's version of ROOT
    
You're going to see a lot of output here, but in the end you'll have two output files: starsim.kinematics.root and starsim.kinematics.fzd
    $ ls
    starsim.kinematics.fzd
    starsim.kinematics.root
    starsim.C
    StRoot/

    These two files are the so called "zebra" file (.fzd), containing the Monte Carlo hits, geometry and other associated event information, and the event generator record (.root), containing a ROOT TTree which saves all of the particles generated by the primary event generator. 

    B. Reconstructing the Events


Once we have the output files, it's time to run them through the reconstruction chain.  STAR's reconstruction code is steered using the "big full chain" macro bfc.C.  For most jobs you'll want to provide BFC with three arguments: the number of events to process, the set of chain options to run, and an input file.  For more complicated tasks you're encouraged to ask questions on the STAR software list.

Below you'll find an example macro which runs the big full chain.  We're going to run a limited set of STAR software to begin with.  Start by looking at line 12.  This is where the big full chain is called.  As noted above, it takes three arguments.  The first is the number of events to process... coincidentally, 10 is the number of events which starsim.kinematics.C produces by default.  Then it takes two strings as input.  The first is the set of chain options we want to run.  It's a long list, so I've broken things down across several lines.

Line 5 specifies the geometry tag to use.  Generally the format is the letter "r" followed by a valid STAR Geometry in simulation & reconstruction.
Line 6 specifies the geometry model to use.  Generally you need to specify both "agml" and "UseXGeom" here.  (More on this later.)  BTW, did you notice that capitalization is ignored?
Line 7 tells the big full chain that the input will be a zebra file.  This is our standard for now.
Line 8 sets up the TPC simulator.  We perform our digitization of the geant hits in the reconstruction chain.  This is where the TPC hits are converted to the ADC values used as input to the reconstruction codes.
Line 9 sets up the track finding and reconstruction codes.  We're using "sti" and "ittf" here.
Line 10 creates the micro DST, which most STAR analyses use.
Line 11 saves the "geant.root" file, which preserves the event record associating MC hits with generated tracks.  This will allow us to (much later) associate the reconstructed tracks with the true Monte Carlo particles from which they came.
    $ emacs runBfc.C                    # Feel free to use your favorite editor instead of emacs
    0001 | void runBfc() {
    0002 |   gROOT->LoadMacro("bfc.C");                  // Load in BFC macro
    0003 |   TString _file = "kinematics.starsim.fzd";   // This is our input file
    0004 |   TString _chain;                             // We'll build this up
    0005 |   _chain += "ry2012a ";                       // Start by specifying the geometry tag (note the trailing space...)
    0006 |   _chain += "AgML USExgeom ";                 // Tells BFC which geometry package to use.  When in doubt, use agml.
    0007 |   _chain += "fzin ";                          // Tells BFC that we'll be reading in a zebra file.
    0008 |   _chain += "TpcFastSim ";                    // Runs TPC fast simulation
    0009 |   _chain += "sti ittf ";                      // Runs track finding and reconstruction using the "sti" tracker
    0010 |   _chain += "cmudst ";                        // Creates the MuDst file for output
    0011 |   _chain += "geantout ";                      // Saves the "geant.root" file
    0012 |   bfc(10, _chain, _file );                    // Runs the simulation chain
    0013 | }
ctrl-x ctrl-s ctrl-x ctrl-c          # i.e. save and quit
    $ root4star runBfc.C                 # run the reconstruction job
    $ ls -l
    ...
    
    

    If all has gone well, you now have several files in your directory including the MuDst which you'll use in your analysis.

    $ ls -1 *.root
    kinematics.geant.root
    kinematics.hist.root
    kinematics.MuDst.root
    kinematics.runco.root
    kinematics.starsim.root
    


    C. idTruth and qaTruth

    During the first phase of the simulation job we had full access to the state of the simulated particles at every step as they propagated through the STAR detector.  As particles propagate through active layers, the simulation package can register "hits" in those sensitive layers.  These hits tell us how much energy was deposited, in which layer and at what location.  They also save the association between the particle which deposited the energy and the resulting hit.  This association is saved as the "idTruth" of the hit.  It corresponds to the unique id (primary key) assigned to the particle by the simulation package.  This idTruth value is exceedingly useful, as it allows us to compare important information between reconstructed objects and the particles which are responsible for them.

Global and primary tracks contain two truth variables: idTruth and qaTruth.  idTruth tells us which Monte Carlo track was the dominant contributor (i.e. provided the most TPC hits) to the track, while qaTruth tells us the percentage of hits which that particle provided.  With idTruth you can look up the corresponding Monte Carlo track in the StMuMcTrack branch of the MuDst.  In the event that idTruth is zero, no MC particle was responsible for the hits on the track.
    With the MC track, you can compare the thrown and reconstructed kinematics of the track (pT, eta, phi, etc...).

The primary vertex also contains an idTruth, which can be used to access the Monte Carlo vertex to which it corresponds in the StMuMcVertex branch of the MuDst.
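Schematically, the lookup in your analysis code looks like the sketch below.  The idTruth() and qaTruth() accessors are the track-level truth variables described above; how you index into the StMuMcTrack branch depends on the MuDst interface, so treat the commented loop body as a placeholder rather than the actual API:

      // Sketch: match a reconstructed track to its dominant MC track via idTruth.
      int idTruth = muTrack->idTruth();   // unique id of the dominant MC contributor
      int qaTruth = muTrack->qaTruth();   // percentage of TPC hits that track provided
      if ( idTruth > 0 ) {
        // Look up the MC track whose unique id equals idTruth in the StMuMcTrack
        // branch of the MuDst, then compare thrown vs reconstructed pT, eta, phi, ...
      }
      else {
        // idTruth == 0: no single MC particle was responsible for this track
      }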

    D. Taking it further

In starsim.kinematics.C we use the StarKinematics event generator, which allows you to push particles onto the simulation stack on an event-by-event basis.  You can throw them flat in phase space, or sample them from a pT and eta distribution.  These methods are illustrated in the macro, which throws muons and pions in the simulation.  You can modify this to suit your needs, throwing whatever particles you want according to your own distributions.  The list of available particles can be obtained from StarParticleData.
     

    $ root4star starsim.C\(0\)
    root [0] StarParticleData &data = StarParticleData::instance();
    root [1] data.GetParticles().Print()
    

    Additionally, you can define your own particles.  See starsim.addparticle.C.


    Primary Event Generation

    Geometry Definition

    Running the Simulation

    Event Reconstruction

    The Truth about Event Records

    StMc package

   The StMc package creates the StMcEvent structure and fills it with Monte Carlo information.  It then creates the StMiniMcEvent structure, which contains both Monte Carlo and reconstruction information, and provides the matching between the Monte Carlo and reconstruction info.  This allows the user to estimate the quality of the reconstruction and the reconstruction efficiency for different physics processes.  StMcEvent is actually redundant, and exists for historical reasons.
    StMc consists of:

    • StMcEvent - structure with MonteCarlo(Simu) information;
    • StMiniMcEvent - structure with both Simu & Reco info;
• StMcEventMaker - a maker which creates and fills StMcEvent with Simu info;

    Attic

    Archive of old Simulation pages.

    Event Generators, event mixing

    B0/B+ simulation and event mixing

    Decays

• weight 28% : B0 -> D- + (e+) + (nu)
• weight 72% : B0 -> D*(2010) + e + nu
  • D*(2010) -> (D0) + (pi+)   b.r. 69%
  • D*(2010) -> (D+) + (pi0)   b.r. 31% (neglect D* -> gamma)
• weight 25% : B+ -> (D0bar) + (e+) + nu
• weight 75% : B+ -> D*bar(2007) + (e+) + nu
  • D*(2007) -> D0 + (pi0)   b.r. 62%
  • D*(2007) -> D0 + (gamma)   b.r. 38%

    Hijing

    To use Hijing for simulation purposes, one must first run Hijing proper and generate event files, then feed these data to starsim to complete the GEANT part.

The Hijing event generator code and makefile can be found in the STAR code repository at the following location: $STAR/pams/gen/hijing_382.  Once built, the executable is named hijjet.x.  The input file is called hijev.inp and should be modified as per the user's needs.  When the executable is run multiple times in the same directory, a series of files will be produced with names like evgen.XXX.nt, where XXX is an integer.  The format of the file is a PAW Ntuple.  The starsim application is equipped to read that format, as explained below.  If a large number of events is needed, a request should be made to the STAR simulation leader or any member of the S&C.

Listed below is the KUMAC macro that can be used to run your own GEANT simulation with pre-fabricated Hijing events.  Unlike the Pythia simulation, events aren't generated on the fly but are read from an external file instead.  Look at the comments embedded in the code.  Additional notes:

    • don't forget to seed the random number generator if you'll be doing a series of runs
    • make sure you specify the correct geometry tag
    • specify a different output file for each run
    • the location of the input file (current directory) and the name (evgen.1.nt) are given as an example
    • you can browse the directory /star/simu/evgen to look at what input Hijing files are already available
    • the number of triggers at the bottom of the macro can be set to anything; just remember that the resulting files can be large and unwieldy if that number is too big. As a rule of thumb, we usually don't go over 500 events per file in production for min-bias AuAu, and 100 events for central AuAu
    gfile o my_hijing_file.fz
    detp geom y2006
    make geometry
    gclose all
    * define a beam with 100um transverse sigma and 60cm sigma in Z
    vsig  0.01  60.0
    * introduce a cut on eta to avoid having to handle massive showers caused by spectators
    gkine -1 0 0 100 -6.3 6.3 0 6.3 -30.0 30.0
    * load the gstar library
    gexec  $STAR_LIB/gstar.so
    * read the pre-generated Hijing events from the Ntuple file
    us/inp hijing evgen.1.nt
    * seed the random generator
    rndm 13 17
    * trigger - change to trigger the desired number of times
    trig 10
    

    Pythia

    Introduction

    There are two slightly different ways to run the Pythia event generator in the context of the Starsim application. In the original (old) design, the dynamic library apythia.so served both as an adapter and as a container for the standard Pythia library that would typically come with a CERNLIB distribution. The problem with this approach is, of course, that Pythia itself is not a static piece of software and receives periodic updates. It is therefore difficult or impossible to modify the apythia.so component without affecting, in one way or another, various analyses, as the consistency of the code is broken at some point.

    It is possible, however, to refactor the Pythia adaptor in such a way that the Pythia library proper can be loaded separately. This gives the user the ability to choose a specific version of the Pythia code to be run in their simulation. Different users, therefore, can use different versions of Pythia concurrently in Starsim, which is in everybody's interest. The wrapper thus modified was given the mnemonic name bpythia.so (it should be easy to memorize, since "b"-pythia follows "a"-pythia). We have also decided to freeze the Pythia version linked into apythia.so at 6.205, and to select subsequent versions via bpythia.so, as explained at the bottom of this page.

    In the following, we present both the "old way" of running Pythia (i.e. tied to a specific version) and the new one (whereby the version can be requested dynamically at run time).

    Using Pythia 6.205

    Listed below is the KUMAC macro that can be used to run your own Pythia simulation, specifically utilizing version 6.205 of the Pythia code and without the ability to switch versions. This would be fine for most STAR applications at the time of this writing (mid-2007). Please pay attention to the comments embedded in the code. Additional notes:
    • the script below explicitly refers to apythia.so, which contains Pythia 6.205
    • don't forget to seed the random number generator if you'll be doing a series of runs
    • make sure you specify the correct geometry tag
    • specify a different output file for each run
    • pay attention to the physics parameters used in the simulation; you will need to consult the Pythia manual for the meaning of those
    • the number of triggers at the bottom of the macro can be set to anything; just remember that the resulting files can be large and unwieldy if that number is too big. As a rule of thumb, we usually don't go over 5k events per file in production
    gfile o my_pythia_file.fz
    detp geom y2006
    make geometry
    gclose all
    * define a beam with 100um transverse sigma and 60cm sigma in Z
    vsig  0.01  60.0
    * Cut on eta (+-6.3) to avoid having to handle massive showers caused by the spectators
    * Cut on vertex Z (+-30 cm)
    gkine -1 0 0 100 -6.3 6.3 0 6.29 -30.0 30.0
    * load pythia
    gexec $STAR_LIB/apythia.so
    * specify parameters
    ENER 200.0     ! Collision energy
    MSEL 1         ! Collision type
    MSUB (11)=1    ! Subprocess choice
    MSUB (12)=1
    MSUB (13)=1
    MSUB (28)=1
    MSUB (53)=1
    MSUB (68)=1
    *
    * Make the following stable:
    *
    MDCY (102,1)=0  ! PI0 111
    MDCY (106,1)=0  ! PI+ 211
    *
    MDCY (109,1)=0  ! ETA 221
    *
    MDCY (116,1)=0  ! K+ 321
    *
    MDCY (112,1)=0  ! K_SHORT 310
    MDCY (105,1)=0  ! K_LONG 130
    *
    *
    MDCY (164,1)=0  ! LAMBDA0 3122
    *
    MDCY (167,1)=0  ! SIGMA0 3212
    MDCY (162,1)=0  ! SIGMA- 3112
    MDCY (169,1)=0  ! SIGMA+ 3222
    MDCY (172,1)=0  ! Xi- 3312
    MDCY (174,1)=0  ! Xi0 3322
    MDCY (176,1)=0  ! OMEGA- 3334
    * seed the random generator
    rndm 13 19
    * trigger - change to trigger the desired number of times
    trig 10
    

    Specifying the Pythia version dynamically

    In addition to the "frozen" version 6.205 which can be used as explained above, there is currently one more version that can be loaded, namely 6.410. Going forward, more versions will be added to the code base and to the collection of STAR libraries, as needed.

    In order to use version 6.410, the user simply needs to replace the following line in the above script
    gexec $STAR_LIB/apythia.so
    
    with the following two lines:
    gexec $STAR_LIB/libpythia_6410.so
    gexec $STAR_LIB/bpythia.so
    

    The Magnetic Monopole in STAR

    Introduction

    It is possible to simulate the production and propagation of magnetic monopoles in the STAR experiment, using a few modifications to the code base of GEANT 3.21, and in particular to our GEANT-derived application, starsim. Our work is based on a few papers, including:

    The flow of the GEANT code execution is illustrated by diagrams in the above publication.

    First Results

    As a demonstration of principle, we present here a few Starsim event display pictures. First, we propagate 12 magnetic monopoles of varying momenta in the STAR detector:

    Now, let's take a look at a minimum bias gold-gold event that contains a pair of magnetic monopoles:

    Salient features can already be seen in these graphics: large dE/dx losses and a characteristic limit on the maximum radius of the recorded monopole track (this is due to the fact that the trajectory of the monopole is not helix-like, but rather parabola-like). Now, let's take a look at the phi distribution of the hits, for central and peripheral gold-gold events containing monopoles:

    Again, the rather intuitive feature (large peaks in phi due to the very large dE/dx produced by the monopoles) is clearly borne out in the simulation.

    This is work in progress and this page is subject to updates.

    Grid-friendly Starsim production scripts

    Since the production activity of STAR is migrating to, and will eventually run mostly in, the Grid environment, the production scripts we use on a local or other "traditional" Unix farm facility must be modified (which often means simplified). Here is an example of a script we have successfully used to run a Pythia simulation on the Grid (utilizing the Fermilab facility), as well as on the SunGrid, with cosmetic modifications.

    A few things worth noting:

    • The bulk of the script has to do with establishing the Pythia settings that are often required in the simulations requested by the Spin PWG; the starsim part proper is located at the top and is uncomplicated; it involves dynamic loading of the necessary libraries, setting up the beam interaction diamond parameters, etc.
    • The script needs the contents of the tarball (listed at the bottom of the page) located in its working directory; this "payload" contains the Starsim executable as well as a few shared libraries and accessory scripts necessary for its function. To be able to run on the Grid, therefore, one needs to
      • Transfer the tarball and make provisions for extraction of the files
      • Transfer the script below and configure it for submission with a unique serial number (any integer, really)
    • The script takes only one argument, which is the serial number of the run. The rest of the run parameters are encoded in its body, which minimizes the chances of human error when submitting a large number of scripts, potentially for many different datasets
    • The random number generator is seeded with the serial run number and with the Unix process ID of the script on the target machine, which for all intents and purposes guarantees the uniqueness of a sequence in each run
    #!/usr/bin/ksh
    echo commencing the simulation
    export STAR=.
    echo STAR=$STAR
    #
    # run serial number: the only argument of the script
    run=$1
    geom=Y2006C
    ntrig=2000
    # beam diamond sigma in Z (cm) and vertex cut +-z (cm)
    diamond=60
    z=120
    # >> run.$run.log 2>&1
    node=`uname -n`
    echo run:$run geom:$geom ntrig:$ntrig diamond:$diamond z:$z node:$node pid:$$
    ./starsim -w 0 -g 40 -c trace on <<EOF
    trace on
    * record the run number and seed the random generator with the serial number and the process ID
    RUNG $run 1 $$
    RNDM $$ $run
    gfile o gstar.$run.fz
    detp geom $geom
    vsig 0.01 $diamond
    gexec $STAR/geometry.so
    gexec $STAR/libpythia_6410.so
    gexec $STAR/bpythia.so
    gclose all
    gkine -1 0 0 100 -6.3 6.3 0 6.28318 -$z $z
    ENER 200.0     ! collision energy
    MSEL 1         ! collision type
    * pT-hat range of the hard scattering, in GeV
    CKIN 3=4.0
    CKIN 4=5.0
    * PDF selection and underlying-event tuning parameters
    MSTP (51)=7
    MSTP (81)=1
    MSTP (82)=4
    PARP (82)=2.0
    PARP (83)=0.5
    PARP (84)=0.4
    PARP (85)=0.9
    PARP (86)=0.95
    PARP (89)=1800
    PARP (90)=0.25
    PARP (91)=1.0
    PARP (67)=4.0
    MDCY (102,1)=0 ! PI0 111
    MDCY (106,1)=0 ! PI+ 211
    MDCY (109,1)=0 ! ETA 221
    MDCY (116,1)=0 ! K+ 321
    MDCY (112,1)=0 ! K_SHORT 310
    MDCY (105,1)=0 ! K_LONG 130
    MDCY (164,1)=0 ! LAMBDA0 3122
    MDCY (167,1)=0 ! SIGMA0 3212
    MDCY (162,1)=0 ! SIGMA- 3112
    MDCY (169,1)=0 ! SIGMA+ 3222
    MDCY (172,1)=0 ! Xi- 3312
    MDCY (174,1)=0 ! Xi0 3322
    MDCY (176,1)=0 ! OMEGA- 3334
    trig $ntrig
    exit
    EOF

    The contents of the "payload" tarfile:

    143575 2007-05-31 18:02:47 agetof
    65743 2007-05-31 18:02:39 agetof.def
    44591 2007-05-31 19:05:34 bpythia.so
    5595692 2007-05-31 18:03:10 geometry.so
    183148 2007-05-31 18:03:15 gstar.so
    4170153 2007-05-31 19:05:27 libpythia_6410.so
    0 2007-05-31 18:00:06 StarDb/
    0 2007-05-31 18:00:59 StarDb/StMagF/
    51229 2007-05-31 18:00:57 StarDb/StMagF/bfield_full_negative_2D.dat
    2775652 2007-05-31 18:00:57 StarDb/StMagF/bfield_full_negative_3D.dat
    51227 2007-05-31 18:00:57 StarDb/StMagF/bfield_full_positive_2D.dat
    2775650 2007-05-31 18:00:58 StarDb/StMagF/bfield_full_positive_3D.dat
    51227 2007-05-31 18:00:58 StarDb/StMagF/bfield_half_positive_2D.dat
    2775650 2007-05-31 18:00:58 StarDb/StMagF/bfield_half_positive_3D.dat
    1530566 2007-05-31 18:00:59 StarDb/StMagF/boundary_13_efield.dat
    51231 2007-05-31 18:00:59 StarDb/StMagF/const_full_positive_2D.dat
    1585050 2007-05-31 18:00:59 StarDb/StMagF/endcap_efield.dat
    1530393 2007-05-31 18:00:59 StarDb/StMagF/membrane_efield.dat
    15663993 2007-05-31 18:03:31 starsim
    36600 2007-05-31 18:03:37 starsim.bank
    1848 2007-05-31 18:03:42 starsim.logon.kumac
    21551 2007-05-31 18:03:48 starsim.makefile

    Production overview

    As of spring 2007, the Monte Carlo production is being run on three different platforms:

    • the rcas farm
    • Open Science Grid
    • SunGrid

     

    Miscellaneous scripts


    VMC

    VMC C++ Classes

    StarVMC/StarVMCApplication:

    • StMCHitDescriptor
    • StarMCHits
      • Step
    • StarMCSimplePrimaryGenerator

     

    Example of setting the input file (here 0xaeab6f0 stands for the StVMCMaker pointer in the given session), as executed via StBFChain::ProcessLine:

    ((StVMCMaker *) 0xaeab6f0)->SetInputFile("/star/simu/simu/gstardata/evgenRoot/evgen.3.root");

    In general, StBFChain sets various attributes of the makers.
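
    If one has a pointer to the chain, the same thing can be done directly; in the sketch below, SetInputFile is as quoted above, while the maker name "StVMCMaker" passed to GetMaker is an assumption:

    // assuming bfc.C has built the chain and StVMCMaker is part of it
    StVMCMaker* vmcMaker = (StVMCMaker*) chain->GetMaker("StVMCMaker"); // name assumed
    if (vmcMaker) {
      vmcMaker->SetInputFile("/star/simu/simu/gstardata/evgenRoot/evgen.3.root");
    }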

     

    New chain options must be added in BigFullChain.h