Welcome to the STAR Embedding Pages!
Embedding data are used in STAR for studies of detector acceptance and reconstruction efficiency. In general, the efficiency depends on the running conditions, particle species, particle kinematics, and offline software version. In principle, each physics analysis needs to formulate its own embedding request providing all of this information. Within the STAR Collaboration, the embedding team processes these requests and produces the embedding data; you can find out how the embedding team works on the Embedding structure page.
Over the years, many embedding requests from the different PWGs have been processed by the embedding team; the full list is available in the STAR Simulations Requests interface. If you want to look at some of these data but do not know where they are stored, please go to this page. If you cannot find a similar existing request, you need to formulate your own embedding request for your particular study; please go to this page for more information on how to do so.
Please subscribe to the embedding mailing list if you are interested in embedding discussions: Starembd-l@lists.bnl.gov
And please join our weekly embedding meeting: https://drupal.star.bnl.gov/STAR/node/65992
Embedding data are produced for each embedding request in the STAR Simulations Request page.
Normally they are stored on the RCF NFS disks for a while so that end users can run their analyses.
However, NFS disk space is very limited and new requests arrive constantly, so the data are eventually moved
from disk to HPSS for permanent storage on tape; they can be restaged to disk for analysis later.
To find existing embedding data, either on disk or in HPSS, please follow the procedure below:
1) Find the request ID of the request you are interested in on the STAR Simulations Request page.
You can use the "Search" box at the top right of that page. Once you find the entry, look at the 'Request History' tab for more information; the original NFS data directory (where the data were first produced) can usually be found there.
2) Currently, the RCF NFS disks for embedding data are /star/data105, /star/embed and /star/data18.
To locate the data directories of a request, log on to RCF and check whether any of the following directories exist:
/star/data105/embedding/${Trigger set name}/${Embedded Particle}_${fSet}_${RequestID}/
/star/embed/embedding/${Trigger set name}/${Embedded Particle}_${fSet}_${RequestID}/
/star/data18/embedding/${Trigger set name}/${Embedded Particle}_${fSet}_${RequestID}/
3) If they exist, further check whether all the ${fSet} directories from 'fSet min' to 'fSet max' are there. If all of them exist,
you can start to use the data. If none of them, or only a fraction of the fSets, are there, write to the embedding list
and ask the Embedding Coordinator to restage the missing data from HPSS.
4) If you cannot find the embedding data in these directories, the data must be in HPSS. Unfortunately, there is no complete list of the data stored in HPSS yet. (For some data produced at RCF, Lidia maintains the list of embedding samples produced at RCF.) Please write to the embedding list, provide the request ID, particle name and file type (minimc.root, MuDst.root or event/geant.root), and ask the Embedding Coordinator to restage the data to NFS disk for you.
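The disk checks in steps 2) and 3) can be scripted. Below is a minimal sketch; the trigger set name, particle, request ID and fSet range are hypothetical placeholders that you should replace with the values from your request page.

```shell
#!/bin/sh
# List which fSets of a request are present under the embedding NFS disks.
# check_fsets prints one "found"/"missing" line per fSet directory.
check_fsets() {
  base=$1; trg=$2; particle=$3; reqid=$4; fmin=$5; fmax=$6
  fset=$fmin
  while [ "$fset" -le "$fmax" ]; do
    dir="$base/embedding/$trg/${particle}_${fset}_${reqid}"
    if [ -d "$dir" ]; then
      echo "found   $dir"
    else
      echo "missing $dir"
    fi
    fset=$((fset + 1))
  done
}

# Hypothetical request: JPsi, request ID 20100601, fSets 101-103.
for base in /star/data105 /star/embed /star/data18; do
  check_fsets "$base" ppProductionJPsi JPsi 20100601 101 103
done
```

If some fSets are reported missing on all three disks, write to the embedding list as described in step 3).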
The full list of STAR embedding requests (since August 2010):
http://drupal.star.bnl.gov/STAR/starsimrequest
The operation status of each request can be viewed in its page and history.
The information below (and in the sub-pages) is only valid for OLD STAR embedding requests.
Request status
- Currently pending requests
- Old Requests summary (circa Nov 2006)
New requests
Current requests were either submitted using the CGI web interface before it had to be removed (September 2007) or via email to the embedding-hn hypernews list.
As such, there is no single source of information. The Excel spreadsheet here is a summary of the known requests as of August 2008 (also a PDF printout for those without access to Excel). The coordinator should keep this updated.
The heavy flavour group has also kept an up-to-date page with their extensive list of requests. See here.
[The spreadsheet was part of a presentation (ppt|pdf) in August 2008 on the state of embedding at the end of my tenure as EC - Lee Barnby]
In the near future we hope to have a dedicated Drupal module available for entering embedding (and simulation) requests. This will have fields for all the required information and a workflow indicating the progress of the request. At the time of writing (August 2008) this is not available. A workable interim solution is to make requests using this page by selecting the 'Add Child Page' link. One should then write a page with the information for the request. Members of the embedding team and the requestors can then follow up using the 'Comment' facility on that page.
The following is required
This page is intended to provide details of embedding jobs currently in production.
ID | Date | Description | Status | Events | Notes |
1121704015 | Mon Jul 18 12:26:55 2005 | J/Psi Au+Au | Open | PWG Heavy | |
1127679226 | Fri Sep 16 20:34:17 2005 | Photon embedding for 62 GeV AuAu data for conversion analysis | Open | PWG High Pt | |
1126917257 | Sun Sep 25 16:13:46 2006 | AMPT full chain | Open | PWG EbyE | |
1130984157 | Wed Nov 2 21:15:57 2005 | pizero embedding in d+Au for EMCAL | Open | PWG HighPt | |
1138743134 | Tue Jan 31 16:32:14 2006 | dE/dx RUN4 @ high pT | Done | PWG Spectra | |
1139866250 | Mon Feb 13 16:30:50 2006 | Muon embedding RUN4 | Test Sample | PWG Spectra | |
1139868572 | Mon Feb 13 17:09:32 2006 | pi+/-, kaon+/-, proton/pbar embedding to RUN5 Cu+Cu 62 GeV data | Open | PWG Spectra | |
1144865002 | Wed Apr 12 14:03:22 2006 | Lambda embedding for spectra (proton feeddown) | Open | PWG Strangeness | |
1146151888 | Thu Apr 27 11:31:28 2006 | pi,K,p 200 GeV Cu+Cu | Open | PWG Spectra | |
1146152319 | Thu Apr 27 11:38:39 2006 | K* for 62 GeV Cu+Cu | Open | PWG Spectra | |
1146673520 | Wed May 3 12:25:20 2006 | K* for 200 GeV Cu+Cu | Done | PWG Spectra | |
1148574792 | Thu May 25 12:33:12 2006 | Anti-alpha in 200 GeV AuAu | Closed | PWG Spectra | |
1148586109 | Thu May 25 15:41:49 2006 | He3 in 200GeV AuAu | Test Sample | PWG Spectra | |
1148586313 | Thu May 25 15:45:13 2006 | Deuteron in 200GeV AuAu | Done | PWG Spectra | |
1154003633 | Thu Jul 27 08:33:53 2006 | J/Psi embedding for pp2006 | Open | PWG Heavy | |
1154003721 | Thu Jul 27 08:35:21 2006 | Upsilon embedding for pp2006 | Open | PWG Heavy | |
1154003879 | Thu Jul 27 08:37:59 2006 | electron embedding for Cu+Cu 2005 | Test Sample | PWG Heavy | |
1154003931 | Thu Jul 27 08:38:51 2006 | pi0 embedding for Cu+Cu 2005 for heavy flavor group | Open | PWG Heavy | |
1154003958 | Thu Jul 27 08:39:18 2006 | gamma embedding for Cu+Cu 2005 for heavy flavor group | Open | PWG Heavy | |
1154004033 | Thu Jul 27 08:40:33 2006 | electron embedding for p+p 2005 for heavy flavor group (e-h correlations) | Test Sample | PWG Heavy | |
1154004074 | Thu Jul 27 08:41:14 2006 | pi0 embedding for p+p 2005 for heavy flavor group (e-h correlations) | Test Sample | PWG Heavy | |
1154626301 | Thu Aug 3 13:31:41 2006 | AntiXi Cu+Cu (P06ib) | Done | PWG Strangeness | |
1154626418 | Thu Aug 3 13:33:38 2006 | Xi Au+Au (P05ic) | Done | PWG Strangeness | |
1154626430 | Thu Aug 3 13:33:50 2006 | Omega Au+Au (P05ic) | Done | PWG Strangeness | |
1156254135 | Tue Aug 22 09:42:15 2006 | Phi in pp for spin-alignment | Open | PWG Spectra | |
1163565625 | Tue Nov 14 23:40:25 2006 | muon CuCu 200 GeV | Open | PWG Spectra | |
1163627909 | Wed Nov 15 16:58:29 2006 | muon CuCu 62 GeV | Open | PWG Spectra | |
1163628205 | Wed Nov 15 17:03:25 2006 | phi CuCu 200 GeV | Test Sample | PWG Spectra | |
1163628539 | Wed Nov 15 17:08:59 2006 | K* pp 200 GeV (year 2005) | Open | PWG Spectra | |
1163628764 | Wed Nov 15 17:12:44 2006 | phi pp 200 GeV (year 2005) | Open | PWG Spectra |
Before submitting a new request, double-check the simulation request page to see whether an existing request/dataset can be used. If none exists, one needs to submit a new request. Please first read 'Appendix A' on the embedding structure page for the rules to follow.
Once the details of the embedding request have been thoroughly discussed within the PWG and approved by the PWG conveners, please have a PWG convener add a new request on the simulation request page and enter all of the details there.
Some key information must be provided for each embedding request. (If you cannot find one of the following items in the form, simply enter it in the 'Simulation request description' box.)
Detailed information of the real data sample (to be embedded into).
vertex cut, and vertex selection method. for example, "|Vertex_z|<30cm", "Vr<2cm", "vertex is constrained by VPD vertex, |Vz-Vz_{VPD}|<3cm", "PicoVtxMode:PicoVtxVpdOrDefault, TpcVpdVzDiffCut:6" or "default highest ranked TPC vertex".
Details for simulation and reconstruction.
Finally, please think carefully about the number of events! The computing resources (i.e., CPU cores and storage) are limited!
It is acceptable to modify the details of a request afterwards, although it is of great help if all of the above information is provided when the request is submitted, to avoid time wasted in communication. If a modification is unavoidable, please notify the Embedding Coordinator immediately, especially when the request is already open.
The weekly embedding meeting takes place on Tuesdays at 9 am US Eastern Time, to discuss all embedding-related topics.
The Zoom link can be found below:
Topic: STAR weekly embedding meeting
Time: Tuesday 09:00 AM Eastern Time (US and Canada)
Join ZoomGov Meeting
https://bnl.zoomgov.com/j/1606384145?pwd=cFZrSGtqVXZ2a3ZNQkd1WTQvU1o0UT09
Meeting ID: 160 638 4145
Passcode: 597036
Meetings in 2024:
Passcode: hYA2MW=E
Agenda:1) Status and planning of embedding production
Passcode: hL+B?#5@
Agenda:1) Status and planning of embedding production
Passcode: i*g7YY3C
Agenda:1) Status and planning of embedding production
Passcode: h6gZ^Znt
Agenda:1) Status and planning of embedding production
Passcode: %F*9B1#.
Agenda:1) Status and planning of embedding production
Passcode: RDNr&V%0
Agenda:1) Status and planning of embedding production
Passcode: 0SN6F&E@
Agenda:1) Status and planning of embedding production
Passcode: ^9bMy0Fs
Agenda:1) Status and planning of embedding production
Passcode: tpV4ww^9
Agenda:1) Status and planning of embedding production
Passcode: C*K#.$2U
Agenda:1) Status and planning of embedding production
Passcode: P=.kN*94 (status around 15min~20minute)
Agenda:1) Status and planning of embedding production
Passcode: y@2H0^xV
Agenda:1) Status and planning of embedding production
Passcode: t9aA.3VJ
Agenda:1) Status and planning of embedding production
Passcode: kD=&m2E!
Agenda:1) Status and planning of embedding production
Passcode: @V*Ho39G
Agenda:1) Status and planning of embedding production -
Passcode: 2!bipSh&
Agenda:1) Status and planning of embedding production
Passcode: ^Z5u*E.h
Agenda:1) Status and planning of embedding production -Xionghong Maowu, Xianglei
2) QA plot and discussion on e+e- embedding at 9.2 GeV, Zhen et al
3) Discussion on run12 jet substructure embedding set up, -Isaac, Youqi et al
Passcode: nF%2^bxU
Agenda:1) Status and planning of embedding production - Xianglei, Pavel, Xionghong, Maowu
Passcode: F!wh0DTF
Agenda:1) Status and planning of embedding production - Xianglei
Passcode: 9K*C!?KZ
Agenda:1) Status and planning of embedding production - Xianglei
Passcode: KP*K5vNE
Agenda:1) Status and planning of embedding production - Xianglei
Passcode: EAKp1*z%
1) Welcome Pavel Kisel to join the embedding team as deputy -Qinghua
2) Status and planning of embedding production - Xianglei
Passcode: ^$d54Gqm
1) Status of embedding production - Xianglei
2) Discussion of embedding planning -all
Passcode: bBQ=8@cS
1) Status of embedding production - Xianglei
Passcode: WwLTH@@7
1) Status of embedding production - Xianglei
2) Discussion of embedding plan for coming conferences -all
Passcode: +Z&Yn&A2
Agenda: 1) Status and planning of embedding production - Xianglei
Passcode: 5c+D0fVa
Agenda: 1) Status and planning of embedding production - Xianglei et al
Passcode: %&d2nRb$
Agenda: 1) Status of embedding production-Xianglei; 2) Remaining embedding request & priority list- Sooraj and all
Passcode: %rGQ8w5%
Agenda: 1) Status and planning of embedding production - Xianglei
2) an example of base QA for embedding helpers -Yi Fang
Passcode: R7U@0z*D
1) Status of embedding production - Xianglei, all
2) Planning of remaining requests - all
Passcode: 50+kgTW*
Agenda: 1) Status of embedding (17.3 GeV production completed. ) Xianglei
2) embedding production planning
Passcode: y3?P.tPW
Passcode: 9EF4Y$ei
Agenda: 1) Status of embedding (found a DB tracking issue from Dec. 14, temporary patch applied. 17.3 GeV tuning underway. )
2) embedding production planning
*********************************Meeting in 2023***************************
Recording: https://bnl.zoomgov.com/rec/share/PXEyMAM1wTqIoYNEQO__Ex9X0Zv2YVZfVIO-vFu-RtwFiLmoeXDAYO4yzDJgmFX_.ozgESVwcNJmoONsQ
Passcode: vcq*?*60
Recording: https://bnl.zoomgov.com/rec/share/LAhvI5YxeVqNmlvoU_F_kCwX9J_FKo0g20BAiK3R8mvv7nV30uMzdbDZ5gLYzBVY.F2I4pteWl6On0yRu
Passcode: Sbi46+=k
Agenda: 1) Status of embedding production - Xianglei (parameters tuned and testing sample for 11.5 GeV produced)
2) Discussion on the priority list of embedding request - Sooraj, Xianglei et al
Passcode: @3Q1yze!
Meeting Nov. 28, 2023,
Recording:https://bnl.zoomgov.com/rec/share/DUfRTTgmVNmFbZsJ6M5RMoYvpTQcf7gq_obTy9vpysPd8KY2GIltJ7FVKe05I46k.V9NNoP7AzwWSY7CG
Passcode: 90j9!kJ6
Agenda:
1) Status of embedding production -Xianglei
-summary: Testing sample available for 9.2 GeV (passes base QA); analyzers need to look at it before official production. Started with 11.5 GeV.
Passcode: jZP#xm61
Agenda:
1) Status of embedding production at 9.2 GeV Au+Au collision -Xianglei+all
- Brief summary: A few remaining issues: T0 offset tuning, iTPC gain tuning, transverse diffusion. A testing sample will be available in a few more days for QA.
2) Embedding planning, all
Passcode: Gz^g9*J3
ID | Task | % Complete | Duration | Start | Finish | Assigned people |
---|---|---|---|---|---|---|
1 | General QA consolidation | 28% | 109 days? | Thu 11/16/06 | Tue 4/17/07 | |
3 | Documentation | 25% | 37 days | Mon 12/18/06 | Tue 2/6/07 | |
4 | Port old QA documentation to Drupal, define hierarchy | 50% | 4 wks | Mon 12/18/06 | Fri 1/12/07 | Cristina Suarez[10%] |
5 | Add general documentation descriptive of the embedding purpose | 0% | 2 days | Mon 1/15/07 | Tue 1/16/07 | Lee Barnby[10%], Andrew Rose[10%] |
6 | Add documentation as per the embedding procedure, diverse embedding | 0% | 2 days | Mon 1/15/07 | Tue 1/16/07 | Lee Barnby[10%], Andrew Rose[10%] |
7 | Import PDSF documentation into Drupal | 0% | 1 wk | Wed 1/17/07 | Tue 1/23/07 | Andrew Rose[10%] |
8 | Review and adjust documentation | 0% | 1 wk | Wed 1/24/07 | Tue 1/30/07 | Olga Barranikova[10%], Andrew Rose[5%], Lee Barnby[5%] |
9 | Deliver documentation to collaboration for comments | 0% | 1 wk | Wed 1/31/07 | Tue 2/6/07 | STAR Collaboration[10%] |
10 | Drop all old documentation, adjust link (redirect) | 0% | 1 day | Wed 1/31/07 | Wed 1/31/07 | Andrew Rose[15%], Jerome Lauret[15%] |
12 | Line of authority, base conventions | 84% | 52 days | Thu 11/16/06 | Fri 1/26/07 | |
13 | Meeting with key personnel | 100% | 1 day | Thu 11/16/06 | Thu 11/16/06 | Jerome Lauret[10%], Olga Barranikova[10%], Andrew Rose[10%], Lee Barnby[10%] |
14 | Define responsibilities and scope of the diverse individuals in the embedding team | 100% | 1 mon | Mon 12/4/06 | Fri 12/29/06 | Jerome Lauret[15%], Olga Barranikova[6%] |
15 | Define file name convention, document final proposal | 50% | 2 wks | Mon 1/15/07 | Fri 1/26/07 | Jerome Lauret[6%], Lee Barnby[6%], Lidia Didenko[6%], Andrew Rose[6%] |
17 | Collaborative work | 45% | 60 days | Mon 1/22/07 | Fri 4/13/07 | |
18 | General cataloguing issues | 0% | 9 days | Mon 1/29/07 | Thu 2/8/07 | |
19 | Test Catalog registration, adjust as necessary | 0% | 4 days | Mon 1/29/07 | Thu 2/1/07 | Lidia Didenko[20%], Jerome Lauret[20%] |
20 | Extend Spider/Indexer to include embedding registration | 0% | 1 wk | Fri 2/2/07 | Thu 2/8/07 | Jerome Lauret[10%] |
21 | Bug tracking, mailing lists and other tools | 69% | 60 days | Mon 1/22/07 | Fri 4/13/07 | |
22 | Re-enable embedding list, establish focused communication at PWG level and user level | 75% | 3 mons | Mon 1/22/07 | Fri 4/13/07 | Jerome Lauret[10%] |
23 | Establish embedding RT system queue | 0% | 1 day | Tue 1/23/07 | Tue 1/23/07 | Jerome Lauret[5%] |
24 | Exercise embedding RT queue, adjust requirements | 0% | 4 days | Wed 1/24/07 | Mon 1/29/07 | Andrew Rose[10%] |
25 | Establish data transfer scheme to a BNL disk pool | 0% | 22 days | Mon 1/22/07 | Tue 2/20/07 | |
26 | Define requirements, general problems and issues | 0% | 1 wk | Mon 1/22/07 | Fri 1/26/07 | |
27 | Add data pool mechanism at BNL, transfer with any method | 0% | 1 wk | Mon 1/29/07 | Fri 2/2/07 | |
28 | Establish security scheme, HPSS auto-synching | 0% | 1 wk | Mon 2/5/07 | Fri 2/9/07 | |
29 | Test on one or more sites (non-PDSF) | 0% | 1 wk | Mon 2/12/07 | Fri 2/16/07 | |
30 | Integrate to all participating sites | 0% | 1 wk | Mon 2/12/07 | Fri 2/16/07 | |
31 | Document data transfer scheme and procedure | 0% | 2 days | Mon 2/19/07 | Tue 2/20/07 | |
33 | CVS check-in and cleanup | 4% | 17 days? | Mon 1/22/07 | Tue 2/13/07 | |
34 | Initial setup, existing framework | 0% | 17 days | Mon 1/22/07 | Tue 2/13/07 | |
35 | Define proper CVS location for perl, libs, macros | 0% | 1 day | Mon 1/22/07 | Mon 1/22/07 | Jerome Lauret[10%], Andrew Rose[10%], Lee Barnby[10%] |
36 | Add existing QA macros to CVS | 0% | 1 day | Tue 1/23/07 | Tue 1/23/07 | Andrew Rose[20%] |
37 | Checkout and test on +1 site (non-PDSF), adjust as necessary | 0% | 1 wk | Wed 1/24/07 | Tue 1/30/07 | Lee Barnby[10%] |
38 | Bootstrap on +1 site / remove ALL site-specific references | 0% | 1 wk | Wed 1/31/07 | Tue 2/6/07 | Cristina Suarez[10%] |
39 | Commit to CVS, verify new scripts on all sites, final adjustments | 0% | 1 wk | Wed 2/7/07 | Tue 2/13/07 | Cristina Suarez[10%], Andrew Rose[10%], Lee Barnby[10%] |
40 | QA and nightly tests | 17% | 7 days? | Mon 1/22/07 | Tue 1/30/07 | |
41 | Establish a QA area in CVS | 100% | 1 day? | Mon 1/22/07 | Mon 1/22/07 | |
42 | Check existing QA suite | 0% | 1 wk | Wed 1/24/07 | Tue 1/30/07 | |
44 | Development | 0% | 62 days | Mon 1/22/07 | Tue 4/17/07 | |
45 | General QA consolidation | 0% | 10 days | Wed 1/31/07 | Tue 2/13/07 | |
46 | Gather feedback from PWG, add QA tests relevant to physics topics | 0% | 2 wks | Wed 1/31/07 | Tue 2/13/07 | |
47 | Establish nightly test framework at BNL for embedding | 0% | 1 wk | Wed 1/31/07 | Tue 2/6/07 | |
48 | General improvements | 0% | 35 days | Mon 1/22/07 | Fri 3/9/07 | |
49 | Requirements study for an embedding request interface | 0% | 2 wks | Mon 1/29/07 | Fri 2/9/07 | Andrew Rose[10%], Jerome Lauret[10%] |
50 | Develop new embedding request form compatible with Drupal module | 0% | 4 wks | Mon 2/12/07 | Fri 3/9/07 | Andrew Rose[10%] |
51 | Test new interface, import old tasks (historical purposes) | 0% | 5 days | Mon 1/22/07 | Fri 1/26/07 | Andrew Rose[10%], Cristina Suarez[10%] |
52 | Distributed Computing | 0% | 20 days | Wed 2/14/07 | Tue 3/13/07 | |
53 | Use SUMS framework to submit embedding, establish first XML | 0% | 1 wk | Wed 2/14/07 | Tue 2/20/07 | Lee Barnby[10%] |
54 | Test on one site | 0% | 1 wk | Wed 2/21/07 | Tue 2/27/07 | Lee Barnby[10%] |
55 | Test on all sites, adjust as necessary | 0% | 2 wks | Wed 2/28/07 | Tue 3/13/07 | Cristina Suarez[10%], Andrew Rose[10%], Lee Barnby[10%] |
56 | Gridfication | 0% | 25 days | Wed 3/14/07 | Tue 4/17/07 | |
57 | Test XML using Grid policy (one site) | 0% | 1 wk | Wed 3/14/07 | Tue 3/20/07 | |
58 | Establish test of data transfer method, GSI-enabled HPSS access possible | 0% | 1 wk | Wed 3/21/07 | Tue 3/27/07 | |
59 | Regression and stress test on one site | 0% | 1 wk | Wed 3/28/07 | Tue 4/3/07 | |
60 | Test on +1 site, infrastructure consolidation | 0% | 2 wks | Wed 4/4/07 | Tue 4/17/07 | |
62 | Embedding operation | 25% | 261 days? | Mon 1/1/07 | Mon 12/31/07 | |
63 | PDSF support | 50% | 261 days? | Mon 1/1/07 | Mon 12/31/07 | Andrew Rose[10%] |
64 | BHAM support | 10% | 261 days? | Mon 1/1/07 | Mon 12/31/07 | Lee Barnby[10%] |
65 | UIC support including QA | 15% | 261 days? | Mon 1/1/07 | Mon 12/31/07 | Olga Barranikova[5%], Cristina Suarez[10%] |
18 April 2007 12:21:56
Talked to Yuri on Monday (16th)
He would like 3 things worked on.
1. Integration of MC generation part into bfcMixer.C
Basically all kumac commands can be done in macro using gstar.
These would become part of "chain one"
Also need to read in a tag or MuDst file to find vertex to use for generating particles.
Can probably see how this works from bfc.C itself as bfc.C(1) creates particles and runs them through reconstruction.
- actually I could not see how; it is inside bfc.C or StBFChain because it is part of St_geant_Maker
2. Change Hit Mover so that it does not move hits derived from MC info (based on ID truth %age)
3. [I forgot what 3 was!]
Rough sketch of chain modifications for #1
Current bfcMixer
(StChain)Chain
(StBFChain)daqChain<--daq file
(StBFChain)simChain<--fz file
<---.dat file with vertex positions
MixerMaker
(StBFChain)recoChain
New bfcMixer
(StChain)Chain
(
StBFChain)daqChain<--daq file
(StBFChain)simChain
|
Geant-?-SetGeantMaker<--tags file
MixerMaker
(StBFChain)recoChain
Break down into sub-tasks.
a) Run bfcMixer.C on a daq file with an associated fz and data file (to check that it works!)
b) Ignore fz file and generate MC particle (any!) on the fly
c) reading from the tags file, generate MC particles at the desired vertex & with the desired multiplicity
d) tidy up and specify the parameter interface (p distribution, geant ID etc.)
Current Embedding Coordinator (EC): Xianglei Zhu (zhux@rcf.rhic.bnl.gov)
Current Embedding Deputy (ED):
Current PDSF Point Of Contact (POC): Jeff Porter (RJPorter@lbl.gov)
<!-- Generated by StRoot/macros/embedding/get_embedding_xml.pl on Mon Aug 2 15:26:13 PDT 2010 -->
<?xml version="1.0" encoding="utf-8"?>
<job maxFilesPerProcess="1" fileListSyntax="paths">
<command>
<!-- Load library -->
starver SL07e
<!-- Set tags file directory -->
setenv EMBEDTAGDIR /eliza3/starprod/tags/ppProductionJPsi/P06id
<!-- Set year and day from filename -->
setenv EMYEAR `StRoot/macros/embedding/getYearDayFromFile.pl -y ${FILEBASENAME}`
setenv EMDAY `StRoot/macros/embedding/getYearDayFromFile.pl -d ${FILEBASENAME}`
<!-- Set log files area -->
setenv EMLOGS /project/projectdirs/star/embedding
<!-- Set HPSS outputs/LOG path -->
setenv EMHPSS /nersc/projects/starofl/embedding/ppProductionJPsi/JPsi_&FSET;_20100601/P06id.SL07e/${EMYEAR}/${EMDAY}
<!-- Print out EMYEAR, EMDAY, EMLOGS and EMHPSS -->
echo EMYEAR : $EMYEAR
echo EMDAY : $EMDAY
echo EMLOGS : $EMLOGS
echo EMHPSS : $EMHPSS
<!-- Start job -->
echo 'Executing bfcMixer_TpcSvtSsd(1000, 1, 1, "$INPUTFILE0", "$EMBEDTAGDIR/${FILEBASENAME}.tags.root", 0, 6.0, -1.5, 1.5, -200, 200, 160, 1, triggers, "P08ic", "FlatPt"); ...'
root4star -b <<EOF
std::vector<Int_t> triggers;
triggers.push_back(117705);
triggers.push_back(137705);
triggers.push_back(117701);
.L StRoot/macros/embedding/bfcMixer_TpcSvtSsd.C
bfcMixer_TpcSvtSsd(1000, 1, 1, "$INPUTFILE0", "$EMBEDTAGDIR/${FILEBASENAME}.tags.root", 0, 6.0, -1.5, 1.5, -200, 200, 160, 1, triggers, "P08ic", "FlatPt");
.q
EOF
ls -la .
cp $EMLOGS/P06id/LOG/$JOBID.log ${FILEBASENAME}.$JOBID.log
cp $EMLOGS/P06id/LOG/$JOBID.elog ${FILEBASENAME}.$JOBID.elog
<!-- New command to organize log files -->
mkdir -p $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;
mv $EMLOGS/P06id/LOG/$JOBID.* $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;/
<!-- Archive in HPSS -->
hsi "mkdir -p $EMHPSS; prompt; cd $EMHPSS; mput *.root; mput ${FILEBASENAME}.$JOBID.log; mput ${FILEBASENAME}.$JOBID.elog"
</command>
<!-- Define locations of log/elog files -->
<stdout URL="file:/project/projectdirs/star/embedding/P06id/LOG/$JOBID.log"/>
<stderr URL="file:/project/projectdirs/star/embedding/P06id/LOG/$JOBID.elog"/>
<!-- Input daq files -->
<input URL="file:/eliza3/starprod/daq/2006/st*"/>
<!-- csh/list files -->
<Generator>
<Location>/project/projectdirs/star/embedding/P06id/JPsi_20100601/LIST</Location>
</Generator>
<!-- Put any locally-compiled stuffs into a sand-box -->
<SandBox installer="ZIP">
<Package name="Localmakerlibs">
<File>file:./.sl44_gcc346/</File>
<File>file:./StRoot/</File>
<File>file:./pams/</File>
</Package>
</SandBox>
</job>
<!-- Input daq files --> <input URL="file:/eliza3/starprod/daq/2006/st*"/>
<!-- Set tags file directory --> setenv EMBEDTAGDIR /eliza3/starprod/tags/ppProductionJPsi/P06id
> StRoot/macros/embedding/get_embedding_xml.pl -daq /eliza3/starprod/daq/2006 -tag /eliza3/starprod/tags/ppProductionJPsi/P06id
> StRoot/macros/embedding/get_embedding_xml.pl -tag /eliza3/starprod/tags/ppProductionJPsi/P06id -daq /eliza3/starprod/daq/2006
Below is the description of the commands that run the job (bfcMixer), save the log files, and put outputs/logs into HPSS.
<!-- Start job -->
echo 'Executing bfcMixer_TpcSvtSsd(1000, 1, 1, "$INPUTFILE0", "$EMBEDTAGDIR/${FILEBASENAME}.tags.root", 0, 6.0, -1.5, 1.5, -200, 200, 160, 1, triggers, "P08ic", "FlatPt"); ...'
root4star -b <<EOF
std::vector<Int_t> triggers;
triggers.push_back(117705);
triggers.push_back(137705);
triggers.push_back(117701);
.L StRoot/macros/embedding/bfcMixer_TpcSvtSsd.C
bfcMixer_TpcSvtSsd(1000, 1, 1, "$INPUTFILE0", "$EMBEDTAGDIR/${FILEBASENAME}.tags.root", 0, 6.0, -1.5, 1.5, -200, 200, 160, 1, triggers, "P08ic", "FlatPt");
.q
EOF
ls -la .
cp $EMLOGS/P06id/LOG/$JOBID.log ${FILEBASENAME}.$JOBID.log
cp $EMLOGS/P06id/LOG/$JOBID.elog ${FILEBASENAME}.$JOBID.elog
<!-- New command to organize log files -->
mkdir -p $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;
mv $EMLOGS/P06id/LOG/$JOBID.* $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;/
<!-- Archive in HPSS -->
hsi "mkdir -p $EMHPSS; prompt; cd $EMHPSS; mput *.root; mput ${FILEBASENAME}.$JOBID.log; mput ${FILEBASENAME}.$JOBID.elog"
> StRoot/macros/embedding/get_embedding_xml.pl -mixer StRoot/macros/embedding/bfcMixer_Tpx.C
<= Run4 : bfcMixer_TpcOnly.C
Run5 - Run7 : bfcMixer_TpcSvtSsd.C
>= Run8 : bfcMixer_Tpx.C
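The run-to-mixer mapping above can be wrapped in a small helper; this is an illustrative sketch only (the function is not part of get_embedding_xml.pl), taking the RHIC run number as input.

```shell
#!/bin/sh
# Map a RHIC run number to the bfcMixer macro listed above.
pick_mixer() {
  run=$1
  if [ "$run" -le 4 ]; then       # Run4 and earlier: TPC only
    echo StRoot/macros/embedding/bfcMixer_TpcOnly.C
  elif [ "$run" -le 7 ]; then     # Run5-Run7: TPC + SVT + SSD
    echo StRoot/macros/embedding/bfcMixer_TpcSvtSsd.C
  else                            # Run8 onwards: TPX
    echo StRoot/macros/embedding/bfcMixer_Tpx.C
  fi
}

pick_mixer 6   # prints the TpcSvtSsd mixer used in the P06id example
```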
> StRoot/macros/embedding/get_embedding_xml.pl -production P06id -lib SL07e -r 20100601 -trg ppProductionJPsi
<!-- Load library -->
starver SL07e
...
ls -la .
cp $EMLOGS/P06id/LOG/$JOBID.log ${FILEBASENAME}.$JOBID.log
cp $EMLOGS/P06id/LOG/$JOBID.elog ${FILEBASENAME}.$JOBID.elog
<!-- New command to organize log files -->
mkdir -p $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;
mv $EMLOGS/P06id/LOG/$JOBID.* $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;/
<!-- Archive in HPSS -->
hsi "mkdir -p $EMHPSS; prompt; cd $EMHPSS; mput *.root; mput ${FILEBASENAME}.$JOBID.log; mput ${FILEBASENAME}.$JOBID.elog"
...
<!-- csh/list files -->
<Generator>
<Location>/project/projectdirs/star/embedding/P06id/JPsi_20100601/LIST</Location>
</Generator>
Error: No /project/projectdirs/star/embedding/P06id/JPsi_20100601/LIST exists. Stop. Make sure you've put the correct path for generator file.
mv $EMLOGS/P06id/LOG/$JOBID.* $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;/
Particle Jpsi code=160 TrkTyp=4 mass=3.096 charge=0 tlife=7.48e-21,pdg=443 bratio= { 1, } mode= { 203, }
> StRoot/macros/embedding/get_embedding_xml.pl -geantid 160 -particle JPsi
> StRoot/macros/embedding/get_embedding_xml.pl -mode Strange
> StRoot/macros/embedding/get_embedding_xml.pl -mult 0.05
> StRoot/macros/embedding/get_embedding_xml.pl -zmin -30.0 -zmax 30.0
> StRoot/macros/embedding/get_embedding_xml.pl -ymin -1.0 -ymax 1.0 -ptmin 0.0 -ptmax 6.0
> StRoot/macros/embedding/get_embedding_xml.pl -trigger 117705
> StRoot/macros/embedding/get_embedding_xml.pl -trigger 117705 -trigger 137705 -trigger 117001
> StRoot/macros/embedding/get_embedding_xml.pl -trigger 117705 137705 ...
> StRoot/macros/embedding/get_embedding_xml.pl -prodname P06idpp
> StRoot/macros/embedding/get_embedding_xml.pl -f -daq /eliza3/starprod/daq/2006 -tag /eliza3/starprod/tags/ppProductionJPsi/P06id \ -production P06id -lib SL07e -r 20100601 -trg ppProductionJPsi -geantid 160 -particle JPsi -ptmax 6.0 -trigger 117705 -trigger 137705 -trigger 117701 \ -prodname P06idpp
--------------------------------
P07ic CuCu production: TString prodP07icAuAu("P2005b DbV20070518 MakeEvent ITTF ToF ssddat spt SsdIt SvtIt pmdRaw OGridLeak OShortR OSpaceZ2 KeepSvtHit skip1row VFMCE -VFMinuit -hitfilt");
P08ic AuAu production: DbV20080418 B2007g ITTF adcOnly IAna KeepSvtHit VFMCE -hitfilt l3onl emcDY2 fpd ftpc trgd ZDCvtx svtIT ssdIT Corr5 -dstout
If spacecharge and gridleak corrections are on average instead of event by event then Corr5-> Corr4, OGridLeak3D, OSpaceZ2.
P08ie dAu production : DbV20090213 P2008 ITTF OSpaceZ2 OGridLeak3D beamLine, VFMCE TpcClu -VFMinuit -hitfilt
TString chain20pt("NoInput,PrepEmbed,gen_T,geomT,sim_T,trs,-ittf,-tpc_daq,nodefault");
P06id pp production : TString prodP06idpp("DbV20060729 pp2006b ITTF OSpaceZ2 OGridLeak3D VFMCE -VFPPVnoCTB -hitfilt");
P06ie pp production : TString prodP06iepp("DbV20060915 pp2006b ITTF OSpaceZ2 OGridLeak3D VFMCE -VFPPVnoCTB -hitfilt"); run# 7096005-7156040
TString prodP06iepp("DbV20061021 pp2006b ITTF OSpaceZ2 OGridLeak3D VFMCE -VFPPVnoCTB -hitfilt"); run# 7071001-709402
/nersc/projects/starofl/embedding/${TRGSETUPNAME}/${PARTICLE}_&FSET;_${REQUESTID}/${PRODUCTION}.${LIBRARY}/${EMYEAR}/${EMDAY}
(starofl home) /home/starofl/embedding/CODE/${TRGSETUPNAME}/${PARTICLE}_${REQUESTID}/${PRODUCTION}.${LIBRARY}/${EMYEAR}
(HPSS) /nersc/projects/starofl/embedding/CODE/${TRGSETUPNAME}/${PARTICLE}_${REQUESTID}/${PRODUCTION}.${LIBRARY}/${EMYEAR}
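As a concrete check of the output-path template, the variables can be expanded directly in the shell. The values below are taken from the J/Psi example earlier on this page, except FSET, EMYEAR and EMDAY, which are hypothetical placeholders.

```shell
#!/bin/sh
# Expand the HPSS output-path template for one (fSet, year, day) combination.
TRGSETUPNAME=ppProductionJPsi
PARTICLE=JPsi
FSET=101        # one fSet of the request (hypothetical value)
REQUESTID=20100601
PRODUCTION=P06id
LIBRARY=SL07e
EMYEAR=2006     # from getYearDayFromFile.pl -y (hypothetical here)
EMDAY=145       # from getYearDayFromFile.pl -d (hypothetical here)

EMHPSS=/nersc/projects/starofl/embedding/${TRGSETUPNAME}/${PARTICLE}_${FSET}_${REQUESTID}/${PRODUCTION}.${LIBRARY}/${EMYEAR}/${EMDAY}
echo "$EMHPSS"
# -> /nersc/projects/starofl/embedding/ppProductionJPsi/JPsi_101_20100601/P06id.SL07e/2006/145
```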
Current Embedding Coordinator (EC): Xianglei Zhu (zhux@tsinghua.edu.cn)
Current Embedding Deputy (ED): Derek Anderson (derekwigwam9@tamu.edu)
Current NERSC Point Of Contact (POC): Jeff Porter (RJPorter@lbl.gov) and Jan Balewski (balewski@lbl.gov)
<!-- Generated by StRoot/macros/embedding/get_embedding_xml.pl on Mon Aug 2 15:26:13 PDT 2010 --> <?xml version="1.0" encoding="utf-8"?> <job maxFilesPerProcess="1" fileListSyntax="paths"> <command> <!-- Load library --> starver SL07e <!-- Set tags file directory --> setenv EMBEDTAGDIR /eliza3/starprod/tags/ppProductionJPsi/P06id <!-- Set year and day from filename --> setenv EMYEAR `StRoot/macros/embedding/getYearDayFromFile.pl -y ${FILEBASENAME}` setenv EMDAY `StRoot/macros/embedding/getYearDayFromFile.pl -d ${FILEBASENAME}` <!-- Set log files area --> setenv EMLOGS /project/projectdirs/star/embedding <!-- Set HPSS outputs/LOG path --> setenv EMHPSS /nersc/projects/starofl/embedding/ppProductionJPsi/JPsi_&FSET;_20100601/P06id.SL07e/${EMYEAR}/${EMDAY} <!-- Print out EMYEAR and EMDAY and EMLOGS --> echo EMYEAR : $EMYEAR echo EMDAY : $EMDAY echo EMLOGS : $EMLOGS echo EMHPSS : $EMHPSS <!-- Start job --> echo 'Executing bfcMixer_TpcSvtSsd(1000, 1, 1, "$INPUTFILE0", "$EMBEDTAGDIR/${FILEBASENAME}.tags.root", 0, 6.0, -1.5, 1.5, -200, 200, 160, 1, triggers, "P08ic", "FlatPt"); ...' root4star -b <<EOF std::vector<Int_t> triggers; triggers.push_back(117705); triggers.push_back(137705); triggers.push_back(117701); .L StRoot/macros/embedding/bfcMixer_TpcSvtSsd.C bfcMixer_TpcSvtSsd(1000, 1, 1, "$INPUTFILE0", "$EMBEDTAGDIR/${FILEBASENAME}.tags.root", 0, 6.0, -1.5, 1.5, -200, 200, 160, 1, triggers, "P08ic", "FlatPt"); .q EOF ls -la . 
cp $EMLOGS/P06id/LOG/$JOBID.log ${FILEBASENAME}.$JOBID.log cp $EMLOGS/P06id/LOG/$JOBID.elog ${FILEBASENAME}.$JOBID.elog <!-- New command to organize log files --> mkdir -p $EMLOGS/P06id/JPsi_20100601/LOG/&FSET; mv $EMLOGS/P06id/LOG/$JOBID.* $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;/ <!-- Archive in HPSS --> hsi "mkdir -p $EMHPSS; prompt; cd $EMHPSS; mput *.root; mput ${FILEBASENAME}.$JOBID.log; mput ${FILEBASENAME}.$JOBID.elog" </command> <!-- Define locations of log/elog files --> <stdout URL="file:/project/projectdirs/star/embedding/P06id/LOG/$JOBID.log"/> <stderr URL="file:/project/projectdirs/star/embedding/P06id/LOG/$JOBID.elog"/> <!-- Input daq files --> <input URL="file:/eliza3/starprod/daq/2006/st*"/> <!-- csh/list files --> <Generator> <Location>/project/projectdirs/star/embedding/P06id/JPsi_20100601/LIST</Location> </Generator> <!-- Put any locally-compiled stuffs into a sand-box --> <SandBox installer="ZIP"> <Package name="Localmakerlibs"> <File>file:./.sl44_gcc346/</File> <File>file:./StRoot/</File> <File>file:./pams/</File> </Package> </SandBox> </job>
<!-- Input daq files --> <input URL="file:/eliza3/starprod/daq/2006/st*"/>
<!-- Set tags file directory --> setenv EMBEDTAGDIR /eliza3/starprod/tags/ppProductionJPsi/P06id
> StRoot/macros/embedding/get_embedding_xml.pl -daq /eliza3/starprod/daq/2006 -tag /eliza3/starprod/tags/ppProductionJPsi/P06id
> StRoot/macros/embedding/get_embedding_xml.pl -tag /eliza3/starprod/tags/ppProductionJPsi/P06id -daq /eliza3/starprod/daq/2006
Below is a description of how to run the job (bfcMixer), save the log files, and put the outputs/logs into HPSS.
<!-- Start job -->
echo 'Executing bfcMixer_TpcSvtSsd(1000, 1, 1, "$INPUTFILE0", "$EMBEDTAGDIR/${FILEBASENAME}.tags.root", 0, 6.0, -1.5, 1.5, -200, 200, 160, 1, triggers, "P08ic", "FlatPt"); ...'
root4star -b <<EOF
std::vector<Int_t> triggers;
triggers.push_back(117705);
triggers.push_back(137705);
triggers.push_back(117701);
.L StRoot/macros/embedding/bfcMixer_TpcSvtSsd.C
bfcMixer_TpcSvtSsd(1000, 1, 1, "$INPUTFILE0", "$EMBEDTAGDIR/${FILEBASENAME}.tags.root", 0, 6.0, -1.5, 1.5, -200, 200, 160, 1, triggers, "P08ic", "FlatPt");
.q
EOF
ls -la .
cp $EMLOGS/P06id/LOG/$JOBID.log ${FILEBASENAME}.$JOBID.log
cp $EMLOGS/P06id/LOG/$JOBID.elog ${FILEBASENAME}.$JOBID.elog

<!-- New command to organize log files -->
mkdir -p $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;
mv $EMLOGS/P06id/LOG/$JOBID.* $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;/

<!-- Archive in HPSS -->
hsi "mkdir -p $EMHPSS; prompt; cd $EMHPSS; mput *.root; mput ${FILEBASENAME}.$JOBID.log; mput ${FILEBASENAME}.$JOBID.elog"
> StRoot/macros/embedding/get_embedding_xml.pl -mixer StRoot/macros/embedding/bfcMixer_Tpx.C
<= Run4 : bfcMixer_TpcOnly.C
Run5 - Run7 : bfcMixer_TpcSvtSsd.C
>= Run8 : bfcMixer_Tpx.C
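The run-to-mixer mapping above can be wrapped in a small helper for scripts; a sketch (the function name pick_mixer is ours, not part of the STAR macros):

```shell
# Hypothetical helper encoding the mapping above; not part of the STAR scripts.
pick_mixer() {
  run=$1
  if [ "$run" -le 4 ]; then
    echo "bfcMixer_TpcOnly.C"      # <= Run4
  elif [ "$run" -le 7 ]; then
    echo "bfcMixer_TpcSvtSsd.C"    # Run5 - Run7
  else
    echo "bfcMixer_Tpx.C"          # >= Run8
  fi
}

pick_mixer 6   # prints bfcMixer_TpcSvtSsd.C
```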
> StRoot/macros/embedding/get_embedding_xml.pl -production P06id -lib SL07e -r 20100601 -trg ppProductionJPsi
<!-- Load library -->
starver SL07e
... ...
ls -la .
cp $EMLOGS/P06id/LOG/$JOBID.log ${FILEBASENAME}.$JOBID.log
cp $EMLOGS/P06id/LOG/$JOBID.elog ${FILEBASENAME}.$JOBID.elog

<!-- New command to organize log files -->
mkdir -p $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;
mv $EMLOGS/P06id/LOG/$JOBID.* $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;/

<!-- Archive in HPSS -->
hsi "mkdir -p $EMHPSS; prompt; cd $EMHPSS; mput *.root; mput ${FILEBASENAME}.$JOBID.log; mput ${FILEBASENAME}.$JOBID.elog"
... ...

<!-- csh/list files -->
<Generator>
<Location>/project/projectdirs/star/embedding/P06id/JPsi_20100601/LIST</Location>
</Generator>
Error: No /project/projectdirs/star/embedding/P06id/JPsi_20100601/LIST exists. Stop. Make sure you've put the correct path for generator file.
mv $EMLOGS/P06id/LOG/$JOBID.* $EMLOGS/P06id/JPsi_20100601/LOG/&FSET;/
Particle Jpsi code=160 TrkTyp=4 mass=3.096 charge=0 tlife=7.48e-21,pdg=443 bratio= { 1, } mode= { 203, }
> StRoot/macros/embedding/get_embedding_xml.pl -geantid 160 -particle JPsi
> StRoot/macros/embedding/get_embedding_xml.pl -mode Strange
> StRoot/macros/embedding/get_embedding_xml.pl -mult 0.05
> StRoot/macros/embedding/get_embedding_xml.pl -zmin -30.0 -zmax 30.0
> StRoot/macros/embedding/get_embedding_xml.pl -ymin -1.0 -ymax 1.0 -ptmin 0.0 -ptmax 6.0
> StRoot/macros/embedding/get_embedding_xml.pl -trigger 117705
> StRoot/macros/embedding/get_embedding_xml.pl -trigger 117705 -trigger 137705 -trigger 117001
> StRoot/macros/embedding/get_embedding_xml.pl -trigger 117705 137705 ...
> StRoot/macros/embedding/get_embedding_xml.pl -prodname P06idpp
> StRoot/macros/embedding/get_embedding_xml.pl -f -daq /eliza3/starprod/daq/2006 -tag /eliza3/starprod/tags/ppProductionJPsi/P06id \
  -production P06id -lib SL07e -r 20100601 -trg ppProductionJPsi -geantid 160 -particle JPsi -ptmax 6.0 -trigger 117705 -trigger 137705 -trigger 117701 \
  -prodname P06idpp
--------------------------------
P07ic CuCu production: TString prodP07icAuAu("P2005b DbV20070518 MakeEvent ITTF ToF ssddat spt SsdIt SvtIt pmdRaw OGridLeak OShortR OSpaceZ2 KeepSvtHit skip1row VFMCE -VFMinuit -hitfilt");
P08ic AuAu production: DbV20080418 B2007g ITTF adcOnly IAna KeepSvtHit VFMCE -hitfilt l3onl emcDY2 fpd ftpc trgd ZDCvtx svtIT ssdIT Corr5 -dstout
If spacecharge and gridleak corrections are on average instead of event by event then Corr5-> Corr4, OGridLeak3D, OSpaceZ2.
P08ie dAu production : DbV20090213 P2008 ITTF OSpaceZ2 OGridLeak3D beamLine, VFMCE TpcClu -VFMinuit -hitfilt
TString chain20pt("NoInput,PrepEmbed,gen_T,geomT,sim_T,trs,-ittf,-tpc_daq,nodefault");
P06id pp production : TString prodP06idpp("DbV20060729 pp2006b ITTF OSpaceZ2 OGridLeak3D VFMCE -VFPPVnoCTB -hitfilt");
P06ie pp production : TString prodP06iepp("DbV20060915 pp2006b ITTF OSpaceZ2 OGridLeak3D VFMCE -VFPPVnoCTB -hitfilt"); run# 7096005-7156040
TString prodP06iepp("DbV20061021 pp2006b ITTF OSpaceZ2 OGridLeak3D VFMCE -VFPPVnoCTB -hitfilt"); run# 7071001-709402
/nersc/projects/starofl/embedding/${TRGSETUPNAME}/${PARTICLE}_&FSET;_${REQUESTID}/${PRODUCTION}.${LIBRARY}/${EMYEAR}/${EMDAY}
(starofl home) /home/starofl/embedding/CODE/${TRGSETUPNAME}/${PARTICLE}_${REQUESTID}/${PRODUCTION}.${LIBRARY}/${EMYEAR} (HPSS) /nersc/projects/starofl/embedding/CODE/${TRGSETUPNAME}/${PARTICLE}_${REQUESTID}/${PRODUCTION}.${LIBRARY}/${EMYEAR}
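To see where the outputs of one FSET would land, the HPSS path template above can be expanded by hand; a sketch with purely illustrative values (in a real job these come from the environment and the request):

```shell
# All values below are illustrative placeholders, not real request data.
TRGSETUPNAME=ppProductionJPsi
PARTICLE=JPsi
FSET=200
REQUESTID=20100601
PRODUCTION=P06id
LIBRARY=SL07e
EMYEAR=2006
EMDAY=146

# Expand the data-path template from the text above:
echo "/nersc/projects/starofl/embedding/${TRGSETUPNAME}/${PARTICLE}_${FSET}_${REQUESTID}/${PRODUCTION}.${LIBRARY}/${EMYEAR}/${EMDAY}"
# -> /nersc/projects/starofl/embedding/ppProductionJPsi/JPsi_200_20100601/P06id.SL07e/2006/146
```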
These instructions describe, for embedding helpers, how to prepare and submit embedding jobs at PDSF.
NOTE: These instructions are specific to PDSF; some procedures may not work at RCF.
Current Embedding Coordinator (EC): Terence Tarnowsky (tarnowsk@nscl.msu.edu)
Current PDSF Point Of Contact (POC): Jeff Porter (RJPorter@lbl.gov)
Contents
Please have a look at the "common issue: memory limit in batch"
and follow the procedure there to increase the memory limit in batch jobs.
All embedding helpers (EH) should make this change before submitting any embedding production jobs.
Below is a copy, from the link above, of what you need to do in order to get an AFS token to access CVS.
> klog -cell rhic -principal YourRCFUserName
> cvs co StRoot/macros/embedding
> cvs co StRoot/St_geant_Maker
> mv StRoot/St_geant_Maker/Embed/StPrepEmbedMaker* StRoot/St_geant_Maker/
> starver ${library}
> cons
> cvs co pams/sim/gstar
> cons
Please contact the EC or ED to check whether the bfcMixer (either bfcMixer_TpcSvtSsd.C or bfcMixer_Tpx.C) is ready to submit, and to confirm which bfcMixer should be used for the current request.
<!-- Put any locally-compiled stuffs into a sand-box -->
<SandBox installer="ZIP">
<Package name="Localmakerlibs">
<File>file:./.sl44_gcc346/</File>
<File>file:./StRoot/</File>
<File>file:./pams/</File>
</Package>
</SandBox>
in your xml file. If you have any locally compiled code other than the above, please include it as well.
Please contact the EC or ED if you are not sure which codes you need to include.
> star-submit-template -template embed_template.xml -entities FSET=200
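Embedding productions are usually split into several FSETs; assuming the same template as in the single-FSET command above, the submission can be looped. The sketch below only echoes the commands (an illustrative FSET range) so it is safe to run; drop the leading echo to actually submit on PDSF:

```shell
# Print the submit command for a few FSETs (illustrative range 200-202).
# Remove the leading "echo" to really submit.
for FSET in 200 201 202; do
  echo star-submit-template -template embed_template.xml -entities FSET=${FSET}
done
```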
4-3. Re-submitting jobs
Sometimes you may need to modify something under "StRoot" or "pams" and recompile to fix a problem.
Each time you recompile your local code, you should clean up the current "Localmakerlibs.zip" and
"Localmakerlibs.package/" before resubmitting. If you forget to clean up the old "Localmakerlibs",
the modifications to the local code will not be reflected in the resubmitted jobs.
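The cleanup step can be scripted; a minimal sketch, assuming the sandbox products sit in the current working directory as named in the text:

```shell
# Remove stale sandbox products so the next submission repackages the
# freshly compiled libraries (-f/-rf: no error if they are already gone).
rm -f  Localmakerlibs.zip
rm -rf Localmakerlibs.package/
```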
Grab a set of daq files from RCF which cover the lifetime of the run, the luminosity range experienced, and the conditions for the production.
The bfc.C macros are located under ~starofl/bfc. Edit the submit.[Production] script to point to the daq files loaded (as above).
The results of the previous jobs are .tags.root files located on HPSS. Retrieve the files and set a pointer to the tags files in the production-specific directory under ~starofl/embedding.
mkdir embedding
cd embedding
mkdir Common
mkdir Common/lists
mkdir Common/csh
mkdir GSTAR
mkdir P06ib
mkdir P06ib/setup
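The same directory tree can be created in one command with mkdir -p, which creates parent directories as needed:

```shell
# Equivalent to the mkdir sequence above, run from your home directory.
mkdir -p embedding/Common/lists embedding/Common/csh \
         embedding/GSTAR embedding/P06ib/setup
```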
cd /u/user/embedding
cp /u/starofl/embedding/getVerticesFromTags_v4.C .
cp -R /u/starofl/embedding/P06ib/EmbeddingLib_v4_noFTPC/ P06ib/
cp /u/starofl/embedding/P06ib/Embedding_sge_noFTPC.pl P06ib/
cp /u/starofl/embedding/P06ib/bfcMixer_v4_noFTPC.C P06ib/
cp /u/starofl/embedding/P06ib/submit.starofl.pl P06ib/submit.user.pl
cp /u/starofl/embedding/P06ib/setup/Piminus_101_spectra.setup P06ib/setup/
cp /u/starofl/embedding/GSTAR/phasespace_P06ib_revfullfield.kumac GSTAR/
cp /u/starofl/embedding/GSTAR/phasespace_P06ib_fullfield.kumac GSTAR/
cp /u/starofl/embedding/Common/submit_sge.pl Common/
You now have all the files needed to run embedding. There are further links to make, but since you are going to export the files to your own cluster you need to make the links afterwards.
Alternatively, you can run embedding on PDSF from your home directory. There are a number of changes to make first, though, because the various perl scripts have some paths relating to the starofl account inside them.
For those planning to export to a remote site, you should tar and/or scp the data. I would recommend tar, so that the original package is preserved in case something goes wrong. E.g.
tar -cvf embedding.tar embedding/
scp embedding.tar remoteuser@mycluster.blah.blah:/home/remoteuser
Obviously this step is unnecessary if you intend to run from your PDSF account, although you may still want to create a tar file so that you can undo any changes that turn out to be wrong.
Log in to your remote cluster and extract the archive. E.g.
cd /home/remoteuser
tar -xvf embedding.tar
The most obvious thing you will find is a number of places inside the perl scripts where the path or location of other scripts appears in the code. These must be changed accordingly.
changes to e.g.
changes to e.g.
changes to e.g.
changes to e.g.
changes to e.g.
changes to e.g.
changes to e.g.
/dante3/starprod/daq/2005/cuProductionMinBias/FullField
whereas on the Bham cluster it is
/star/data1/daq/2005/cuProductionMinBias/FullField
and thus the pattern match in perl has to change in order to extract the same information. If you have a choice then choose your directory names with care!
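As an aside, the same two path components can be pulled out with plain shell parameter expansion, which may help when checking what your adapted perl pattern should return; a sketch using the example path above:

```shell
path=/dante3/starprod/daq/2005/cuProductionMinBias/FullField

field=${path##*/}     # keep everything after the last "/"  -> FullField
rest=${path%/*}       # drop the last path component
trgsetup=${rest##*/}  # next component up                    -> cuProductionMinBias

echo "$trgsetup $field"   # prints: cuProductionMinBias FullField
```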
changes to e.g.
changes to e.g.
The -q option provides the name of the queue to use; otherwise it uses the default, which I did not want in this case. The other extra options are there to make the environment and working directory correct, as they were not the default for us. This is very specific to each cluster. If your cluster does not have SGE then I imagine extensive changes to the part writing the job submission script would be necessary. The scripts use the ability of SGE to have job arrays of similar jobs, so you would have to emulate that somehow.
chain3->SetFlags
line actually sets the same flags, since Andrew and I had to change the same flags, e.g. adding the GeantOut option after I made the original copy
and
. This is also something that Andrew and I both changed after I made the original copy.
line!
daq_dir_2005_cuPMBFF -> /dante3/starprod/daq/2005/cuProductionMinBias/FullField
daq_dir_2005_cuPMBRFF -> /dante3/starprod/daq/2005/cuProductionMinBias/ReversedFullField
daq_dir_2005_cuPMBHTFF -> /eliza5/starprod/daq/2005/cucuProductionHT/FullField/
daq_dir_2005_cuPMBHTRFF -> /eliza5/starprod/daq/2005/cucuProductionHT/ReversedFullField
tags_dir_cu_2005 -> /dante3/starprod/tags/P06ib/2005
tags_dir_cuHT_2005 -> /eliza5/starprod/embedding/tags/P06ib
data -> /eliza12/starprod/embedding/data
lists ->../Common/lists
csh-> ../Common/csh
LOG-> ../Common/LOG
That is it! Some things will probably need to be adapted to your circumstances, but this should give you a good idea of what to do.
Author: Lee Barnby, University of Birmingham (using starembed account)
Modified: A. Rose, Lawrence Berkeley National Laboratory (using starembed account)
~starofl/embedding/[Production]/setup/[Particle]_[set]_[ID].setup
where
[Particle] is the particle type submitted (Piminus for GEANTID=9, as set inside file)
[set] is the file set submitted (more on this later)
[ID] is the embedding request number
Current Embedding Coordinator (EC): Xianglei Zhu (zhux@tsinghua.edu.cn)
Current NERSC Point Of Contact (POC): Jeff Porter (RJPorter@lbl.gov) and Jan Balewski (balewski@lbl.gov)
> cvs co StRoot/StMiniMcEvent
> cvs co StRoot/StMiniMcMaker
> cons
.sl53_gcc432/obj/StRoot/StMiniMcMaker/StMiniMcMaker.cxx: In member function 'void StMiniMcMaker::fillRcTrackInfo(StTinyRcTrack*, const StTrack*, const StTrack*, Int_t)':
.sl53_gcc432/obj/StRoot/StMiniMcMaker/StMiniMcMaker.cxx:1622: error: 'const class StTrack' has no member named 'seedQuality'
> cp /eliza8/rnc/hmasui/embedding/QA/StMiniHijing.C ${work}
TString filename = MainFile;
// int fileBeginIndex = filename.Index(filePrefix,0);
// filename.Remove(0,fileBeginIndex);
filename.Remove(0, filename.Last('/')+1);
> cvs co StRoot/StAssociationMaker > cons
> root4star -b -q StMiniHijing.C'(1000, "/eliza9/starprod/embedding/P08ie/dAu/Piplus_201_1233091546/Piplus_st_physics_adc_9020060_raw_2060010_201/st_physics_adc_9020060_raw_2060010.geant.root", "./")'
or
> root4star -b
[0] .L StMiniHijing.C
[1] StMiniHijing(1000, "/eliza9/starprod/embedding/P08ie/dAu/Piplus_201_1233091546/Piplus_st_physics_adc_9020060_raw_2060010_201/st_physics_adc_9020060_raw_2060010.geant.root", "./");
.... .... ....
[2] .q
> root4star st_physics_adc_9020060_raw_2060010.minimc.root
[0] StMiniMcTree->Draw("mMcTracks.mGeantId")
[0] StMiniMcTree->Scan("mMcTracks.mGeantId")
[1] .q
QA macro
> cvs checkout StRoot/macros/embedding/doEmbeddingQAMaker.C
QA codes
StEmbeddingUtilities
> cvs checkout StRoot/
Either
> root4star -b -q doEmbeddingQAMaker.C'(2008, "P08ie", "minimc.list", "embedding.root")'
or
> root4star -b
[0] .L doEmbeddingQAMaker.C
[1] doEmbeddingQAMaker(2008, "P08ie", "minimc.list", "embedding.root");
... ... ...
[2] .q
The details of the arguments can be found in "doEmbeddingQAMaker.C".
> root4star -b
[0] .L doEmbeddingQAMaker.C
[1] doEmbeddingQAMaker(2008, "P08ie", "minimc.list", "", kTRUE, 60.0);
... ... ...
[2] .q
where the 5th argument is the switch to analyze embedding (kTRUE) or real data (kFALSE).
> root4star -b
[0] .L doEmbeddingQAMaker.C
[1] doEmbeddingQAMaker(2008, "P08ie", "mudst.list", "", kFALSE);
... ... ...
[2] .q
- The trigger id can also be selected by
StEmbeddingQAUtilities::addTriggerIdCut(const UInt_t id)
StEmbeddingQAUtilities accepts multiple trigger ids, while the current code assumes one trigger id per event.
The trigger id cut only affects the real data, not the embedding outputs.
- You can also apply rapidity cut by
StEmbeddingQAUtilities::setRapidityCut(const Double_t ycut)
It is good to use the same rapidity cut for the real data as for the embedding production.
Please have a look at the simulation request page for the rapidity cut, or ask the embedding helpers
what rapidity cuts they used for the productions.
You can make the QA plots by "drawEmbeddingQA.C" under "StRoot/macros/embedding".
> cvs checkout StRoot/macros/embedding/drawEmbeddingQA.C
> root4star -l drawEmbeddingQA.C'("./", "qa_embedding_2007_P08ic.root", "qa_real_2007_P08ic.root", 8, 10.0)'
The first argument is the directory where the output PDF file is written.
The default output directory is the current directory.
You can now check the QA histograms from embedding outputs only by
> root4star -l drawEmbeddingQA.C'("./", "qa_embedding_2007_P08ic.root", "qa_real_2007_P08ic.root", 8, 10.0, kTRUE)'
where the last argument 'isEmbeddingOnly' (default is kFALSE) is the switch
to draw the QA histograms for embedding outputs only if it is true.
> root4star -l drawEmbeddingQA.C'("./", "qa_embedding.root", "qa_real.root", 2005, "P08ic", 8, 10.0, kFALSE)'
> root4star -l drawEmbeddingQA.C'("./", "qa_embedding_2007_P08ic.root", "qa_real_2007_P08ic.root", 8, 10, kFALSE, 37)'
maker->setParentGeantId(parentGeantId) ;
------------------------------------------------------------------------------------
This document is intended to describe the macros used during the quality assurance (QA) studies. (This page was last updated on April 19, 2009.)
* Macro : scan_embed_mc.C
Once you know the location of the minimc.root files, use this macro to generate an output file with extension .root, in which all the histograms for a basic QA are filled. New histograms have been added; for instance, a 3D histogram for DCA (pt, eta, dca) gives the distribution of DCA as a function of pT and eta simultaneously. The same is done for the number of fit points (pt, eta, nfit). Histograms to evaluate the randomness of the embedding files have also been added to this macro.
* Macro : scan_embed_mudst.C
Hopefully you will not have to use this macro unless it is requested. It is meant to generate an output root file with distributions coming from the MuDst (MuDst from Lidia) for a particular production. You will need just the location of the output file.
* Macro : plot_embed.C
This macro takes both outputs (the one coming from the minimc and the one coming from the MuDst) and plots all the basic QA distributions for a particular production.
(Plot grids in pT bins: 0.20 < pT < 0.30, 0.30 < pT < 0.40, ..., 0.90 < pT < 1.00 GeV/c.)
QA P06ib (Phi -> K+K-)
This is the QA for P06ib (Phi -> KK), reconstruction on global tracks (kaons).
1. dEdx
Reconstruction on the kaon daughters. The plot shows Monte Carlo tracks and ghost tracks.
2. DCA Distributions
A 3D histogram was created and filled with pT, eta and DCA as the three coordinates. Projections in pT and eta bins were made to create this "matrix" of plots. (Monte Carlo and MuDst; MuDst taken from PDSF: /eliza12/starprod/reco/cuProductionMinBias/ReversedFullField/P06ib/2005/022/st_physics_adc_6022048_raw*.MuDst.root)
pT bin array used: {0.5, 0.6, 0.8, 1.0} (moves down) and
eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right)
For the error bars, the option hist->Sumw2() was used.
3. NFit Distributions
Similarly, a 3D histogram was created with pT, eta and NFit as coordinates. The respective projections were made in the same pT and eta bins as the DCA distributions. (Monte Carlo and MuDst)
pT bin array used: {0.5, 0.6, 0.8, 1.0} (moves down) and
eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right)
For the error bars, the option hist->Sumw2() was used.
4. Delta Vertex
The following are the Delta Vertex (Vertex Reconstructed - Vertex Embedded) plots for the three different coordinates (x, y and z). (Cuts of vz = 30 cm and NFitCut = 25 are applied.)
5. Z Vertex and X vs Y vertex
6. Global Variables : Phi and Rapidity
7. Pt
Embedded Phi meson with flat pT (black) and reconstructed kaon daughters (red).
8. Randomness Plots
The following plots check the randomness of the input Monte Carlo (MC) tracks.
QA P06ib (Rho -> pi+pi-)
This is the QA for P06ib (Rho -> pipi), reconstruction on global tracks (pions).
1. dEdx
Reconstruction on the pion daughters.
2. DCA Distributions
A 3D histogram was created and filled with pT, eta and DCA as the three coordinates. Projections in pT and eta bins were made to create this "matrix" of plots.
pT bin array used: {0.5, 0.6, 0.8, 1.0} (moves down) and
eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right)
For the error bars, the option hist->Sumw2() was used.
2b. Compared with MuDst
3. NFit Distributions
Similarly, a 3D histogram was created with pT, eta and NFit as coordinates. The respective projections were made in the same pT and eta bins as the DCA distributions.
pT bin array used: {0.5, 0.6, 0.8, 1.0} (moves down) and
eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right)
For the error bars, the option hist->Sumw2() was used.
3b. Reconstructed compared with MuDsts
4. Delta Vertex
The following are the Delta Vertex (Vertex Reconstructed - Vertex Embedded) plots for the three different coordinates (x, y and z). (Cuts of vz = 30 cm and NFitCut = 25 are applied.)
5. Z Vertex and X vs Y vertex
6. Global Variables : Phi and Rapidity
7. Pt
Embedded Rho meson with flat pT (black) and reconstructed pions (red).
8. Randomness Plots
The following plots check the randomness of the input Monte Carlo (MC) tracks.
Some QA plots for Rho:
MiniDst files are at PDSF under the path /eliza13/starprod/embedding/p06ib/MiniDst/rho_101/*.minimc.root
MuDst files are at PDSF under /eliza13/starprod/embedding/P06ib/MuDst/10*/*.MuDst.root. Reconstruction was done on PionPlus.
The DCA and NFit distributions have been scaled by their integrals in the different pT ranges.
Some QA Plots for D0 located under the path :
/eliza12/starprod/embedding/P06ib/D0_001_1216876386/*
/eliza12/starprod/embedding/P06ib/D0_002_1216876386/ -> Directory empty
Global pairs are used as reconstructed tracks. Some quality cuts at the plotting level were:
Vz cut: 30 cm;
NFitCut: 25;
NCommonHits: 10;
maxDca: 1 cm (assuming D0 -> pi + pi)
QA P08ic (J/Psi -> e+e-)
This is the QA for P08id (J/Psi -> ee). Reconstruction on global tracks, electrons (positrons) only.
1. dEdx
2. DCA Distributions
A 3D histogram was created and filled with pT, eta and DCA as the three coordinates. Projections in pT and eta bins were made to create this "matrix" of plots.
pT bin array used: {0.5, 0.6, 0.8, 1.0} (moves down) and
eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right)
For the error bars, the option hist->Sumw2() was used.
3. NFit Distributions
Similarly, a 3D histogram was created with pT, eta and NFit as coordinates. The respective projections were made in the same pT and eta bins as the DCA distributions.
pT bin array used: {0.5, 0.6, 0.8, 1.0} (moves down) and
eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right)
For the error bars, the option hist->Sumw2() was used.
4. Delta Vertex
The following are the Delta Vertex (Vertex Reconstructed - Vertex Embedded) plots for the three different coordinates (x, y and z).
5. Z Vertex and X vs Y vertex
6. Global Variables : Phi and Rapidity
7. Pt
Embedded J/Psi with flat pT (black) and reconstructed electrons (red).
8. Randomness Plots
The following plots check the randomness of the input Monte Carlo (MC) tracks.
AXi -> Lambda + Pion+ -> P + Pion- + Pion+
(03 08 2009)
1. Dedx
2. Dca
A 3D histogram was created and filled with pT, eta and DCA as the three coordinates. Projections in pT and eta bins were made to create this "matrix" of plots.
pT bin array used: {0.5, 0.6, 0.8, 1.0} (moves down) and
eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right)
For the error bars, the option hist->Sumw2() was used.
3. Nfit
Similarly, a 3D histogram was created with pT, eta and NFit as coordinates. The respective projections were made in the same pT and eta bins as the DCA distributions.
pT bin array used: {0.5, 0.6, 0.8, 1.0} (moves down) and
eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right)
For the error bars, the option hist->Sumw2() was used.
4. Delta Vertex
5. Z Vertex and X vs Y vertex
6. Global Variables : Phi and Rapidity
7. pt
8. Randomness
QA of Lambda Embedding with run 8 d+Au on PDSF (sample 15x)
Let's first check some event wise information. They look fine.
Then we check the randomness of the input Monte Carlo (MC) Lambda tracks. The 'phasespace' command in GSTAR is used for sampling the MC tracks. The input is supposed to be flat for pT within [0,8], Y [-1.5,1.5] and Phi [-Pi,Pi]. The 3 plots below show the randomness is OK for this sample. Please notice that Y is rapidity, not pseudo-rapidity.
Then we compare the dedx of reconstructed MC (matched) global tracks (i.e. the daughters of MC Lambda) to those of real (or ghost) tracks, to fix the scale factor. (scale factor = 1.38 ?)
Now we compare the nFitHits distribution of matched global tracks (i.e. the daughters of MC Lambda) and real tracks. The cuts are |eta|<1, nFitHits>25. For matched tracks, nCommonHits>10 cut is applied. From the left plot, we can see, the agreement of nHitFits is good for all pT ranges.
We check the pT, rapidity and Phi distributions of the reconstructed (RC) Lambda and the input (MC) Lambda. The cut for Lambda reconstruction is very loose. They look normal.
Here, we compare some cut variables from the reconstructed (RC) Lambda to those from real Lambda. Again, as you can see in these plots, the cuts are very loose for Lambda (contribution of background is estimated with rotation method, and has been subtracted). These plots are made for 8 pT bins (with rapidity cut |y|<0.75). The most obvious difference is in DCA of V0 to PV, especially for high pT bin.
Omega -> Lambda + K- -> P + Pion- + K-
(03 08 2009)
1. Dedx
Reconstruction on the pion-minus and proton daughters. Two different plots are shown just for the sake of completeness; reconstructing on the kaon had very low statistics.
Reco PionMinus | Reco Proton |
2. Dca
A 3D histogram was created and filled with pT, eta and DCA as the three coordinates. Projections in pT and eta bins were made to create this "matrix" of plots.
pT bin array used: {0.5, 0.6, 0.8, 1.0} (moves down) and
eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right)
For the error bars, the option hist->Sumw2() was used.
*. Reconstructing on Pion
*. Reconstructing on Proton
3. Nfit
Similarly, a 3D histogram was created with pT, eta and NFit as coordinates. The respective projections were made in the same pT and eta bins as the DCA distributions.
pT bin array used: {0.5, 0.6, 0.8, 1.0} (moves down) and
eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right)
For the error bars, the option hist->Sumw2() was used.
*. Reconstructing Pion
*. Reconstructing Proton
4. Delta Vertex
When reconstructed on the pion minus and on the proton, the distributions turn out to be the same, so only one of them is posted.
5. Z Vertex and X vs Y vertex
6. Global Variables : Phi and Rapidity
Reco Pion | Reco Kaon |
7. pt
8. Randomness
QA Phi -> KK (March 05 2009)
1. Dedx
Reconstruction on the kaon daughters. Two different plots are shown just for the sake of completeness.
2. Dca
A 3D histogram was created and filled with pT, eta and DCA as the three coordinates. Projections in pT and eta bins were made to create this "matrix" of plots.
pT bin array used: {0.5, 0.6, 0.8, 1.0} (moves down) and
eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right)
For the error bars, the option hist->Sumw2() was used.
3. Nfit
Similarly, a 3D histogram was created with pT, eta and NFit as coordinates. The respective projections were made in the same pT and eta bins as the DCA distributions.
pT bin array used: {0.5, 0.6, 0.8, 1.0} (moves down) and
eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right)
For the error bars, the option hist->Sumw2() was used.
4. Delta Vertex
5. Z Vertex and X vs Y vertex
6. Global Variables : Phi and Rapidity
7. pt
8. Randomness
QA P08id (Phi -> KK)
This is the QA for P08id (phi -> KK). Reconstruction on global tracks and just kaons. The macro from Xianglei was used (I found the QA macro very familiar). A scale factor of 1.38 was applied.
1. dEdx
Reconstruction on the kaon daughters. Two different plots are shown to see how the Monte Carlo looks on top of the ghost tracks.
2. DCA Distributions
A 3D histogram was created and filled with pT, eta and DCA as the three coordinates. Projections in pT and eta bins were made to create this "matrix" of plots.
pT bin array used: {0.5, 0.6, 0.8, 1.0} (moves down) and
eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right)
For the error bars, the option hist->Sumw2() was used.
3. NFit Distributions
Similarly, a 3D histogram was created with pT, eta and NFit as coordinates. The respective projections were made in the same pT and eta bins as the DCA distributions.
pT bin array used: {0.5, 0.6, 0.8, 1.0} (moves down) and
eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right)
For the error bars, the option hist->Sumw2() was used.
4. Delta Vertex
The following are the Delta Vertex (Vertex Reconstructed - Vertex Embedded) plots for the three different coordinates (x, y and z).
5. Z Vertex and X vs Y vertex
6. Global Variables : Phi and Rapidity
7. Pt
Embedded Phi meson with flat pT (black) and reconstructed kaons (red).
8. Randomness Plots
The following plots check the randomness of the input Monte Carlo (MC) tracks.
QA ALambda -> P, pi (03 08 2009)
1. Dedx
Reconstruction on the proton and pion daughters. Two different plots are shown just for the sake of completeness.
Reco Proton | Reco Pion |
2. Dca
A 3D histogram was created and filled with pT, eta and DCA as the three coordinates. Projections in pT and eta bins were made to create this "matrix" of plots.
pT bin array used: {0.5, 0.6, 0.8, 1.0} (moves down) and
eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right)
For the error bars, the option hist->Sumw2() was used.
*. Reconstructing on Proton
*. Reconstructing on Pion
3. Nfit
Similarly, a 3D histogram was created with pT, eta and NFit as coordinates. The respective projections were made in the same pT and eta bins as the DCA distributions.
pT bin array used: {0.5, 0.6, 0.8, 1.0} (moves down) and
eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right)
For the error bars, the option hist->Sumw2() was used.
*. Reconstructing Proton
*. Reconstructing Pion
4. Delta Vertex
When reconstructed on protons and on pions, the distributions turn out to be the same, so only one of them is posted.
5. Z Vertex and X vs Y vertex
6. Global Variables : Phi and Rapidity
Reco Proton | Reco Pion |
7. pt
8. Randomness
Xi -> Lambda + Pion- -> P + Pion- + Pion-
(03 08 2009)
1. Dedx
Reconstruction on pi minus.
2. Dca
A 3D histogram was created and filled with pT, eta and DCA as the three coordinates. Projections in pT and eta bins were made to create this "matrix" of plots.
pT bin array used: {0.5, 0.6, 0.8, 1.0} (moves down) and
eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right)
For the error bars, the option hist->Sumw2() was used.
3. Nfit
Similarly, a 3D histogram was created with pT, eta and NFit as coordinates. The respective projections were made in the same pT and eta bins as the DCA distributions.
pT bin array used: {0.5, 0.6, 0.8, 1.0} (moves down) and
eta bin array: {0.2, 0.5, 0.8, 1.0} (moves to the right)
For the error bars, the option hist->Sumw2() was used.
4. Delta Vertex
5. Z Vertex and X vs Y vertex
6. Global Variables : Phi and Rapidity
7. pt
8. Randomness
Please find the QA plots here
The data looks good.
The reconstructed phi meson shows a small rapidity dependence.
QA PionMinus
These are dedx vs P graphs. All of them show reasonable agreement with data. (Done in May 2002)
Pi Minus | K Minus | Proton |
Pi Plus | K Plus | P Bar |
PI PLUS. In the following dca distributions some discrepancy is shown. Due to secondaries?
Peripheral 0.1 GeV/c < pT < 0.2 Gev/c | MinBias 0.2 GeV/c < pT < 0.3 Gev/c | Central 0.3 GeV/c <pT < 0.4 GeV/c |
PI MINUS. Some discrepancy is shown. Due to secondaries?
Peripheral 0.1 GeV/c < pT < 0.2 Gev/c | MinBias 0.2 GeV/c < pT < 0.3 Gev/c | Central 0.3 GeV/c <pT < 0.4 GeV/c |
K PLUS. In the following dca distributions Good agreement with data is shown.
Peripheral 0.1 GeV/c < pT < 0.2 Gev/c | MinBias 0.2 GeV/c < pT < 0.3 Gev/c | Central 0.3 GeV/c <pT < 0.4 GeV/c |
K MINUS. In the following dca distributions Good agreement with data is shown.
Peripheral 0.1 GeV/c < pT < 0.2 Gev/c | MinBias 0.2 GeV/c < pT < 0.3 Gev/c | Central 0.3 GeV/c <pT < 0.4 GeV/c |
PROTON. The real data Dca distribution is wider, especially at low pT -> Most likely due to secondary tracks in the sample. A tail from background protons dominating distribution at low pt can be clearly seen. Expected deviation from the primary MC tracks.
Peripheral 0.2 GeV/c < pT < 0.3 Gev/c | MinBias 0.5 GeV/c < pT < 0.6 Gev/c | Central 0.7 GeV/c <pT < 0.8 GeV/c |
Pbar. The real data Dca distribution is wider, especially at low pT -> Most likely due to secondary tracks in the sample.
Peripheral 0.2 GeV/c < pT < 0.3 Gev/c | MinBias 0.3 eV/c < pT < 0.4 Gev/c | Central 0.5 GeV/c <pT < 0.6 GeV/c |
PI PLUS. Good agreement with data is shown.
Peripheral 0.1 GeV/c < pT < 0.2 GeV/c | MinBias 0.2 GeV/c < pT < 0.3 GeV/c | Central 0.3 GeV/c < pT < 0.4 GeV/c |
PI MINUS. Good agreement with data is shown.
Peripheral 0.1 GeV/c < pT < 0.2 GeV/c | Peripheral 0.2 GeV/c < pT < 0.3 GeV/c | Peripheral 0.3 GeV/c < pT < 0.4 GeV/c |
K PLUS. Good agreement with data is shown.
Peripheral 0.1 GeV/c < pT < 0.2 GeV/c | MinBias 0.2 GeV/c < pT < 0.3 GeV/c | Central 0.3 GeV/c < pT < 0.4 GeV/c |
K MINUS. Good agreement with data is shown.
Peripheral 0.1 GeV/c < pT < 0.2 GeV/c | Peripheral 0.2 GeV/c < pT < 0.3 GeV/c | Peripheral 0.3 GeV/c < pT < 0.4 GeV/c |
PROTON. Good agreement with data is shown.
Peripheral 0.2 GeV/c < pT < 0.3 GeV/c | Peripheral 0.3 GeV/c < pT < 0.4 GeV/c | Peripheral 0.5 GeV/c < pT < 0.6 GeV/c |
PBAR. Good agreement with data is shown.
Peripheral 0.2 GeV/c < pT < 0.3 GeV/c | Peripheral 0.3 GeV/c < pT < 0.4 GeV/c | Peripheral 0.5 GeV/c < pT < 0.6 GeV/c |
Cross reference to Reconstruction Code QA
SVT Alignment and June 2006 review
Summary pages
SVT+SSD Alignment, Run VII (2007)
Alignment
Software for Tracking Upgrade (Challenge week 01/17/06-01/24/06)
Agenda
Results of Kolmogorov tests for QA histograms for different versions of STAR codes
New STAR magnetic field
When and what dEdx Prediction for P10 has to be used
Reconstruction plans
Usage of Truth information in reconstruction.
STAR track flags.
ITTF
integration week January 2004
p-p software workshop at BNL 2001/11/19
Final Agenda and Talks. Minutes from the meeting are here.
Integrated Tracking Task Force (ITTF)
The official web site maintained by Claude Pruneau is here. See also the STAR Kalman filter documentation created by Claude Pruneau.
LBL Tracking review report
Available in MS-WORD and PDF format. Some talks given are linked here.
Kalman in BABAR
A note on Kalman is here in .ps format.
Kalman in ATLAS
Igor Gavrilenko's presentation for 5/22/00 in PowerPoint format; ATLAS internal note with xKalman description. Spiros' talk on the video meeting of June 2, 2000 is here in PowerPoint format.
Flagging scheme for tracks failing the Kalman fitter (Al Saulys)
Kalman Fitter Evaluation page I (Al Saulys)
Kalman Fitter Evaluation page II (Lee Barnby)
Kalman Fitter for V0 search (Al Saulys)
Kalman in STAR
A preliminary writeup of the Kalman implementation in STAR in use during 2001.
Current Work on Tracking/Fitting Tools
The group is currently looking into options for improving the global tracking and fitting. These include an implementation of GEANE as a universal track propagation engine, an interface between tracking and geometry/material info, and Kalman filtering techniques to obtain the best estimate of the track parameters.
Kalman literature (Spiros Margetis)
This section relates to the vertex finder algorithms in STAR, some models and approaches, and evaluation results. Vertex finder studies have historically been part of PWG activities under loose technical guidance from S&C, which provides the framework and a generic approach for adding more algorithms as our understanding grows with time.
Performance of ppLMV- historic note from 2001, by Jan
References for vertex finder review:
Event Reconstruction Techniques in NOvA (CHEP 2015)
http://indico.cern.ch/event/304944/session/2/contribution/277/attachments/578475/796605/chep_reconstruction_2015.pdf
Vertex finding by sparse model-based clustering (ACAT 2016)
https://indico.cern.ch/event/397113/session/22/contribution/209/attachments/1215150/1774584/ACAT2016_RF.pdf
Vertex Reconstruction in the ATLAS Experiment at the LHC
http://cds.cern.ch/record/1176154/files/ATL-INDET-PUB-2009-001.pdf
Efficiency of Primary Vertex Reconstruction ATLAS
http://www.phy.bnl.gov/~partsem/fy12/kirill_slides.pdf
Summary
J. Lauret, V. Perevoztchikov, D. Smirnov, G. Van Buren, J. C. Webb
November 18, 2015
The hard coded limit on the number of "bad" vertices has been raised from 5 to 150 in PPV
November 12, 2015
Here we looked at a few basic distributions for event observables to see if the embedding sample is consistent with the data. The intention is to understand why PPV and KFV relative performance is reversed in embedding and data samples.
PPV Embedding | PPV Data | KFV Embedding | KFV Data |
---|---|---|---|
November 11, 2015
In this test we made sure to use the same primary-vertex cuts in the PPV finder as in the original W analysis. As a result, the average efficiency increased from 0.62 to 0.76. It is still lower than the KFV efficiency of 0.81 (0.87) (see below).
November 10, 2015
In the code calculating the impurity, reconstructed vertices which do not have a matching MC vertex were incorrectly excluded from the total count. After fixing this, the "red" and "green" curves now add up to 1 as expected.
PPV (left) vs KFV (right)
November 5, 2015
Removed requirement on the minimum value of the Max Rank vertex rank (<0). PPV (left) vs KFV (right)
November 1, 2015
Results from the new 2013 W embedding samples: PPV (left) vs KFV (right)
The file list used for this embedding sample is: filelist_wbos_embed.txt
October 6, 2015
The following plots show vertex finding efficiencies for PPV (left) and KFV (right) as determined from a 50k event sample of Pythia simulated W-boson events without pileup located at:
/star/institutions/bnl_me/smirnovd/public/w_sim_nopileup_fzd/
/star/institutions/bnl_me/smirnovd/public/w_sim_nopileup_ppv/
/star/institutions/bnl_me/smirnovd/public/w_sim_nopileup_kfv/
The following options were used to reconstruct the samples
BFC_OPTIONS="fzin tpcRS y2014a AgML pxlFastSim istFastSim usexgeom FieldOn MakeEvent VFPPVnoCTB beamline Sti NoSsdIt NoSvtIt StiHftC TpcHitMover TpxClu Idst BAna l0 Tree logger genvtx tpcDB bbcSim btofsim tags emcY2 EEfs geantout evout -dstout IdTruth big clearmem"
BFC_OPTIONS="fzin tpcRS y2014a AgML pxlFastSim istFastSim usexgeom FieldOn MakeEvent KFVertex beamline Sti NoSsdIt NoSvtIt StiHftC TpcHitMover TpxClu Idst BAna l0 Tree logger genvtx tpcDB bbcSim btofsim tags emcY2 EEfs geantout evout -dstout IdTruth big clearmem"
September 24, 2015 Updated: October 1, 2015
The following plots show vertex finding efficiencies for PPV (left) and KFV (right) as determined from a 50k event sample of Pythia simulated W-boson events without pileup located at:
/star/institutions/bnl_me/smirnovd/public/amilkar/MuDst/
/star/institutions/bnl_me/smirnovd/public/amilkar/PPV2012/
The distribution for KFV is somewhat comparable to the 2011 and 2012 results shown below
The PPV case was reconstructed without the 'beamline' option.
September 14, 2015
The following plot with vertex finding efficiencies (default = PPV) was created using the refactored code from Amilkar (github.com/star-bnl/star-travex)
Here I used 10k events from Run 13 W-boson Pythia embedding simulation from Jinlong located at:
/star/data19/wEmbedding2013/pp500_production_2013/Wminus-enu_100_20145001/P14ig.SL14g/2013/
Compared to the 2011 and 2012 results shown below, the efficiency appears slightly better for lower-multiplicity vertices. The overall average efficiency is slightly higher: 0.50 vs 0.46.
August 05, 2015
Questions:
What exactly is the difference between 2011 and 2012 years?
Why KFV shows significantly different efficiency for 2011 and 2012?
From the above right-hand-side plots: does it actually mean that the KFV ranking works, and works better than the PPV one?
KFV does give lower efficiency for low-multiplicity vertices than PPV, but this is with no pileup! Could this explain the 10% loss in the W efficiency? See below for the case with pileup.
Questions:
From the above plots it does look like KFV also outperforms PPV even at low multiplicities with pileup.
Is the TMVA ranking better than the default one?
July 22, 2015
/star/data23/reco/pp500_production_2013/ReversedFullField/P15ic_VFPPV/2013/
/star/data26/reco/pp500_production_2013/ReversedFullField/P15ic_KFvertex_BL/2013/
More details can be found in the following email from Lidia:
http://www.star.bnl.gov/HyperNews-star/protected/get/starprod/648/1/1/1/1/1/1/3/1.html
The summary: We compared the output yields of the W analyses and found that KFV finds about 10% fewer W events than the PPV finder. Although in the standard W analysis (using PPV) the considered vertices are required to have a positive rank, we removed that requirement and let the framework consider ALL vertices found by KFV.
References:
PPV and KFVertex performance comparison based on simulation for y2011 & y2012 pp200 with pile-up - Amilkar/Jonathan/Yuri
We want to answer the following questions in this prioritized order.
For now use the following BFC chain:
"DbV20080712 pp2008 ITTF OSpaceZ2 OGridLeak3D beamLine VFPPVnoCTB debug logger"
Run 8 (200 GeV dAu and pp) , by Gene
Status: Evaluation of current BFC, no changes to PPV code yet
Run #9069005 will be used
http://online.star.bnl.gov/RunLog/Summary.php?run=9069005
Trigger Name | Trigger ID | # Events | daq source file | needed daq files | expected # triggers |
zerobias | 9300 | 525 | st_zerobias | all | 525 |
toftpx | 220710 | 354459 | st_toftpx | 5 | 90000 |
fms-slow | 220981 | 29914 | st_fmsslow | 20 | 20000 |
bbc | 220000 | 2646 | st_physics | all | 2646 |
bh1-mb | 220510 | 8612 | st_physics | all | 8612 |
etot-mb-l2 | 7 | 2853 | st_physics | all | 2853 |
jp1-mb-l2 | 8 | 5676 | st_physics | all | 5676 |
bh2-mb-slow | 220520 | 14236 | st_physics | all | 14236 |
daq files are located at
/star/data03/daq/2008/069/
9069005f 9069005t 9069005z 9069005p
bfc.C will be run in stardev with options:
root4star -b -q bfc.C'(1,1e6,"DbV20080820 pp2008 ITTF OSpaceZ2 OGridLeak3D beamLine VFPPVnoCTB debug logger","/star/data03/daq/2008/069/*")'
This version of PPV includes the August 2008 change in which Post Crossing Tracks are dropped, enacted in response to the change of the TPC cluster finder.
MuDst files will be placed at:
/star/data05/scratch/rjreed/PPV2008Eval/*
Observables to Monitor:
# primary vertices
Z location primary vertices
delta Z between primary vertex z position and bbc or vpd z position
# tracks associated with each primary vertex
Cuts to Monitor:
mMinTrkPt (Currently 0.20)
mMinFitPfrac (currently 0.70)
Include all EEMC rings
mMaxZradius
Weights for TPC, EEMC, BEMC
PPV performance Revision 1.29
Updated 9/7/2008
Trigger Name | Trigger ID # | # Events Expected | # Events Run |
zerobias | 9300 | 525 | 525 |
toftpx | 220710 | 90000 | 28720 |
fms-slow | 220981 | 20000 | 14299 |
bbc | 220000 | 2646 | 938 |
bh1-mb | 220510 | 8612 | 7507 |
etot-mb-l2 | 7 | 2853 | 2506 |
jpt-mb-l2 | 8 | 5676 | 4983 |
bh2-mb-slow | 220520 | 14236 | 12507 |
Table 2: Summary of Vertex finding efficiency and vertex matching with events processed by Sept 2. For the vpd, matched is defined as the 0 ranked PPV vertex is within 8 cm of the vpd vertex. For the bbc, matched is defined as the 0 ranked PPV vertex is within 32 cm of the bbc vertex.
Table 3: Summary of Vertex finding efficiency and vertex matching with events processed by Sept 7. For the vpd, matched is defined as the 0 ranked PPV vertex is within 20 cm of the vpd vertex. For the bbc, matched is defined as the 0 ranked PPV vertex is within 60 cm of the bbc vertex.
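The matching definitions used in Tables 2 and 3 can be sketched in plain C++ (no STAR libraries; the function and container names here are illustrative): an event is "matched" when the rank-0 PPV vertex z falls within a detector-dependent window (8 or 20 cm for the vpd, 32 or 60 cm for the bbc) of the other detector's vertex z, and the efficiency is the matched fraction of events.

```cpp
#include <cassert>
#include <cmath>
#include <utility>
#include <vector>

// One pair per event: (rank-0 PPV vertex z, vpd or bbc vertex z), in cm.
// Returns the fraction of events whose |z difference| is inside the window.
double matchEfficiency(const std::vector<std::pair<double, double>>& events,
                       double windowCm) {
    if (events.empty()) return 0.0;
    int matched = 0;
    for (const auto& e : events)
        if (std::fabs(e.first - e.second) < windowCm) ++matched;
    return static_cast<double>(matched) / events.size();
}
```

This also makes the effect of widening the window (8 → 20 cm for the vpd between the Sept 2 and Sept 7 tables) explicit: the same event sample can only gain matches, never lose them.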
zero bias
525 Events
# vertices,#events
0,461
1,62
2,2
5 Events with vpd + PPV vertex
15 Events with bbc + PPV vertex
Figure 2: Z position of rank 0 vertex for zero bias trigger. Rank 1 and above excluded due to low statistics.
Figure 3: Rank 0 PPV vertex Vz - vpd Vz for zero bias trigger.
toftpx
28720 Events
# Vertices, # Events
0,28110
1,565
2,39
3,6
559 Events with PPV rank 0 + vpd
393 Events match (within 8 cm)
41 Events with PPV rank 1 + vpd
7 Events match (within 8 cm)
558 Events with PPV rank 0 + bbc
428 Events match (within 32 cm)
44 Events with PPV rank 1 + bbc
20 Events match (within 32 cm)
509 Events with PPV rank 0 + bbc + vpd
296 Events match both vpd and bbc
Figure 5: Z position of rank 0 and rank 1 vertices for tof trigger.
Figure 6: Rank 0 PPV Vz - vpd Vz for tof trigger
Figure 7: Rank 0 PPV Vz - bbc Vz for the tof trigger
Figure 8: Delta Vz(PPV - vpdVz) vs delta Vz(PPV-bbc) for the tof trigger.
fms-slow
#Vertices,#Events
0,5504
1,7503
2,1173
3,110
4,8
5,1
2162 Events with PPV rank 0 + vpd
1435 Events match (within 8 cm)
389 Events with PPV rank 1 + vpd
98 Events match (within 8 cm)
6679 Events with PPV rank 0 + bbc
4617 Events match (within 32 cm)
1026 Events with PPV rank 1 + bbc
453 Events match (within 32 cm)
1954 Events with PPV rank 0 + bbc + vpd
992 Events match both vpd and bbc
Figures to be added later.
bbc
2646 Events
# Vertices, # Events
0,1157
1,1293
2,183
3,12
500 Events with PPV rank 0 + vpd
392 Events match (within 20 cm)
71 Events with PPV rank 1 + vpd
27 Events match (within 20 cm)
1409 Events with PPV rank 0 + bbc
1232 Events match (within 60 cm)
185 Events with PPV rank 1 + bbc
114 Events match (within 60 cm)
484 Events with PPV rank 0 + bbc + vpd
360 Events match both vpd and bbc
Figure 10: Vz position of PPV rank 0 and rank 1 vertices for the bbc trigger.
Figure 11: PPV Vz - vpd Vz for both rank 0 and rank 1 vertices for the bbc trigger.
Figure 12: PPV Vz - bbc Vz for both rank 0 and rank 1 vertices for bbc trigger
Figure 13: Delta Vz(PPV - vpdVz) vs delta Vz(PPV-bbc) for the bbc trigger.
bh1-mb
7507 Events
# Vertices, # Events
0,856
1,5494
2,1032
3,122
4,3
1761 Events with PPV rank 0 + vpd
1452 Events match (within 20 cm)
356 Events with PPV rank 1 + vpd
120 Events match (within 20 cm)
6275 Events with PPV rank 0 + bbc
5729 Events match (within 60 cm)
1082 Events with PPV rank 1 + bbc
676 Events match (within 60 cm)
1699 Events with PPV rank 0 + bbc + vpd
1309 Events match both vpd and bbc
Figure 15: Vz position of PPV rank 0 and rank 1 vertices for bh1 trigger
Figure 16: PPV Vz - vpd Vz for Rank 0 and Rank 1 vertices for bh1 trigger.
Figure 17: PPV Vz - bbc Vz for rank 0 and rank 1 vertices for bh1 trigger
Figure 18: Delta Vz(PPV - vpdVz) vs delta Vz(PPV-bbc) for the bh1 trigger.
etot-mb-l2
2506 Events
# Vertices, # Events
0,95
1,1768
2,539
3,96
4,7
549 Events with PPV rank 0 + vpd
380 Events match (within 20 cm)
204 Events with PPV rank 1 + vpd
61 Events match (within 20 cm)
2252 Events with PPV rank 0 + bbc
2013 Events match (within 60 cm)
599 Events with PPV rank 1 + bbc
367 Events match (within 60 cm)
530 Events with PPV rank 0 + bbc + vpd
339 Events match both vpd and bbc
Figure 20: Vz position of rank 0 and rank 1 vertices for etot trigger
Figure 21: PPV Vz - vpd Vz for rank 0 and rank 1 vertices for etot trigger
Figure 22: PPV Vz - bbc Vz for rank 0 and rank 1 vertices for etot trigger
Figure 23: Delta Vz(PPV - vpdVz) vs delta Vz(PPV-bbc) for the etot trigger.
jpt-mb-l2
4983 Events
# Vertices, # Events
0,240
1,3751
2,860
3,120
4,12
1114 Events with PPV rank 0 + vpd
869 Events match (within 20 cm)
292 Events with PPV rank 1 + vpd
84 Events match (within 20 cm)
4419 Events with PPV rank 0 + bbc
4040 Events match (within 60 cm)
927 Events with PPV rank 1 + bbc
553 Events match (within 60 cm)
1072 Events with PPV rank 0 + bbc + vpd
777 Events match both vpd and bbc
Figure 25: Vz position of rank 0 and rank 1 vertices for jp1 trigger.
Figure 26: PPV Vz - vpd Vz for jp1 trigger
Figure 27: PPV Vz - bbc Vz for jp1 trigger
Figure 28: Delta Vz(PPV - vpdVz) vs delta Vz(PPV-bbc) for the jp1 trigger.
bh2-mb-slow
12507 Events
# Vertices, # Events
0,1292
1,9168
2,1830
3,209
4,7
2970 Events with PPV rank 0 + vpd
2402 Events match (within 20 cm)
649 Events with PPV rank 1 + vpd
216 Events match (within 20 cm)
10555 Events with PPV rank 0 + bbc
9625 Events match (within 60 cm)
1929 Events with PPV rank 1 + bbc
1200 Events match (within 60 cm)
2861 Events with PPV rank 0 + bbc + vpd
2164 Events match both vpd and bbc
Figure: Vz position of rank 0 and rank 1 vertices for the bh2 trigger
Figure: PPV Vz - bbc Vz for rank 0 and rank 1 vertices for the bh2 trigger
Figure: Delta Vz(PPV - vpdVz) vs delta Vz(PPV - bbcVz) for the bh2 trigger.
Figure: PPV Vz - vpd Vz for rank 0 and rank 1 vertices for the bh2 trigger
This compares the efficiency of the current PPV vertex finder (revision 1.29, which includes the PCT fix) to the suggested change of including tracks with 0.1 GeV/c < pT < 0.2 GeV/c in the algorithm. Currently PPV only uses tracks with pT > 0.2 GeV/c. Since tracks with pT ~ 0.1 GeV/c will not reach the BEMC and would be rejected by the PPV algorithm, we included only those tracks with 0.1 < pT < 0.2 GeV/c which cross the central membrane, and we did not require them to point to a tower.
The bfc code is at:
/star/data05/scratch/rjreed/PPV2008Eval/Vertex5/
The daq files processed as of 9/7/2008 are at:
/star/data05/scratch/rjreed/PPV2008Eval/MuDstPt100/
Table 1: Efficiency comparison between PPV revision 1.29 and the altered PPV which accepts lower-pT tracks. Matching with the vpd is defined as the rank 0 PPV vertex being within 20 cm of the vpd vertex, and matching with the bbc as the rank 0 PPV vertex being within 60 cm of the bbc vertex.
Table 2: Breakdown of the number of events used to calculate the efficiencies of the altered PPV vertex algorithm.
Table 3: PPV revision 1.29 statistics posted here for ease of comparison
The following data were taken with triggers bh1-mb (ID 220510), etot-mb-l2 (ID 7), jpt-mb-l2 (ID 8), and bh2-mb-slow (ID 220520). The total number of events in this set was 16351.
Figure 1: Delta Vz between the PPV vertex closest to the VPD vertex and the vpd. PPV ranking was ignored.
Figure 2: Delta Vz between the second-closest PPV vertex and the vpd vertex.
Figure 3: Figure 1 fitted with an unconstrained Gaussian.
Figure 4: Figures 1 and 2 (red) plotted on top of each other with a bin width of 6.3 cm (2 times the sigma fitted in Figure 3).
BFC
"DbV20080712 pp2008 ITTF OSpaceZ2 OGridLeak3D beamLine VFPPVnoCTB debug logger"
/star/data05/scratch/balewski/2008-daq/st_fmsslow_9060086_raw_1090001.MuDst.root
Detailed event count
1271 events; 256 events with no PPV vertex; 271 events with a vpd vertex; 144 with at least one PPV vertex matching the vpd; 136 events with the rank-0 vertex matching the vpd; 10 events with the rank-1 vertex matching the vpd; 49 events with no PPV vertex and a vpd vertex; 5 events with multiple vertices matching the vpd; 1 event with a rank-2 vertex matching the vpd (multiple match); 1 event with a rank-3 vertex matching the vpd (multiple match).
44 events out of 222 with both PPV and vpd vertices have all vertices outside 50 cm of the vpd; 11 of these have more than one PPV vertex.
Conclusion: out of 271 events with VPD vertex:
VertexAnalysis8cm.txt file lists (for the events with a vpd vertex) the event #, # of vertices, rank of each vertex that matches vpd.
I've run through 5 events where the vpd and the PPV vertices don't match. The daq file location is:
/star/data05/scratch/balewski/2008-daq/st_fmsslow_9060086_raw_1090001.daq
Here are the PPV Vz values prior to "forcing" the values. For the histograms, the solid red lines indicate the location of the vpd vertex and the circles indicate the location(s) of the PPV vertices.
EventID = 749 vpdVz = 98.2939 PPV vertices at Vz = -48.75 140.85
EventID = 2115 vpdVz = -34.348 PPV vertices at Vz = -106.55
EventID = 3346 vpdVz = -33.7838 PPV vertices at Vz = 129.35
EventID = 3952 vpdVz = 0.664319 PPV vertices at Vz =
EventID = 4447 vpdVz = 25.5813 PPV vertices at Vz = -99.95
For comparison, here is an event where the VPD vertex and PPV vertex were within 8 cm of each other:
All the important files (including the likelihood histograms and track multiplicities) can be found at:
/star/u/rjreed/Vertex3/Eventdaq*
Here are the results:
daq #7 EventID = 749 vpdVz = 98.2939 # PPV vertices = 10 at Vz = 90 92 94 96 98 100 102 104 106 108
Primaries that passed the cut (flag>0, pt>0.2GeV, nFitP/nPoss>0.51):
id = 277 flag = 801 Nhits = 7 Npos = 11 pt = 0.370926 frac = 0.636364
id = 278 flag = 801 Nhits = 6 Npos = 11 pt = 0.250512 frac = 0.545455
id = 279 flag = 801 Nhits = 6 Npos = 11 pt = 0.271426 frac = 0.545455
id = 281 flag = 801 Nhits = 8 Npos = 11 pt = 1.57357 frac = 0.727273
id = 283 flag = 801 Nhits = 6 Npos = 11 pt = 0.638242 frac = 0.545455
id = 292 flag = 801 Nhits = 5 Npos = 9 pt = 0.292521 frac = 0.555556
id = 295 flag = 801 Nhits = 7 Npos = 11 pt = 0.213841 frac = 0.636364
daq #22 EventID = 2115 vpdVz = -34.348 N PPV vertices = 10 at Vz = -42 -40 -38 -36 -34 -32 -30 -28 -26 -24
Primaries that passed the cut:
id = 48 flag = 301 Nhits = 40 Npos = 45 pt = 0.362589 frac = 0.888889
id = 63 flag = 301 Nhits = 26 Npos = 34 pt = 0.602728 frac = 0.764706
id = 64 flag = 301 Nhits = 35 Npos = 45 pt = 0.392374 frac = 0.777778
id = 66 flag = 301 Nhits = 32 Npos = 45 pt = 0.772336 frac = 0.711111
id = 288 flag = 801 Nhits = 6 Npos = 11 pt = 2.56192 frac = 0.545455
id = 290 flag = 801 Nhits = 9 Npos = 11 pt = 0.206602 frac = 0.818182
id = 291 flag = 801 Nhits = 10 Npos = 11 pt = 0.556056 frac = 0.909091
id = 292 flag = 801 Nhits = 8 Npos = 11 pt = 0.722384 frac = 0.727273
id = 296 flag = 801 Nhits = 6 Npos = 11 pt = 0.236179 frac = 0.545455
id = 297 flag = 801 Nhits = 6 Npos = 11 pt = 0.267664 frac = 0.545455
id = 298 flag = 801 Nhits = 6 Npos = 11 pt = 0.9994 frac = 0.545455
id = 299 flag = 801 Nhits = 5 Npos = 8 pt = 0.332944 frac = 0.625
id = 300 flag = 801 Nhits = 9 Npos = 11 pt = 0.389091 frac = 0.818182
id = 310 flag = 801 Nhits = 5 Npos = 9 pt = 0.236941 frac = 0.555556
daq #44 EventID = 3346 vpdVz = -33.7838 N PPV vertices = 10 at Vz = -42 -40 -38 -36 -34 -32 -30 -28 -26 -24
Primaries that passed the cut:
id = 129 flag = 301 Nhits = 38 Npos = 42 pt = 0.352418 frac = 0.904762
id = 433 flag = 801 Nhits = 6 Npos = 9 pt = 0.73871 frac = 0.666667
id = 436 flag = 801 Nhits = 6 Npos = 11 pt = 4.33162 frac = 0.545455
id = 441 flag = 801 Nhits = 7 Npos = 11 pt = 1.19905 frac = 0.636364
id = 446 flag = 801 Nhits = 6 Npos = 11 pt = 2.52832 frac = 0.545455
daq #63 EventID = 3952 vpdVz = 0.664319 N PPV vertices = 10 at Vz = -8 -6 -4 -2 0 2 4 6 8 10
Primaries that passed the cut:
id = 177 flag = 301 Nhits = 35 Npos = 40 pt = 0.751434 frac = 0.875
id = 433 flag = 801 Nhits = 7 Npos = 11 pt = 1.16509 frac = 0.636364
id = 437 flag = 801 Nhits = 6 Npos = 10 pt = 0.277501 frac = 0.6
id = 440 flag = 801 Nhits = 6 Npos = 11 pt = 6.03469 frac = 0.545455
daq #70 EventID = 4447 vpdVz = 25.5813 N PPV vertices = 10 at Vz = 18 20 22 24 26 28 30 32 34 36
Primaries that passed the cut:
id = 324 flag = 311 Nhits = 9 Npos = 10 pt = 0.50257 frac = 0.9
id = 391 flag = 801 Nhits = 6 Npos = 11 pt = 0.332764 frac = 0.545455
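The primary-track cut quoted above (flag > 0, pT > 0.2 GeV/c, nFitP/nPoss > 0.51) can be sketched as a small predicate in plain C++; the struct and field names below are illustrative, not the actual MuDst API.

```cpp
#include <cassert>

// Minimal stand-in for the per-track quantities printed above.
struct Track {
    int flag;     // track flag; must be positive
    int nFit;     // number of fitted points (Nhits above)
    int nPoss;    // number of possible points (Npos above)
    double pt;    // transverse momentum in GeV/c
};

// True if the track passes the quoted primary-track selection.
bool passesCut(const Track& t) {
    return t.flag > 0 && t.pt > 0.2 &&
           static_cast<double>(t.nFit) / t.nPoss > 0.51;
}
```

For example, the track with id = 277 above (flag 801, 7 of 11 fit points, pT 0.37 GeV/c) passes, while a track with only 5 of 11 fit points fails the fit-fraction requirement.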
-------- Loose notes, needs cleanup, Jan
Low Lumi (run9060086), Mid Lumi (run9069059), High Lumi (run 9068124)
/star/data10/reco/ppProduction2008/ReversedFullField/P08ic_test/2008/*/*
log files on /star/rcf/prodlog/P08ic_test/log/daq
--------------
Akio's event classification txt file from 3 files (with different luminosity) at the bottom of
http://www.star.bnl.gov/protected/spin/akio/200806/index_5th.html
from Lidia's test production. I checked that I get identical results
with Jan's production and Lidia's for the low lumi run.
-------------
Just a reminder that there is an "accidental match" between the VPD and
TPC vertex of up to ~25% (at high lumi) under the peak.
--------------
* could you show Rosi how to print VPD & BBC vertex position in BFC?
Once Mudst is created, you can get VPD vertex (from TOF electronics) by
StMuEvent->vpdVz()
For BBC I have not yet implemented the calibrated vertex. Once done,
one should be able to get it from
StEvent->triggerDetectorCollection()->bbc()->zVertex()
--------- Rosi ---------
While I was doing this, I spun a macro over the MuDst in the folder
above and duplicated some of Akio's results, just to make sure I
understand.
So here are the z positions of the ranks 1 and 2 vertices and the vpd:
Fig 1.
August 13, 3 runs produced with PPV w/o using CTB
-----
Fig 1. VPD-Minuit, d-Au 2008 events, minB events: ZDC East+VPD
Fig 2. VPD-PPV, p-p 2008 events, st_physics , no trig selection, mostly E & BEMC triggers
Xin Dong wrote:

Hi Jan, Akio, Xiaoping helped me check the test 2008 pp production data Jan suggested. Please take a look at the attached plot. What is plotted here is the vz difference between the vpd vertex and the tpc vertex for different vpd hit configurations, similar to Akio's web page, but using the TOF electronics for the vpd hit-configuration selection. Firstly, we did not see the ~30 cm wide Gaussian component. The two narrow Gaussian components are attributed to the VPD resolution: in the VPD timing resolution we see a double-Gaussian structure with about a factor of 2-3 difference in widths, which is consistent with what we see here. The largest, ~50-60 cm Gaussian component should be due to the failure of the TPC vertex finder and is related to the beam vz distribution. Secondly, the resolution in the (E,W)>(1,1) configuration is expected to be better than in the (E,W)=(1,1) configuration. The "dilution" of the vpd resolution with more hits does not seem real to us. Generally, we don't see an obvious issue on the VPD side. I am not sure how the result will change when you use the DSM for selection. Or maybe your statistics are not good enough? Or the data are from some bad TOF runs? Thanks and Best Regards /xin

Jan Balewski wrote:

Hi Xin, those 2 analyses need not be contradictory. There is much less pileup in dAu than in pp, and there may be beam background in pp. Can you investigate this effect in the 2008 pp data from the production requested by Matt ~2 weeks ago? It is done; files are on data09,10.
http://www.star.bnl.gov/HyperNews-star/get/starprod/249/4/1/1/1/1/2.html
It is 3K daq files, 1M events with TPC; 1/4 of the events have a VPD vertex, and ~90% have a TPC vertex produced by the fixed PPV. Thanks, Jan

On Oct 1, 2008, at 11:40 PM, Xin Dong wrote:

Hi Akio, thanks for this message. Actually we always see that the resolution improves if we require more VPD hits. I don't quite understand the ~25 cm Gaussian distribution at this step. Xiaoping helped me check the dAu data (we don't have a TPC vertex in pptoftpx-triggered data); you can find the distribution in the attached plot. It shows that with a stronger VPD hit requirement the vertex resolution is better; no 25 cm wide Gaussian contribution appears. So let me answer your questions directly, inline below.

Akio Ogawa wrote:

Hello, I posted this yesterday to the vertex mailing list. I'd like to make sure you know this since you may be more interested than us. In the zVPD-zTPC distribution in pp, we see 3 structures. See Fig 2 (2-Gaussian fit) and Fig 6 (3-Gaussian fit) of
http://drupal.star.bnl.gov/STAR/blog-entry/rjreed/2008/sep/11/ppv-revision-1-29-high-luminosity-fmsslow-trigger-evaluation
The first is a sigma ~3 cm peak where the TPC and VPD vertices match; quite reasonable given the resolutions of those two vertex finders. The 2nd is sigma ~80 cm, which is understandable if the TPC and VPD picked up 2 different vertices: the vertex distribution is ~Gaussian with sigma ~60 cm, and if we pick 2 randomly and take the difference, the sigma should be sqrt(2)*60 cm ~ 85 cm. The 3rd one is the mystery. Its sigma is around 25-30 cm, so it is much narrower than random. It is hard for the TPC to "miss" a vertex by 10-20 cm, since every track's DCA_z is <3 cm. Rosi changed the selection of the TPC vertex (more matched tracks) to make the TPC vertex better; she saw no difference in the structure. Now look at the plot at the bottom of
http://www.star.bnl.gov/protected/spin/akio/200806/index_8th.html
which is essentially the same plot but divided by the # of VPD hits. This 3rd structure with sigma ~25 cm is most evident when both VPD-E and VPD-W have 2 or more hits. This suggests (at least to me) that when you have more than one hit in the VPD and take the time average, you sometimes dilute the VPD vertex resolution. This can be "real" (another collision in the same crossing, or some beam halo hitting the VPD) or a detector effect (hot PMT, too loose a timing cut, etc.). Have you seen this?
==> Xin: No.
Is there a way to get some more info from the MuDst?
==> Xin: The number of hits and the Tdiff information should be available from the MuDst. A Tdiff cut may help the resolution some, but shouldn't create a 20 cm Gaussian peak.
(For example, the distance or rms of the hits included in the average?)
==> Xin: I don't quite understand; the distance or rms of the hits relative to what?
Is there some cut you tuned when accepting a hit to form the average?
==> Xin: Yes. We have already removed the hits with non-physical timing information (out of the trigger timing window, but for sure with 25 ns resolution). And we always take the earliest hit.
(For example, a maximum time difference?) Is the average weighted by ToT?
==> Xin: No. Supposedly the ToT dependence is calibrated; we just do a simple average.
Have you tried taking the earliest hit only?
==> Xin: Yes. We will try to see what the pp data look like. Thanks /xin
plots
The following deficiencies of the CVS version of PPV were corrected in December 2008.
Problem: For W-events there may be fewer than 2 tracks in the eta range [-1, +1.4] satisfying the above requirement. Although a recent change in PPV causes 5 additional sub-prime vertices to be saved with negative rank, it does not guarantee that a vertex containing just a single 20+ GeV track will beat other minBias vertices from the pileup and make it into this top 5.
Remedy: Extend criteria for valid primary vertex and save also those which contain at least one track with pT>15 GeV matched to BTOW or ETOW tower with ADC>=MIP. I do not want to impose ADC>1000 cut, because reco high pT electron track may miss the hit-tower and point to the neighbor one. There will be very few events with such high pT tracks so on average # of vertices per event will not change.
Implementation: PPV 'natural ranking' for pp events has dynamic range of [0+eps ... 10,000].
Consequences: The 1-track vertex will be listed after any 2+ track vertex. However, if no 2-track vertex is found, the 1-track vertex will be first on the list with a positive rank. I maintain people should not use the top-rank PPV vertex blindly but should QA the primary vertices according to their needs.
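The ordering described above can be sketched as a rank encoding in plain C++. The exact offsets below are an assumption inferred from the quoted dynamic range (natural rank in [0+eps, 10000], figure axis out to +/-1.2e6, and the later note that 2+ matched-track vertices have rank > 1e6); only the ordering it produces is claimed by the text.

```cpp
#include <cassert>

// Hypothetical rank encoding (offsets assumed, not from PPV source):
// vertices with 2+ matched tracks are shifted above every 1-track vertex,
// and sub-prime vertices carry a negative rank, so sorting by rank in
// descending order lists the three categories in the intended sequence.
double encodeRank(double naturalRank, int nMatchedTracks, bool subPrime) {
    if (subPrime) return -1.0e6 + naturalRank;            // always listed last
    if (nMatchedTracks >= 2) return 1.0e6 + naturalRank;  // listed first
    return naturalRank;  // 1-track vertex: positive, beats only sub-prime
}
```

With this encoding, even the worst 2+ track vertex outranks the best 1-track vertex, and a 1-track vertex still carries a positive rank when no 2-track vertex exists, matching the consequences spelled out above.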
Modified PPV produces vertices with the following rank distribution:
Fig 1. Example of the PPV vertex rank distribution for 200 W-events generated by Pythia. The top plot shows the vertex rank, X-axis range [-1.2e6, +1.2e6]. The 3 groups of vertices are those with 2 or more matched tracks (far right), 1-track vertices (12 events in the middle), and sub-prime vertices (negative rank). The bottom plot shows the same events, but Log10(rank) is used on the X-axis to make these 3 categories more visible.
Fig 2. Example of PPV vertex rank distribution for 200 st_physics 2008 pp events.
No change needed: PPV tags a BTOW tower as fired if ADC > 8. This item is here because I misremembered something about the PPV code; the plot is correct and nice, so I leave it for the record.
04 : MC study on PPV with BTOF in Run 9 geometry (Xin blog)
Study of vertex reconstruction in transverse X-Y plane, pp 500 data from 2009, high PT events from W-stream
A large variance in the initial determination of the beam line constraint for the pp 500 data has been observed. The concern was that the reconstruction accuracy for high-pT TPC electron tracks from W decay may not be sufficient.
At first, the simple-minded idea of increasing the minimal pT of the used tracks and imposing a high track multiplicity did not improve the accuracy of the vertex determination in the transverse plane.
Next we looked at individual events passing the following selection criteria, scanning through the most likely primary-track candidates from the pool of global tracks:
Fig 1. Typical spectra for some of the cut parameters for W-stream pp 500 events
Tracks passing selection are approximated by straight lines in the vicinity of DCA to X=Y=0 and shown in Fig 2. Z-axis range is always 6 cm, centered at the max likelihood of PPV.
The following encoding was added to plots:
* the head of an arrow indicates the direction of the momentum vector
* the size of an arrow is proportional to the track pT; the max pT for a given set of tracks (event) is in the title of the left histogram
* the thickness of a line is proportional to the weight of the track in the vertex (or beam line) determination; I used the formula:
width = 3.*(0.15*0.15)/sig/sig; , where sig = sigYlocal from Sti.
(The last 2 conditions sometimes interfere, since a thicker line also increases the arrow size, but the plots should still help us gain intuition.)
Fig 2. Projections of global tracks at the most likely vertex location. One event per row, two projections: Y vs. X and Y vs. Z.
Stray tracks are most likely from pileup or from decays matched to fired EMC towers.
The width of the arrows is proportional to the likelihood that the vertex is below it (~1/track error^2).
Attachments A,B show more real data events.
Attachments C,D show M-C Pythia QCD events with partonic pT>10 & 20 GeV, respectively. C has fixed vertex offset, D has varied vertex offset.
Conclusion:
*Very often one sees 2 jets, which impedes the determination of the transverse vertex position on an event-by-event basis, in particular if the vertex finder does not return the non-diagonal covariance matrix element covXY (see the last event in Fig 2).
* we will pursue an alternative method of beam line determination by fitting its equation directly to preselected tracks from multiple events, skipping the event-by-event vertex determination.
Stand-alone 3D beam line fitter developed by Jan & Rosi in June 2009
Fig 2. Example of X0,Y0 fit for pp 500 data F10415, more in att. A)
Attachment A): slides vetting the 3D beam-line fitting algorithm
Attachment B): document describing math used to compute 3D likelihood
Attachment C: Source code for fitting contains:
(July 9, 2009)
1) The threshold on pT for single-matched-track vertices was lowered from 15 GeV/c to 10 GeV/c.
The purpose of this change is to not lose W-events in the case when the TPC calibration is approximate and the error on the reconstructed PT of a TPC track is sizable for tracks with true PT of 20 GeV/c.
Those vertices will now be more likely to be pileup-contaminated, since there is a fair chance of a random match of a global track to a fired BTOW tower. Users should use vertices with at least 2 matched tracks, which will have rank>1e6.
2) Additional expert-only functionality was added to PPV, encapsulated in the new class Vertex3D.
If BFC is run in the normal way, e.g. in production, no new action is taken.
However, if the BFC option "VtxSeedCalG" is added, then for every event high-quality, most likely primary-track candidates are selected from the pool of global tracks
and printed to the logfile in the format:
printf("track4beamLine %f %f %f %f %f %f %f %f %f %d %f %.1f %d \n",x,y,z,px,py,pz,er->_cYY,er->_cZY,er->_cZZ , tr.nFitPoint,tr.gChi2,z0,eveID);
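For downstream beam-line fitting, these logfile lines can be parsed back using the same layout. A minimal sketch, where the struct and function names are illustrative (not part of StRoot); only the line format itself comes from the printf above:

```cpp
#include <cstdio>

// Illustrative container for one "track4beamLine" logfile record.
struct BeamLineTrack {
    float x, y, z, px, py, pz;   // DCA point and momentum
    float cYY, cZY, cZZ;         // track error-matrix elements
    int   nFitPoint;             // number of fitted TPC points
    float gChi2, z0;
    int   eveID;                 // event ID
};

// Parse one log line; returns true if all 13 fields were read.
bool parseTrack4BeamLine(const char* line, BeamLineTrack& t) {
    int n = std::sscanf(line,
        "track4beamLine %f %f %f %f %f %f %f %f %f %d %f %f %d",
        &t.x, &t.y, &t.z, &t.px, &t.py, &t.pz,
        &t.cYY, &t.cZY, &t.cZZ, &t.nFitPoint, &t.gChi2, &t.z0, &t.eveID);
    return n == 13;
}
```

A grep for "track4beamLine" over production logs followed by this parser would recover the track pool for the stand-alone fitter.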
Fig. 1. prim tracks candidates in the vicinity of beam line
Fig. 2. QA plots for prim track selection
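The rank convention above (single-matched-track vertices may be pileup-contaminated; vertices with at least 2 matched tracks carry rank > 1e6) can be applied directly in analysis code. A minimal sketch, assuming only that the rank is available as a number on each vertex; the type and function names are illustrative, not StRoot API:

```cpp
#include <vector>

// Illustrative vertex record carrying the PPV rank.
struct VertexInfo {
    double rank;   // PPV rank as stored on the vertex
};

// True if the vertex has >= 2 BTOW-matched tracks, per the rank > 1e6 convention.
bool isMultiMatchedVertex(const VertexInfo& v) {
    return v.rank > 1.0e6;
}

// Count the accepted vertices in an event.
int countGoodVertices(const std::vector<VertexInfo>& vertices) {
    int n = 0;
    for (const VertexInfo& v : vertices)
        if (isMultiMatchedVertex(v)) ++n;
    return n;
}
```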
 | Minuit VF | KFV | PPV | PPV + fitter |
---|---|---|---|---|
Num Vertices | | | | |
Num Tracks per event | | | | |
Num Tracks per vertex | | | | |
Vertex X | | | | |
Vertex Y | | | | |
Vertex Z | | | | |
Vertex Error X | | | | |
Vertex Error Y | | | | |
Vertex Error Z | | | | |
Comparison of the total error magnitude of the reconstructed vertex position. The data are from a Run13 W simulation without pile-up. The vertex is reconstructed without (left) and with (right) the proposed PPV fitter.
PPV as is | PPV w/ fitter |
---|---|
PPV | KFV | Minuit |
---|---|---|
Welcome to the Simulation Pages!
Please note that most of the material posted before October 2006 is located at the older web site which we'll keep for reference, for the time being. See sections below for most recent additions and information.
For making a new simulation production request, please consult the STAR Simulations Requests interface.
Users wishing to develop and integrate new detector models into the STAR framework will be interested in the following links:
Tracking Interface (Stv)
Exporting detector hits
Implementing a custom seed finder
ID Truth
ID truth is an ID which enables us to determine which simulated particle was principally responsible for the creation of a hit in a detector, and eventually for the physics objects (clusters, points, tracks) which are formed from them. The StHit class has a member function which takes two arguments:
Interface to Starsim
The interface between starsim and reconstruction is briefly outlined here
Information about geometries used in production and which geometries to use in simulations may be found in the following links:
Comparisons between the original AgSTAR model and the new AgML model of the detector may be found here:
AgML Project Overview and Readiness for 2012
HOWTO Use Geometries defined in AgML in STARSIM
AgML geometries are available for use in simulation using the "eval" libraries.
$ starver eval
The geometries themselves are available in a special library, which is set up for backwards compatibility with starsim. To use the geometries you load the "xgeometry.so" library in a starsim session, either interactively or in a macro:
starsim> detp geom y2012
starsim> gexe $STAR_LIB/xgeometry.so
starsim> gclos all
HOWTO Use Geometries defined in AgML in the Big Full Chain
AgML geometries may also be used in reconstruction. To access them, the "agml" flag should be provided in the chain being run:
e.g.
root [0] .L bfc.C
root [1] bfc(nevents,"y2012 agml ...", inputFile);
Geometry in Preparation: y2012
Major changes:
1. Support cone, ftpc, ssd, pmd removed.
2. Inner Detector Support Module (IDSM) added
3. Forward GEM Tracker (FGTD) added
Use of AgML geometries within starsim:
$ starver eval
$ starsim
starsim> detp geom y2012
starsim> gexe $STAR_LIB/xgeometry.so
starsim> gclos all
Use of AgML geometries within the big full chain:
$ root4star
root [0] .L bfc.C
root [1] bfc(0,"y2012 agml ...",inputFile);
Current (10/24/2011) configuration of the IDSM with FGT inside
Getting started developing geometries for the STAR experiment with AgML.
Setting up your local environment
You need to check out several directories and compile them in this order:
$ cvs co StarVMC/Geometry
$ cvs co StarVMC/StarGeometry
$ cvs co StarVMC/xgeometry
$ cvs co pams/geometry
$ cons +StarVMC/Geometry
$ cons
This will take a while to compile, during which time you can get a cup of coffee, or do your laundry, etc...
If you only want to visualize the STAR detector, you can checkout:
$ cvs co StarVMC/Geometry/macros
Once this is done you can visualize STAR geometries using the viewStarGeometry.C macro in AgML 1, and the loadAgML.C macro in AgML 2.0.
$ root.exe
root [0] .L StarVMC/Geometry/macros/viewStarGeometry.C
root [1] nocache=true
root [2] viewall=true
root [3] viewStarGeometry("y2012")

root [0] .L StarVMC/Geometry/macros/loadAgML.C
root [1] loadAgML("y2016")
root [2] TGeoVolume *cave = gGeoManager->FindVolumeFast("CAVE");
root [3] cave -> Draw("ogl"); // ogl uses open GL viewer
Tutorial #1 -- Creating and Placing Volumes
Start by firing up your favorite text editor... preferably something which does syntax highlighting and checking on XML documents. Edit the first tutorial geometries located in StarVMC/Geometry/TutrGeo ...
$ emacs StarVMC/Geometry/TutrGeo/TutrGeo1.xml
This module illustrates how to create a new detector module, how to create and place a simple volume, and how to create and place multiple copies of that volume. Next, we need to attach this module to a geometry model in order to visualize it. Geometry models (or "tags") are defined in the StarGeo.xml file.
$ emacs StarVMC/Geometry/StarGeo.xml
There is a simple geometry, which only defines the CAVE. It's the first geometry tag called "black hole". You can add your detector here...
$ root.exe
root [0] .L StarVMC/Geometry/macros/viewStarGeometry.C
root [1] nocache=true
root [2] viewStarGeometry("test","TutrGeo1");
The "test" geometry tag is a very simple geometry, implementing only the wide angle hall and the cave. All detectors, beam pipes, magnets, etc... have been removed. The second argument to viewStarGeometry specifies which geometry module(s) are to be built and added to the test geometry. In this case we add only TutrGeo1. (A comma-separated list of geometry modules could be provided, if more than one geometry module were to be built.)
Now you can try modifying TutrGeo1. Feel free to add as many boxes in as many positions as you would like. Once you have done this, recompile in two steps
$ cons +StarVMC/Geometry
$ cons
Tutorial #2 -- A few simple shapes, rotations and reflections
The second tutorial geometry is in StarVMC/Geometry/TutrGeo/TutrGeo2.xml. Again, view it using viewStarGeometry.C
$ root.exe
root [0] .L viewStarGeometry.C
root [1] nocache=true
root [2] viewStarGeometry("test","TutrGeo2")
What does the nocache=true statement do? It instructs viewStarGeometry.C to recreate the geometry, rather than load it from a ROOT file created the last time you ran the macro. By default, if the macro finds a file named "test.root", it will load the geometry from that file to save time. You don't want this, since you know that you've changed the geometry.
The second tutorial illustrates a couple more simple shapes: cones and tubes. It also illustrates how to create reflections. Play around with the code a bit, recompile in the normal manner, then try viewing the geometry again.
Tutorial #3 -- Variables and Structures
AgML provides variables and structures. The third tutorial is in StarVMC/Geometry/TutrGeo/TutrGeo3.xml. Open it in a text editor and let's look at it. We define three variables, boxDX, boxDY and boxDZ, to hold the dimensions of the box we want to create. AgML is case-insensitive, so you can write these as boxdx, BoxDY and BOXDZ if you so choose. In general, choose what looks best and helps you keep track of the code you're writing. Next check out the volume "ABOX". Note how the shape's dx, dy and dz arguments now reference the variables boxDX, boxDY and boxDZ. This allows us to create multiple versions of the volume ABOX. Let's view the geometry and see.

$ root.exe
root [0] .L StarVMC/Geometry/macros/viewStarGeometry.C
root [1] nocache=true
root [2] viewStarGeometry("test","TutrGeo3")

Launch a new TBrowser and open the "test" geometry. Double click test --> Master Volume --> CAVE --> TUTR. You now see all of the concrete volumes which have been created by ROOT. It should look like what you see at the right. We have "ABOX", but we also have ABO1 and ABO2. This demonstrates an important concept in AgML. Each <Volume ...> block actually defines a volume "factory". It allows you to create multiple versions of a volume, each differing by the shape of the volume. When the shape is changed, a new volume is created with a nickname, where the last letter in the volume name is replaced by [1 2 3 ... 0 a b c ... z] (then the second-to-last letter, then the third...). Structures provide an alternate means to define variables. In order to populate the members of a structure with values, you use the Fill statement. Multiple Fill statements for a given structure may be defined, providing multiple sets of values. In order to select a given set of values, the <Use ...> operator is invoked. In TutrGeo3, we create and place 5 different tubes, using the data stored in the Fill statements.
However, you might notice in the browser that there are only two concrete instances of the tube being created. What is going on here? This is another feature of AgML. When the shape is changed, AgML will look for another concrete volume with exactly the same shape. If it finds one, it will use that volume. If it doesn't, then a new volume is created. There's a lot going on in this tutorial, so play around with it a bit.
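The nickname rule described above (the last letter of the volume name replaced by 1, 2, 3 ... 0, a, b ... z for each new shape variant) can be sketched as a small helper. This is an illustration of the naming pattern only, not the actual AgML implementation:

```cpp
#include <string>

// Sketch of the AgML volume-nickname pattern: variant 0 keeps the
// original 4-character name; variant n replaces the last letter with
// the n-th symbol of "1234567890abc...z".
std::string nickname(std::string name, int variant) {
    static const char symbols[] = "1234567890abcdefghijklmnopqrstuvwxyz";
    if (variant <= 0) return name;                 // original volume
    name[name.size() - 1] = symbols[(variant - 1) % 36];
    return name;
}
```

With this rule, the first two shape variants of "ABOX" come out as "ABO1" and "ABO2", matching what the browser shows for TutrGeo3.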
Tutorial #4 -- Some more shapes
Abstract: We compare the AgML and AgSTAR descriptions of recent revisions of the STAR Y2005 through Y2011 geometry models. We are specifically interested in the suitability of the AgML model for tracking. We therefore plot the material contained in the TPC vs pseudorapidity for (a) all detectors, (b) the time projection chamber, and (c) the sensitive volumes of the time projection chamber. We also plot (d) the material found in front of the TPC active volumes.
Description of the Plots
Below you will find four columns of plots, for the highest revision of each geometry from y2005 to the present. The columns from left to right show comparisons of the material budget for STAR and its daughter volumes, the material budget for the TPC and its immediate daughter volumes, the material budget for the active volumes in the TPC, and the material in front of the active volume of the TPC. In the context of tracking, the right-most column is the most important. Each column contains three plots. The top plot shows the material budget in the AgML model; the middle plot, the material budget in the AgSTAR model. The bottom plot shows the difference divided by the AgSTAR model. The y-axis on the difference plot extends between -2.5% and +2.5%.
STAR Y2011 Geometry Tag
Issues with TpceGeo3a.xml
Issues with PhmdGeo.xml
(a) Material in STAR Detector and daughters | (b) Material in TPC and daughters | (c) Material in TPC active volumes | (d) Material in front of TPC active volumes |
STAR Y2010c Geometry Tag
Issues with TpceGeo3a.xml
Issues with PhmdGeo.xml
(a) Material in STAR Detector and daughters | (b) Material in TPC and daughters | (c) Material in TPC active volumes | (d) Material in front of TPC active volumes |
STAR Y2009c Geometry Tag
Issues with TpceGeo3a.xml
Issues with PhmdGeo.xml
(a) Material in STAR Detector and daughters | (b) Material in TPC and daughters | (c) Material in TPC active volumes | (d) Material in front of TPC active volumes |
STAR Y2008e Geometry Tag
Global Issues
Issues with TpceGeo3a.xml
Issues with PhmdGeo.xml
(a) Material in STAR Detector and daughters | (b) Material in TPC and daughters | (c) Material in TPC active volumes | (d) Material in front of TPC active volumes |
STAR Y2007h Geometry Tag
Global Issues
Issues with TpceGeo3a.xml
Issues with PhmdGeo.xml
Issues with SVT.
(a) Material in STAR Detector and daughters | (b) Material in TPC and daughters | (c) Material in TPC active volumes | (d) Material in front of TPC active volumes |
STAR Y2006g Geometry Tag
Global Issues
Note: TpceGeo2.xml does not suffer from the overlap issue in TpceGeo3a.xml
(a) Material in STAR Detector and daughters | (b) Material in TPC and daughters | (c) Material in TPC active volumes | (d) Material in front of TPC active volumes |
STAR Y2005i Geometry Tag
Global Issues
Issues with TpceGeo3a.xml
Issues with PhmdGeo.xml
(a) Material in STAR Detector and daughters | (b) Material in TPC and daughters | (c) Material in TPC active volumes | (d) Material in front of TPC active volumes |
Attached is a comparison of track reconstruction using the Sti tracker, with AgI and AgML geometries as input.
As STAR gradually comes to the end of its AA heavy-ion program and more focus is put on polarized pp/pA physics and the future ep/eA project in the eRHIC era, many upgrades are foreseen to strengthen the detector capability in the forward region. These include both near-term upgrades for the polarized pp program, e.g. the FMS/FSC/FHC calorimeters and FGT/VFGT tracking, and upgrades for eSTAR in about 5 to 10 years. Different detector concepts exist, and optimization is needed to fit them into the STAR physics program and into the current STAR detector system. To reach a proper solution, a lot of Monte Carlo (MC) work will be carried out, especially in the STAR simulation framework, chosen for its flexibility, robustness, and proven performance during the last decade.
List of default AgML materials and mixtures. To get a complete list of all materials defined in a geometry, execute AgMaterial::List() in ROOT, once the geometry has been created.
[-] Hydrogen: a= 1.01 z= 1 dens= 0.071 radl= 865 absl= 790 isvol= <unset> nelem= 1
[-] Deuterium: a= 2.01 z= 1 dens= 0.162 radl= 757 absl= 342 isvol= <unset> nelem= 1
[-] Helium: a= 4 z= 2 dens= 0.125 radl= 755 absl= 478 isvol= <unset> nelem= 1
[-] Lithium: a= 6.94 z= 3 dens= 0.534 radl= 155 absl= 121 isvol= <unset> nelem= 1
[-] Berillium: a= 9.01 z= 4 dens= 1.848 radl= 35.3 absl= 36.7 isvol= <unset> nelem= 1
[-] Carbon: a= 12.01 z= 6 dens= 2.265 radl= 18.8 absl= 49.9 isvol= <unset> nelem= 1
[-] Nitrogen: a= 14.01 z= 7 dens= 0.808 radl= 44.5 absl= 99.4 isvol= <unset> nelem= 1
[-] Neon: a= 20.18 z= 10 dens= 1.207 radl= 24 absl= 74.9 isvol= <unset> nelem= 1
[-] Aluminium: a= 26.98 z= 13 dens= 2.7 radl= 8.9 absl= 37.2 isvol= <unset> nelem= 1
[-] Iron: a= 55.85 z= 26 dens= 7.87 radl= 1.76 absl= 17.1 isvol= <unset> nelem= 1
[-] Copper: a= 63.54 z= 29 dens= 8.96 radl= 1.43 absl= 14.8 isvol= <unset> nelem= 1
[-] Tungsten: a= 183.85 z= 74 dens= 19.3 radl= 0.35 absl= 10.3 isvol= <unset> nelem= 1
[-] Lead: a= 207.19 z= 82 dens= 11.35 radl= 0.56 absl= 18.5 isvol= <unset> nelem= 1
[-] Uranium: a= 238.03 z= 92 dens= 18.95 radl= 0.32 absl= 12 isvol= <unset> nelem= 1
[-] Air: a= 14.61 z= 7.3 dens= 0.001205 radl= 30400 absl= 67500 isvol= <unset> nelem= 1
[-] Vacuum: a= 14.61 z= 7.3 dens= 1e-06 radl= 3.04e+07 absl= 6.75e+07 isvol= <unset> nelem= 1
[-] Silicon: a= 28.09 z= 14 dens= 2.33 radl= 9.36 absl= 45.5 isvol= <unset> nelem= 1
[-] Argon_gas: a= 39.95 z= 18 dens= 0.002 radl= 11800 absl= 70700 isvol= <unset> nelem= 1
[-] Nitrogen_gas: a= 14.01 z= 7 dens= 0.001 radl= 32600 absl= 75400 isvol= <unset> nelem= 1
[-] Oxygen_gas: a= 16 z= 8 dens= 0.001 radl= 23900 absl= 67500 isvol= <unset> nelem= 1
[-] Polystyren: a= 11.153 z= 5.615 dens= 1.032 radl= <unset> absl= <unset> isvol= <unset> nelem= 2
        A       Z       W
    C   12.000  6.000   0.923
    H   1.000   1.000   0.077
[-] Polyethylene: a= 10.427 z= 5.285 dens= 0.93 radl= <unset> absl= <unset> isvol= <unset> nelem= 2
        A       Z       W
    C   12.000  6.000   0.857
    H   1.000   1.000   0.143
[-] Mylar: a= 12.87 z= 6.456 dens= 1.39 radl= <unset> absl= <unset> isvol= <unset> nelem= 3
        A       Z       W
    C   12.000  6.000   0.625
    H   1.000   1.000   0.042
    O   16.000  8.000   0.333
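The effective a and z quoted for the mixtures above are consistent with a simple average over the components weighted by the weight fractions W. A small sketch, checked against the Mylar entry (a = 12.87, z = 6.456); this reproduces the listed numbers but is not the GEANT/AgML code itself:

```cpp
#include <vector>
#include <cmath>

// One mixture component: mass number A, charge Z, weight fraction W.
struct Component { double A, Z, W; };

// Effective a and z of a mixture as the W-weighted average of the
// component A and Z values (the convention that matches the list above).
void effectiveAZ(const std::vector<Component>& comps, double& a, double& z) {
    a = 0.0;
    z = 0.0;
    for (const Component& c : comps) {
        a += c.W * c.A;
        z += c.W * c.Z;
    }
}
```

For Mylar (C 0.625, H 0.042, O 0.333) this gives a = 0.625*12 + 0.042*1 + 0.333*16 = 12.87 and z = 6.456, as listed.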
This page was merged with STAR Geometry in simulation & reconstruction and maintained by STAR's librarian.
Retired Simulation Pages kept here.
Immediate action items:
Documentation for the beampipe support geometry description development
After the completion of the 2007 run, the SVT and the SSD were removed from the STAR detector along with their utility lines. The support structure for the beampipe remained, however.
The following drawings describe the structure of the beampipe support as it existed in late 2007 and probably throughout 2008.
Here we present information about our datasets.
Description | Dataset name | Statistics, thousands | Status | Moved to HPSS | Comment |
---|---|---|---|---|---|
Herwig 6.507, Y2004Y | rcf1259 | 225 | Finished | Yes | 7 GeV < Pt < 9 GeV |
Herwig 6.507, Y2004Y | rcf1258 | 248 | Finished | Yes | 5 GeV < Pt < 7 GeV |
Herwig 6.507, Y2004Y | rcf1257 | 367 | Finished | Yes | 4 GeV < Pt < 5 GeV |
Herwig 6.507, Y2004Y | rcf1256 | 424 | Finished | Yes | 3 GeV < Pt < 4 GeV |
Herwig 6.507, Y2004Y | rcf1255 | 407 | Finished | Yes | 2 GeV < Pt < 3 GeV |
Herwig 6.507, Y2004Y | rcf1254 | 225 | Finished | Yes | 35 GeV < Pt < 100 GeV |
Herwig 6.507, Y2004Y | rcf1253 | 263 | Finished | Yes | 25 GeV < Pt < 35 GeV |
Herwig 6.507, Y2004Y | rcf1252 | 263 | Finished | Yes | 15 GeV < Pt < 25 GeV |
Herwig 6.507, Y2004Y | rcf1251 | 225 | Finished | Yes | 11 GeV < Pt < 15 GeV |
Herwig 6.507, Y2004Y | rcf1250 | 300 | Finished | Yes | 9 GeV < Pt < 11 GeV |
Hijing 1.382 AuAu 200 GeV minbias, 0 < b < 20 fm | rcf1249 | 24 | Finished | Yes | Tracking, new SVT geo, diamond: 60, +-30 cm, Y2005D |
Herwig 6.507, Y2004Y | rcf1248 | 15 | Finished | Yes | 35 GeV < Pt < 45 GeV |
Herwig 6.507, Y2004Y | rcf1247 | 25 | Finished | Yes | 25 GeV < Pt < 35 GeV |
Herwig 6.507, Y2004Y | rcf1246 | 50 | Finished | Yes | 15 GeV < Pt < 25 GeV |
Herwig 6.507, Y2004Y | rcf1245 | 100 | Finished | Yes | 11 GeV < Pt < 15 GeV |
Herwig 6.507, Y2004Y | rcf1244 | 200 | Finished | Yes | 9 GeV < Pt < 11 GeV |
CuCu 62.4 GeV, Y2005C | rcf1243 | 5 | Finished | No | same as 1242 + keep Low Energy Tracks |
CuCu 62.4 GeV, Y2005C | rcf1242 | 5 | Finished | No | SVT tracking test, 10 keV e/m process cut (cf. rcf1237) |
10 J/Psi, Y2005X, SVT out | rcf1241 | 30 | Finished | No | Study of the SVT material effect |
10 J/Psi, Y2005X, SVT in | rcf1240 | 30 | Finished | No | Study of the SVT material effect |
100 pi0, Y2005X, SVT out | rcf1239 | 18 | Finished | No | Study of the SVT material effect |
100 pi0, Y2005X, SVT in | rcf1238 | 20 | Finished | No | Study of the SVT material effect |
CuCu 62.4 GeV, Y2005C | rcf1237 | 5 | Finished | No | SVT tracking test, pilot run |
Herwig 6.507, Y2004Y | rcf1236 | 8 | Finished | No | Test run for initial comparison with Pythia, 5 GeV < Pt < 7 GeV |
Pythia, Y2004Y | rcf1235 | 100 | Finished | No | MSEL=2, min bias |
Pythia, Y2004Y | rcf1234 | 90 | Finished | No | MSEL=0, CKIN(3)=0, MSUB=91,92,93,94,95 |
Pythia, Y2004Y, sp.2 (CDF tune A) | rcf1233 | 308 | Finished | Yes | 4<Pt<5, MSEL=1, GHEISHA |
Pythia, Y2004Y, sp.2 (CDF tune A) | rcf1232 | 400 | Finished | Yes | 3<Pt<4, MSEL=1, GHEISHA |
Pythia, Y2004Y, sp.2 (CDF tune A) | rcf1231 | 504 | Finished | Yes | 2<Pt<3, MSEL=1, GHEISHA |
Pythia, Y2004Y, sp.2 (CDF tune A) | rcf1230 | 104 | Finished | Yes | 35<Pt, MSEL=1, GHEISHA |
Pythia, Y2004Y, sp.2 (CDF tune A) | rcf1229 | 208 | Finished | Yes | 25<Pt<35, MSEL=1, GHEISHA |
Pythia, Y2004Y, sp.2 (CDF tune A) | rcf1228 | 216 | Finished | Yes | 15<Pt<25, MSEL=1, GHEISHA |
Pythia, Y2004Y, sp.2 (CDF tune A) | rcf1227 | 216 | Finished | Yes | 11<Pt<15, MSEL=1, GHEISHA |
Pythia, Y2004Y, sp.2 (CDF tune A) | rcf1226 | 216 | Finished | Yes | 9<Pt<11, MSEL=1, GHEISHA |
Pythia, Y2004Y, sp.2 (CDF tune A) | rcf1225 | 216 | Finished | Yes | 7<Pt<9, MSEL=1, GHEISHA |
Pythia, Y2004Y, sp.2 (CDF tune A) | rcf1224 | 216 | Finished | Yes | 5<Pt<7, MSEL=1, GHEISHA |
Pythia special tune2 Y2004Y, GCALOR | rcf1223 | 100 | Finished | Yes | 4<Pt<5, GCALOR |
Pythia special tune2 Y2004Y, GHEISHA | rcf1222 | 100 | Finished | Yes | 4<Pt<5, GHEISHA |
Pythia special run 3 Y2004C | rcf1221 | 100 | Finished | Yes | ENER 200.0, MSEL 2, MSTP(51)=7, MSTP(81)=1, MSTP(82)=1, PARP(82)=1.9, PARP(83)=0.5, PARP(84)=0.2, PARP(85)=0.33, PARP(86)=0.66, PARP(89)=1000, PARP(90)=0.16, PARP(91)=1.0, PARP(67)=1.0 |
Pythia special run 2 Y2004C (CDF tune A) | rcf1220 | 100 | Finished | Yes | ENER 200.0, MSEL 2, MSTP(51)=7 |
Pythia special run 1 Y2004C | rcf1219 | 100 | Finished | Yes | ENER 200.0, MSEL 2, MSTP(51)=7, MSTP(81)=1, MSTP(82)=1, PARP(82)=1.9, PARP(83)=0.5, PARP(84)=0.2, PARP(85)=0.33, PARP(86)=0.66, PARP(89)=1000, PARP(90)=0.16, PARP(91)=1.5, PARP(67)=1.0 |
Hijing 1.382 AuAu 200 GeV central 0 < b < 3 fm | rcf1218 | 50 | Finished | Yes | Statistics enhancement of rcf1209 with a smaller diamond: 60, +-30 cm, Y2004a |
Hijing 1.382 CuCu 200 GeV minbias 0 < b < 14 fm | rcf1216 | 52 | Finished | Yes | Geometry: Y2005x |
Hijing 1.382 AuAu 200 GeV minbias 0 < b < 20 fm | rcf1215 | 100 | Finished | Yes | Geometry: Y2004a, Special D decays |
Description | Dataset name | Statistics, thousands | Status | Moved to HPSS | Comment |
---|---|---|---|---|---|
AuAu 200 GeV central | rcf1289 | 1 | Finished | No | upgr06: Hijing, D0 and superposition |
AuAu 200 GeV central | rcf1288 | 0.8 | Finished | No | upgr11: Hijing, D0 and superposition |
AuAu 200 GeV min bias | rcf1287 | 5 | Finished | No | upgr11: Hijing, D0 and superposition |
AuAu 200 GeV central | rcf1286 | 1 | Finished | No | upgr10: Hijing, D0 and superposition |
AuAu 200 GeV min bias | rcf1285 | 6 | Finished | No | upgr10: Hijing, D0 and superposition |
AuAu 200 GeV central | rcf1284 | 1 | Finished | No | upgr09: Hijing, D0 and superposition |
AuAu 200 GeV min bias | rcf1283 | 6 | Finished | No | upgr09: Hijing, D0 and superposition |
AuAu 200 GeV min bias | rcf1282 | 38 | Finished | No | upgr06: Hijing, D0 and superposition |
AuAu 200 GeV min bias | rcf1281 | 38 | Finished | Yes | upgr08: Hijing, D0 and superposition |
AuAu 200 GeV min bias | rcf1280 | 38 | Finished | Yes | upgr01: Hijing, D0 and superposition |
AuAu 200 GeV min bias | rcf1279 | 38 | Finished | Yes | upgr07: Hijing, D0 and superposition |
Extension of 1276: D0 superposition | rcf1278 | 5 | Finished | No | upgr07: Z cut=+-300cm |
AuAu 200 GeV min bias | rcf1277 | 5 | Finished | No | upgr05: Z cut=+-300cm |
AuAu 200 GeV min bias | rcf1276 | 35 | Finished | No | upgr05: Hijing, D0 and superposition |
Pythia 200 GeV + HF | rcf1275 | 23*4 | Finished | No | J/Psi and Upsilon(1S,2S,3S) mix for embedding |
AuAu 200 GeV min bias | rcf1274 | 10 | Finished | No | upgr02 geo tag, |eta|<1.5 (tracking upgrade request) |
Pythia 200 GeV | rcf1273 | 600 | Finished | Yes | Pt <2 (Completing the rcf1224-1233 series) |
CuCu 200 GeV min bias+D0 mix | rcf1272 | 50+2*50*8 | Finished | Yes | Combinatorial boost of rcf1261, sigma: 60, +-30 |
Pythia 200 GeV | rcf1233 | 300 | Finished | Yes | 4< Pt <5 (rcf1233 extension) |
Pythia 200 GeV | pds1232 | 200 | Finished | Yes | 3< Pt <4 (rcf1232 clone) |
Pythia 200 GeV | pds1231 | 240 | Finished | Yes | 2< Pt <3 (rcf1231 clone) |
Pythia 200 GeV | rcf1229 | 200 | Finished | Yes | 25< Pt <35 (rcf1229 extension) |
Pythia 200 GeV | rcf1228 | 200 | Finished | Yes | 15< Pt <25 (rcf1228 extension) |
Pythia 200 GeV | rcf1227 | 208 | Finished | Yes | 11< Pt <15 (rcf1227 extension) |
Pythia 200 GeV | rcf1226 | 200 | Finished | Yes | 9< Pt <11 (rcf1226 extension) |
Pythia 200 GeV | rcf1225 | 200 | Finished | Yes | 7< Pt <9 (rcf1225 extension) |
Pythia 200 GeV | rcf1224 | 212 | Finished | Yes | 5< Pt <7 (rcf1224 extension) |
Pythia 200 GeV Y2004Y CDF_A | rcf1271 | 120 | Finished | Yes | 55< Pt <65 |
Pythia 200 GeV Y2004A CDF_A | rcf1270 | 120 | Finished | Yes | 45< Pt <55 |
CuCu 200 GeV min bias | rcf1266 | 10 | Finished | Yes | SVT study: clams and two ladders |
CuCu 200 GeV min bias | rcf1265 | 10 | Finished | Yes | SVT study: clams displaced |
CuCu 200 GeV min bias | rcf1264 | 10 | Finished | Yes | SVT study: rotation of the barrel |
CuCu 62.4 GeV min bias+D0 mix | rcf1262 | 50*3 | Finished | Yes | 3 subsets: Hijing, single D0, and the mix |
CuCu 200 GeV min bias+D0 mix | rcf1261 | 50*3 | Finished | No | 3 subsets: Hijing, single D0, and the mix |
1 J/Psi over 200GeV minbias AuAu | rcf1260 | 10 | Finished | No | J/Psi mixed with 200GeV AuAu Hijing Y2004Y 60/35 vertex |
Name | System/Energy | Statistics | Status | HPSS | Comment | Site |
---|---|---|---|---|---|---|
rcf1290 | AuAu200 0<b<3fm, Zcut=5cm | 32*5 | Done | Yes | Hijing+D0+Lac2+D0_mix+Lac2_mix | rcas |
rcf1291 | pp200/UPGR07/Zcut=10cm | 10 | Done | Yes | ISUB = 11, 12, 13, 28, 53, 68 | rcas |
rcf1292 | pp500/UPGR07/Zcut=10cm | 10 | Done | Yes | ISUB = 11, 12, 13, 28, 53, 68 | rcas |
rcf1293 | pp200/UPGR07/Zcut=30cm | 205 | Done | Yes | ISUB = 11, 12, 13, 28, 53, 68 | rcas |
rcf1294 | pp500/UPGR07/Zcut=30cm | 10 | Done | Yes | ISUB = 11, 12, 13, 28, 53, 68 | rcas |
rcf1295 | AuAu200 0<b<20fm, Zcut=30cm | 20 | Done | Yes | QA run for the Y2007 tag | rcas |
rcf1296 | AuAu200 0<b<3fm, Zcut=10cm | 100*5 | Done | Yes | Hijing, B0, B+, B0_mix, B+_mix, Y2007 | rcas |
rcf1297 | AuAu200 0<b<20fm, Zcut=300cm | 40 | Done | Yes | Pile-up simulation in the TUP studies, UPGR13 | rcas |
rcf1298 | AuAu200 0<b<3fm, Zcut=15cm | 100*5 | Done | Part | Hijing, D0, Lac2, D0_mix, Lac2_mix, UPGR13 | rcas |
rcf1299 | pp200/Y2005/Zcut=50cm | 800 | Done | Yes | Pythia, photon mix, pi0 mix | rcas |
rcf1300 | pp200/UPGR13/Zcut=15cm | 100 | Done | No | Pythia, MSEL=4 (charm) | rcas |
rcf1301 | pp200/UPGR13/Zcut=300cm | 84 | Done | No | Pythia, MSEL=1, wide vertex | rcas |
rcf1302 | pp200 Y2006C | 120 | Done | No | Pythia for Spin PWG, Pt(45,55)GeV | rcas |
rcf1303 | pp200 Y2006C | 120 | Done | No | Pythia for Spin PWG, Pt(35,45)GeV | rcas |
rcf1304 | pp200 Y2006C | 120 | Done | No | Pythia for Spin PWG, Pt(55,65)GeV | rcas |
rcf1296 | Upsilon S1,S2,S3 + Hijing | 15*3 | Done | No | Muon Telescope Detector, ext. of 1296 | rcas |
rcf1306 | pp200 Y2006C | 400 | Done | Yes | Pythia for Spin PWG, Pt(25,35)GeV | rcas |
rcf1307 | pp200 Y2006C | 400 | Done | Yes | Pythia for Spin PWG, Pt(15,25)GeV | rcas |
rcf1308 | pp200 Y2006C | 420 | Done | Yes | Pythia for Spin PWG, Pt(11,15)GeV | rcas |
rcf1309 | pp200 Y2006C | 420 | Done | Yes | Pythia for Spin PWG, Pt(9,11)GeV | rcas |
rcf1310 | pp200 Y2006C | 420 | Done | Yes | Pythia for Spin PWG, Pt(7,9)GeV | rcas |
rcf1311 | pp200 Y2006C | 400 | Done | Yes | Pythia for Spin PWG, Pt(5,7)GeV | rcas |
rcf1312 | pp200 Y2004Y | 544 | Done | No | Di-jet CKIN(3,4,7,8,27,28)=7,9,0.0,1.0,-0.4,0.4 | rcas |
rcf1313 | pp200 Y2004Y | 760 | Done | No | Di-jet CKIN(3,4,7,8,27,28)=9,11,-0.4,1.4,-0.5,0.6 | rcas |
rcf1314 | pp200 Y2004Y | 112 | Done | No | Di-jet CKIN(3,4,7,8,27,28)=11,15,-0.2,1.2,-0.6,-0.3 | Grid |
rcf1315 | pp200 Y2004Y | 396 | Done | No | Di-jet CKIN(3,4,7,8,27,28)=11,15,-0.5,1.5,-0.3,0.4 | Grid |
rcf1316 | pp200 Y2004Y | 132 | Done | No | Di-jet CKIN(3,4,7,8,27,28)=11,15,0.0,1.0,0.4,0.7 | Grid |
rcf1317 | pp200 Y2006C | 600 | Done | Yes | Pythia for Spin PWG, Pt(4,5)GeV | Grid |
rcf1318 | pp200 Y2006C | 690 | Done | Yes | Pythia for Spin PWG, Pt(3,4)GeV | Grid |
rcf1319 | pp200 Y2006C | 690 | Done | Yes | Pythia for Spin PWG, Minbias | Grid |
rcf1320 | pp62.4 Y2006C | 400 | Done | No | Pythia for Spin PWG, Pt(4,5)GeV | Grid |
rcf1321 | pp62.4 Y2006C | 250 | Done | No | Pythia for Spin PWG, Pt(3,4)GeV | Grid |
rcf1322 | pp62.4 Y2006C | 220 | Done | No | Pythia for Spin PWG, Pt(5,7)GeV | Grid |
rcf1323 | pp62.4 Y2006C | 220 | Done | No | Pythia for Spin PWG, Pt(7,9)GeV | Grid |
rcf1324 | pp62.4 Y2006C | 220 | Done | No | Pythia for Spin PWG, Pt(9,11)GeV | Grid |
rcf1325 | pp62.4 Y2006C | 220 | Done | No | Pythia for Spin PWG, Pt(11,15)GeV | Grid |
rcf1326 | pp62.4 Y2006C | 200 | Running | No | Pythia for Spin PWG, Pt(15,25)GeV | Grid |
rcf1327 | pp62.4 Y2006C | 200 | Running | No | Pythia for Spin PWG, Pt(25,35)GeV | Grid |
rcf1328 | pp62.4 Y2006C | 50 | Running | No | Pythia for Spin PWG, Pt(35,45)GeV | Grid |
Name | System/Energy | Range | Statistics | Comment |
---|---|---|---|---|
rcf9001 | pp200, y2007g | 03_04gev | 690k | Jet Study AuAu200(PP200) JLC PWG |
rcf9002 | | 04_05gev | 686k | |
rcf9003 | | 05_07gev | 398k | |
rcf9004 | | 07_09gev | 420k | |
rcf9005 | | 09_11gev | 412k | |
rcf9006 | | 11_15gev | 420k | |
rcf9007 | | 15_25gev | 397k | |
rcf9008 | | 25_35gev | 400k | |
rcf9009 | | 35_45gev | 120k | |
rcf9010 | | 45_55gev | 118k | |
rcf9011 | | 55_65gev | 120k | |
Name | System/Energy | Range | Statistics | Comment |
---|---|---|---|---|
rcf9021 | pp200, y2008 | 03_04 GeV | 690k | Jet Study AuD200(PP200) JLC PWG |
rcf9022 | | 04_05 GeV | 686k | |
rcf9023 | | 05_07 GeV | 398k | |
rcf9024 | | 07_09 GeV | 420k | |
rcf9025 | | 09_11 GeV | 412k | |
rcf9026 | | 11_15 GeV | 420k | |
rcf9027 | | 15_25 GeV | 397k | |
rcf9028 | | 25_35 GeV | 400k | |
rcf9029 | | 35_45 GeV | 120k | |
rcf9030 | | 45_55 GeV | 118k | |
rcf9031 | | 55_99 GeV | 120k | |
Name | System/Energy | Range | Statistics | Comment |
---|---|---|---|---|
rcf9041 | PP500, Y2009 | 03_04gev | 500k | Spin Study PP500 Spin group (Matt, Jim, Jan), 2.3M evts |
rcf9042 | | 04_05gev | 500k | |
rcf9043 | | 05_07gev | 300k | |
rcf9044 | | 07_09gev | 250k | |
rcf9045 | | 09_11gev | 200k | |
rcf9046 | | 11_15gev | 100k | |
rcf9047 | | 15_25gev | 100k | |
rcf9048 | | 25_35gev | 100k | |
rcf9049 | | 35_45gev | 100k | |
rcf9050 | | 45_55gev | 25k | |
rcf9051 | | 55_99gev | 25k | |
rcf9061 | CuCu200, y2005h | B0_14 | 200k | CuCu200 radiation length budget, Y. Fisyak, KyungEon Choi |
rcf9062 | AuAu200, y2007h | B0_14 | 150k | AuAu200 radiation length budget, Y. Fisyak, KyungEon Choi |
Geometry | y2009a |
Library | SL09g |
Generator | Pythia 6.4.22 |
Tune | 320 |
Field | -5.0 |
ETA | -10 < η < +10 |
PHI | -π < φ < +π |
vertex | 0, 0, -2 |
width | 0.015, 0.015, 42.0 |
Sample | Channel | Events |
---|---|---|
rcf10000 | W+ → e+ nu | 10k |
rcf10001 | W- → e- nu | 6k |
rcf10002 | W+ → tau+ nu, W- → tau- nu | 10k |
rcf10003 | pp → W+/- + jet | 10k |
rcf10004 | Z → e+e-, no Z/gamma interference | 4k |
rcf10005 | Z → all but e+e- | 10k |
rcf10006 | QCD w/ partonic pT > 35 GeV | 100k |
This page documents the options in geometry.g which define each of the production tags.
The attached spreadsheets document the production tags in STARSIM on 11/30/2009. At that time the y2006h and y2010 tags were in development and not ready for production.
y2008a full and TPC only material histograms
At the left is a general status for each geometry tag which compiles in AgML. All volumes are tested recursively, except for the "IBEM" and similar support structures for the VPD, and the Endcap SMD strips. (The ESMD planes are tested as a unit, rather than testing all 2*12*288 SMD strips.)
Color codes:
Green: No differences larger than 1%.
Yellow: The volume did not appear in the AgSTAR geometry.
Orange: The difference was larger than 1%, but the absolute amount of material involved is negligible.
Red: A difference larger than 1% was detected for a significant amount of material, or a negligible but widespread difference was detected.
At the right is a PDF file for each geometry tag. For each volume we show two plots. The top plot shows the absolute number of radiation lengths a geantino encounters traversing the geometry, following a straight line at the given pseudorapidity; we average over all phi. The left (right) hashes show the AgML (AgSTAR) geometry. The difference of the two histograms, expressed as a fractional value, is shown in the lower plot. Frequently the differences are small, e.g. 10^-6, and ROOT rescales the plots accordingly. Since it is difficult to read the scales of so many plots at once, we have color coded the plots (the coding seems to fail in the generation of some histograms). The meaning of the color coding is summarized below.
<?php
/********************************************************************** START OF PHP */
/* =======================================================
Helper function to show the status_yXXXX.png
======================================================= */
function showImage( $tag, $dir ) {
echo "<img src=\"$dir/status_$tag.png\" />";
}
/* =======================================================
Helper function to show the PDF file
======================================================= */
function showGoogle( $tag, $dir ) {
/*
echo "<iframe border=\"0\" url=\"http://docs.google.com/gview?url=$dir$tag.pdf&embedded=true\" style=\"width: 562px; height: 705px;\"> </iframe>";
*/
echo
"<iframe frameborder=\"0\" style=\"width: 562px; height: 705px;\" src=\"http://docs.google.com/gview?url=$dir/$tag.pdf&embedded=true\"></iframe>"
;
}
/* =======================================================
First some PHP input... find the date of the comparison
======================================================= */
$YEAR="2011";
$DATE="06-15-2011";
$DIR="http://www.star.bnl.gov/~jwebb/".$YEAR."/".$DATE."/AgML-Comparison/";
$TAGS=$DIR."TAGS";
/* =======================================================
Output header for this page
======================================================= */
echo "<h3>STAR AgML vs AgSTAR Comparison on ".$DATE."</h3>";
/* =======================================================
Read in each line in the TAGs file
======================================================= */
$handle = @fopen("$TAGS", "r");
if ($handle) {
while (($buffer = fgets($handle, 4096)) !== false) {
/* Trim the whitespace out of the string */
$buffer=trim($buffer);
/* Draw an HRULE and specify which geometry tag we are using */
echo "<hr><p>STAR Geometry Tag $buffer</p>";
/* Now build a 2-entry table with the status PNG on the left
and the summary PDF ala google docs on the right */
showImage( $buffer, $DIR );
showGoogle( $buffer, $DIR );
}
if (!feof($handle)) {
echo "Error: unexpected fgets() fail\n";
}
fclose($handle);
}
/************************************************************************ END OF PHP */
?>
The R&D conducted for the inner tracking upgrade required that a few specialized geometry tags be created. For a complete set of geometry tags, please visit the STAR Geometry in simulation & reconstruction page. The material below provides additional documentation and details.
Taxonomy:
The TPC is present in all configurations listed below; the SVT is in none.
Tag | SSD | IST | HFT | IGT | HPD | Contact Person | Comment
---|---|---|---|---|---|---|---
UPGR01 | + | | + | | | |
UPGR02 | | + | + | | | |
UPGR03 | | + | + | + | | |
 | + | | | | + | Sevil | retired
 | + | + | + | + | + | Everybody | retired
 | + | + | + | | | Sevil | retired
UPGR07 | + | + | + | + | | Maxim |
 | + | + | + | + | | Maxim |
 | + | + | + | | | Gerrit | retired; Outer IST layer only
UPGR10 | + | + | + | | | Gerrit | Inner IST@9.5cm
UPGR11 | + | + | + | | | Gerrit | IST @9.5&@17.0
 | + | + | + | + | + | Ross Corliss | retired; UPGR05*diff.igt.radii
UPGR13 | + | + | + | + | | Gerrit | UPGR07*(new 6 disk FGT)*corrected SSD*(no West Cone)
UPGR14 | + | | + | + | | Gerrit | UPGR13 - IST
UPGR15 | + | + | + | | | Gerrit | Simple geometry for testing: single IST@14cm, hermetic/polygon Pixel/IST geometry. Only inner beam pipe 0.5mm Be. Pixel 300um Si, IST 1236um Si
UPGR20 | + | | | | | Lijuan | Y2007 + one TOF
UPGR21 | + | | | | | Lijuan | UPGR20 + full TOF
Eta coverage of the SSD and HFT at different vertex spreads:

Z cut, cm | eta SSD | eta HFT
---|---|---
5 | 1.63 | 2.00
10 | 1.72 | 2.10
20 | 1.87 | 2.30
30 | 2.00 | 2.55
Material balance studies for the upgrade: presented below are the usual radiation length plots (as a function of rapidity).
Full UPGR05:
Forward region: the FST and the IGT ONLY:
Below, we plot the material for each individual detector, excluding the forward region to reduce ambiguity.
SSD:
IST:
HPD:
HFT:
The attached PDF describes event filtering in the STAR framework.
$ cvs co StRoot/StarGenerator/macros
$ ln -s StRoot/StarGenerator/macros/starsim.pythia8.C starsim.C
$ root4star -q -b starsim.C\(100\)

This will generate two files: a standard "fzd" file, which can be reconstructed using the big "full" chain (bfc.C), and a ROOT file containing a TTree expressing the event record for the generated events.
root [0] TFile::Open("pythia8.starsim.root")
root [1] genevents->StartViewer()
root [2] genevents->Draw("mMass","mStatus>0")

The event record contains both particle-wise and event-wise information. For the definitions of the different quantities, see the documentation provided in the StarGenEvent links above.
$ cvs co StRoot/StarGenerator

After checking out the generator area you will note that the code is organized into several directories, containing both CORE packages and concrete event generators. Specifically:
INTEGER          --> Int_t
REAL             --> Float_t
REAL*4           --> Float_t
REAL*8           --> Double_t
DOUBLE PRECISION --> Double_t

You probably noticed that there are two differences in the way we have declared the arrays. First, the arrays were all declared with an "_" in front of their names. This was a choice on my part, which I will explain in a moment. The important thing to notice right now is that the indices on the arrays are reversed compared to their declaration in Fortran: "INTEGER K(4000,5)" in Fortran becomes "Int_t _k[5][4000]" in C++. The reason for this is that C++ and Fortran represent arrays differently in memory. It is important to keep these differences in mind when mapping the memory of a Fortran common block --
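To make the layout concrete, here is a minimal standalone sketch (illustrative only, not STAR code) showing that the flat memory offset of the column-major Fortran element K(i,j) is addressed by the reversed, zero-based C++ indices:

```cpp
#include <cassert>

// Illustrative sketch (not STAR code).  Fortran stores INTEGER K(4000,5)
// column-major, so the 1-based element K(i,j) sits at flat word offset
// (j-1)*4000 + (i-1).  The reversed C++ declaration Int_t _k[5][4000] is
// row-major and places _k[j-1][i-1] at exactly the same offset.
static int _k[5][4000];

bool layoutsMatch(int i, int j, int value) {
  int* flat = &_k[0][0];
  flat[(j - 1) * 4000 + (i - 1)] = value;  // write where Fortran puts K(i,j)
  return _k[j - 1][i - 1] == value;        // read back with reversed indices
}
```

For example, layoutsMatch(10, 2, 42) confirms that Fortran's K(10,2) and C++'s _k[1][9] are the same word in memory.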
Event generators currently integrated into starsim using the root4star framework (11/29/12):
To run, checkout StRoot/StarGenerator/macros and modify the appropriate example ROOT macro for your purposes.
Event generators currently implemented in the starsim framework (11/29/12):
Material budget in the Y2013 (X) geometry. The top left plot shows the number of radiation lengths encountered by a straight track at the given eta, phi. The top right (bottom left) compares the ROOT and STARSIM geometries generated by AgML, plotted vs phi (eta), averaged over the other variable; ROOT geometry in black, STARSIM in red. The bottom right shows the difference between the ROOT and STARSIM geometries vs phi and eta. Less than 0.01 radiation lengths of difference was found integrated over the entire cave.
Attached are material budget plots and differences for the major subsystems. Each PDF contains material budget plots displaying the number of radiation lengths averaged over all phi for the ROOT (left) and STARSIM (right) geometries created by AgML. The material difference plot is as described above.
This page was created to systematize the various scripts currently used in Monte Carlo production and testing. The contents will be updated as needed; the codes are presumed to be correct and working at any given time.
When running on rcas, we typically use a legacy csh script named "alljobs". It parses the job configuration file named "catalog" and dispatches a single job on the target node, which can be an interactive node if run interactively, or a batch node if submitted to a queue. The alljobs script expects to see the following directory structure: a writeable directory with the name of the dataset being produced and, directly under it, a writeable "log" directory in which it deposits the so-called token files, which serve two purposes.
The catalog file is effectively a table in whitespace-separated format. Each line begins with the dataset name, which is a three-letter acronym of the site name (and thus either rcf or pds) followed by a 4-digit serial number of the set. The alljobs script expects to find a directory named identically to the dataset under the "job directory", which in the current version of the script is hardcoded as /star/simu/simu/gstardata. This, of course, can be improved or changed.
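For illustration, here is a hypothetical check (not part of the scripts) of the dataset-name convention just described — a three-letter site prefix, rcf or pds, followed by a 4-digit serial number:

```cpp
#include <cctype>
#include <string>

// Hypothetical validator (not part of alljobs): dataset names such as
// "rcf0101" or "pds0195" are a site prefix plus a 4-digit serial number.
bool isValidDataset(const std::string& name) {
  if (name.size() != 7) return false;
  const std::string site = name.substr(0, 3);
  if (site != "rcf" && site != "pds") return false;
  for (int i = 3; i < 7; ++i)
    if (!std::isdigit(static_cast<unsigned char>(name[i]))) return false;
  return true;
}
```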
The last field in each entry is used to construct the so-called tag, which plays an important role: it effectively defines the location of the Monte Carlo data in HPSS, when the data is archived there (this is done by a separate script). In addition, it also defines keys for the entries in the FileCatalog (reconstructed data). The alljobs script creates a file of zero length whose name is a period-separated concatenation of the word "tag" and the contents of the last column in the line.
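As a sketch (hypothetical helper, not the actual csh code), the token-file name described above can be derived from a catalog line by taking its last whitespace-separated field:

```cpp
#include <sstream>
#include <string>

// Hypothetical helper (not the actual alljobs script): build the name of
// the zero-length token file -- "tag." followed by the last
// whitespace-separated field of a catalog line.
std::string tagFileName(const std::string& catalogLine) {
  std::istringstream in(catalogLine);
  std::string field, last;
  while (in >> field) last = field;  // keep the last column
  return "tag." + last;
}
```

Applied to the rcf0101 line in the catalog listing below, this yields tag.auau200.nexus.default.b0_3.year_1h.hadronic_on.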
Here are the contents of the catalog file as it existed from the late 1990s to the end of 2006:
rcf0101 auau200/nexus/default/central evgen.*.nt auau200.nexus.default.b0_3.year_1h.hadronic_on
rcf0105 auau200/nexus/default/minbias evgen.*.nt auau200.nexus.default.minbias.year_1h.hadronic_on
pds0101 auau200/vni/default/central evgen.*.nt auau200.vni.default.b0_3.year_1h.hadronic_on
pds0102 auau200/vni/default/central evgen.*.nt auau200.vni.default.b0_3.year_1h.hadronic_on
pds0103 auau200/vni/default/central evgen.*.nt auau200.vni.default.b0_3.year_1h.hadronic_on
pds0104 auau200/vni/default/central evgen.*.nt auau200.vni.default.b0_3.year_1h.hadronic_on
rcf0096 auau200/vni/default/central evgen.*.nt auau200.vni.default.b0_3.year_1h.hadronic_on
pds0105 auau200/mevsim/vanilla_trigger/central evgen.*.nt auau200.mevsim.vanilla.trigger.year_1h.hadronic_on
rcf0097 auau200/mevsim/vanilla_resonance/central evgen.*.nt auau200.mevsim.vanilla.resonance.year_1h.hadronic_on
rcf0098 auau200/mevsim/vanilla_trigger/central evgen.*.nt auau200.mevsim.vanilla.trigger.year_1h.hadronic_on
rcf0095 auau200/mevsim/vanilla_flow/central evgen.*.nt auau200.mevsim.vanilla.flow.year_1h.hadronic_on
rcf0099 auau200/mevsim/vanilla_fluct/central evgen.*.nt auau200.mevsim.vanilla.fluct.year_1h.hadronic_on
rcf0102 auau200/mevsim/vanilla/central evgen.*.nt auau200.mevsim.vanilla.central.year_1h.hadronic_on
rcf0103 auau200/mevsim/vanilla/central evgen.*.nt auau200.mevsim.vanilla.central.year_1h.hadronic_on
rcf0104 auau200/mevsim/vanilla_flow/central evgen.*.nt auau200.mevsim.vanilla.flow.year_1h.hadronic_on
rcf0100 auau200/mevsim/cascade/central evgen.*.nt auau200.mevsim.cascade.central.year_1h.hadronic_on
rcf0106 auau200/hbt/default/peripheral evgen.*.nt auau200.hbt.default.peripheral.year_1h.hadronic_on
rcf0107 auau200/hbt/default/midperipheral evgen.*.nt auau200.hbt.default.midperipheral.year_1h.hadronic_on
rcf0108 auau200/hbt/default/middle evgen.*.nt auau200.hbt.default.middle.year_1h.hadronic_on
rcf0109 auau200/hbt/default/midcentral evgen.*.nt auau200.hbt.default.midcentral.year_1h.hadronic_on
rcf0110 auau200/hbt/default/central evgen.*.nt auau200.hbt.default.central.year_1h.hadronic_on
rcf0111 none hijing.*.xdf auau200.hijing.b0_3_jetq_on.jet05.year_1h.hadronic_on
rcf0112 none hijing.*.xdf auau200.hijing.b0_3_jetq_off.jet05.year_1h.hadronic_on
rcf0113 none hijing.*.xdf auau200.hijing.b8_15_jetq_on.jet05.year_1h.hadronic_on
rcf0114 none hijing.*.xdf auau200.hijing.b8_15_jetq_off.jet05.year_1h.hadronic_on
rcf0115 none hijing.*.xdf auau200.hijing.b0_3_jetq_on.jet05.year_1h.hadronic_on
rcf0116 none hijing.*.xdf auau200.hijing.b0_3_jetq_off.jet05.year_1h.hadronic_on
rcf0117 none hijing.*.xdf auau200.hijing.b8_15_jetq_on.jet05.year_1h.hadronic_on
rcf0118 none hijing.*.xdf auau200.hijing.b8_15_jetq_off.jet05.year_1h.hadronic_on
rcf0119 none hijing.*.xdf pau200_hijing_b0_7_jet15_year_1h.hadronic_on
rcf0120 none hijing.*.xdf pau200_hijing_b0_7_gam15_year_1h_hadronic_on
rcf0121 pec/dtunuc dtu*.xdr auau200.dtunuc.two_photon.none.year_1h.hadronic_on
rcf0122 pec/starlight starlight_vmeson_*.t auau200.starlight.vmeson.none.year_1h.hadronic_on
rcf0123 pec/starlight starlight_2gamma_*.nt auau200.starlight.2gamma.none.year_1h.hadronic_on
rcf0124 pec/hemicosm events.txt auau200.hemicosm.default.none.year_1h.hadronic_on
rcf0125 pec/dtunuc dtu*.xdr auau200.dtunuc.two_photon.halffield.year_1h.hadronic_on
rcf0126 pec/starlight starlight_vmeson_*.t auau200.starlight.vmeson.halffield.year_1h.hadronic_on
rcf0127 pec/starlight starlight_2gamma_*.t auau200.starlight.2gamma.halffield.year_1h.hadronic_on
rcf0131 pec/beamgas venus.h.*.nt auau200.hijing.beamgas.hydrogen.year_1h.hadronic_on
rcf0132 pec/beamgas venus.n.*.nt auau200.hijing.beamgas.nitrogen.year_1h.hadronic_on
rcf0139 none hijev.inp auau128.hijing.b0_12.halffield.year_1e.hadronic_on
rcf0140 none hijev.inp auau128.hijing.b0_3.halffield.year_1e.hadronic_on
rcf0141 auau200/strongcp/broken/eb_400_90 evgen.*.nt auau200.strongcp.broken.eb-400_90.year_1h.hadronic_on
rcf0142 auau200/strongcp/broken/eb_400_00 evgen.*.nt auau200.strongcp.broken.eb-400_00.year_1h.hadronic_on
rcf0143 auau200/strongcp/broken/lr_eb_400_90 evgen.*.nt auau200.strongcp.broken.lr_eb_400_90.year_1h.hadronic_on
rcf0145 none hijev.inp auau130.hijing.b0_3.jet05.year_1h.halffield.hadronic_on
rcf0146 none hijev.inp auau130.hijing.b0_15.default.year_1h.halffield.hadronic_on
rcf0147 none hijev.inp auau130.hijing.b0_3.default.year_1e.halffield.hadronic_on
rcf0148 none hijev.inp auau130.hijing.b3_6.default.year_1e.halffield.hadronic_on
rcf0151 auau130/mevsim/vanilla_flow/central evgen.*.nt auau130.mevsim.vanilla_flow.central.year_1e.hadronic_on
rcf0152 auau130/mevsim/vanilla_trigger/central evgen.*.nt auau130.mevsim.vanilla_trigger.central.year_1e.hadronic_on
rcf0153 auau130/mevsim/vanilla_dynamic/central evgen.*.nt auau130.mevsim.vanilla_dynamic.central.year_1e.hadronic_on
rcf0154 auau130/mevsim/vanilla_omega/central evgen.*.nt auau130.mevsim.vanilla_omega.central.year_1e.hadronic_on
rcf0155 auau130/mevsim/vanilla/central evgen.*.nt auau130.mevsim.vanilla.central.year_1e.hadronic_on
rcf0156 auau130/nexus/default/central evgen.*.nt auau130.nexus.default.b0_3.year_1e.hadronic_on
rcf0159 rqmd auau_b0-14.*.cwn auau200.rqmd.default.b0_14.year_1h.hadronic_on
rcf0160 rqmd auau_b0-15.*.cwn auau200.rqmd.default.b0_15.year_1h.hadronic_on
rcf0161 auau130/mevsim/vanilla_flow/central evgen.*.nt auau130.mevsim.vanilla_flow.central.year_1h.hadronic_on
rcf0162 auau130/mevsim/vanilla_trigger/central evgen.*.nt auau130.mevsim.vanilla_trigger.central.year_1h.hadronic_on
rcf0163 auau130/mevsim/vanilla_dynamic/central evgen.*.nt auau130.mevsim.vanilla_dynamic.central.year_1h.hadronic_on
rcf0164 auau130/mevsim/vanilla_omega/central evgen.*.nt auau130.mevsim.vanilla_omega.central.year_1h.hadronic_on
rcf0165 auau130/mevsim/vanilla/central evgen.*.nt auau130.mevsim.vanilla.central.year_1h.hadronic_on
rcf0166 auau130/mevsim/vanilla_resonance/central evgen.*.nt auau130.mevsim.vanilla_resonance.central.year_1h.hadronic_on
pds0167 auau130/mevsim/vanilla_cocktail/central evgen.*.nt auau130.mevsim.vanilla_cocktail.central.year_1h.hadronic_on
rcf0168 auau130/mevsim/vanilla_flow/mbias evgen.*.nt auau130.mevsim.vanilla_flow.minbias.year_1h.hadronic_on
rcf0169 auau130/mevsim/vanilla_flowb/central evgen.*.nt auau130.mevsim.vanilla_flowb.central.year_1h.hadronic_on
rcf0171 auau130/mevsim/vanilla_lambda_antilambda/central evgen.*.nt auau130.mevsim.vanilla_both_lambda.central.year_1h.hadronic_on
rcf0172 auau130/mevsim/vanilla_lambda/central evgen.*.nt auau130.mevsim.vanilla_lambda.central.year_1h.hadronic_on
rcf0173 auau130/mevsim/vanilla_antilambda/central evgen.*.nt auau130.mevsim.vanilla_antilambda.central.year_1h.hadronic_on
rcf0181 auau200/mevsim/mdc4/central evgen.*.nt auau200.mevsim.mdc4_cocktail.central.year2001.hadronic_on
pds0182 auau200/mevsim/mdc4/central evgen.*.nt auau200.mevsim.mdc4_cocktail.central.year2001.hadronic_on
rcf0183 none hijev.inp auau200.hijing.b0_20.standard.year2001.hadronic_on
rcf0184 none hijev.inp auau200.hijing.b0_3.standard.year2001.hadronic_on
rcf0190 auau200/mevsim/mdc4_electrons evgen.*.nt auau200.mevsim.mdc4_electrons.year2001.hadronic_on
rcf0191 none hijev.inp auau200.hijing.b0_20.inverse.year2001.hadronic_on
rcf0192 none hijev.inp auau200.hijing.b0_3.inverse.year2001.hadronic_on
rcf0193 none hijev.inp dau200.hijing.b0_20.standard.year_2a.hadronic_on
# Maxim has arrived:
# the following two runs had the 1 6 setting for the hard scattering and energy
rcf0194 none hijev.inp dau200.hijing.b0_20.jet06.year2003.hadronic_on
pds0195 none hijev.inp dau200.hijing.b0_20.jet06.year2003.hadronic_on
# this one had 1 3
rcf0196 none hijev.inp dau200.hijing.b0_20.jet03.year2003.hadronic_on
# standard 0 2 setting
rcf0197 none hijev.inp dau200.hijing.b0_20.jet02.year2003.hadronic_on
# new numbering
rcf1197 none hijev.inp dau200.hijing.b0_20.minbias.year2003.hadronic_on
rcf1198 dau200/hijing_382/b0_20/minbias evgen.*.nt dau200.hijing_382.b0_20.minbias.year2003.gheisha_on
# dedicated wide Z run
rcf1199 dau200/hijing_382/b0_20/minbias evgen.*.nt dau200.hijing_382.b0_20.minbias_wideZ.year2003.hadronic_on
# Pythia
rcf1200 none pyth.dat pp200.pythia6_203.default.minbias.year2003.hadronic_on
# Heavy flavor embedding with full calorimeter
rcf1201 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.y2003x.gheisha_on
# Pythia hi Pt>5
rcf1202 none pyth.dat pp200.pythia6_203.default.pt5.year2003.gheisha_on
# Mevsim fitted to 200GeV AuAu
rcf1203 auau200/mevsim/v2/central_6 evgen.*.nt auau200.mevsim.v2.b0_6.year_1e.gheisha_on
# Mevsim fitted to 200GeV AuAu, different geo
rcf1204 auau200/mevsim/v2/central_6 evgen.*.nt auau200.mevsim.v2.b0_6.year2001.gheisha_on
# Pythia hi Pt>15
rcf1205 none pyth.dat pp200.pythia6_203.default.pt15.year2003.gheisha_on
# Starsim maiden voyage, with y2004, 62.4 GeV
rcf1206 auau62/hijing_382/b0_20/minbias evgen.*.nt auau62.hijing_382.b0_20.minbias.y2004.gheisha_on
# 62.4 GeV central
rcf1207 auau62/hijing_382/b0_3/central evgen.*.nt auau62.hijing_382.b0_3.central.y2004a.gheisha_on
pds1207 auau62/hijing_382/b0_3/central evgen.*.nt auau62.hijing_382.b0_3.central.y2004a.gheisha_on
# 200 GeV minbias
rcf1208 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.y2004a.gheisha_on
# 200 GeV central
rcf1209 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.y2004a.gheisha_on
# Pythia
rcf1210 none pyth.dat pp200.pythia6_203.default.minbias.y2004a.gheisha_on
# Pythia Spin group
rcf1211 none pyth.dat pp200.pythia6_203.default.minbias.y2004x.gheisha_on
# Pythia Spin group
pds1212 none pyth.dat pp200.pythia6_203.default.pt3.y2004x.gheisha_on
# Pythia Spin group
rcf1213 none pyth.dat pp200.pythia6_205.default.pt7.y2004x.gheisha_on
# Pythia Spin group
pds1214 none pyth.dat pp200.pythia6_203.default.pt15.y2004x.gheisha_on
# 200 GeV minbias special D decays
rcf1215 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.speciald.y2004a.gheisha_on
# 200 GeV minbias copper
rcf1216 cucu200/hijing_382/b0_14/minbias evgen.*.nt cucu200.hijing_382.b0_14.minbias.y2005x.gheisha_on
# 200 GeV minbias copper test
rcf1217 cucu200/hijing_382/b0_14/minbias evgen.*.nt cucu200.hijing_382.b0_14.minbias.y2004a.gheisha_on
# 200 GeV central reprise of 1209, smaller diamond
rcf1218 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.y2004a.gheisha_on
# Pythia Special 1
rcf1219 none pyth.dat pp200.pythia6_203.default.special1.y2004c.gheisha_on
# Pythia Special 2 (CDF A)
rcf1220 none pyth.dat pp200.pythia6_203.default.special2.y2004c.gheisha_on
# Pythia Special 3
rcf1221 none pyth.dat pp200.pythia6_203.default.special3.y2004c.gheisha_on
# Pythia Special 2 4<Pt<5 Gheisha
rcf1222 none pyth.dat pp200.pythia6_203.default.special2.y2004y.gheisha_on
# Pythia Special 2 4<Pt<5 GCALOR
rcf1223 none pyth.dat pp200.pythia6_203.default.special2.y2004y.gcalor_on
# Pythia Special 2 (CDF A) 5-7 GeV 6/28/05
rcf1224 none pyth.dat pp200.pythia6_205.5_7gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) 7-9 GeV 6/28/05
rcf1225 none pyth.dat pp200.pythia6_205.7_9gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) 9-11 GeV 6/28/05
rcf1226 none pyth.dat pp200.pythia6_205.9_11gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) 11-15 GeV 6/28/05
rcf1227 none pyth.dat pp200.pythia6_205.11_15gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) 15-25 GeV 6/29/05
rcf1228 none pyth.dat pp200.pythia6_205.15_25gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) 25-35 GeV 6/29/05
rcf1229 none pyth.dat pp200.pythia6_205.25_35gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) > 35 GeV 6/29/05
rcf1230 none pyth.dat pp200.pythia6_205.above_35gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) 2-3 GeV 6/29/05
rcf1231 none pyth.dat pp200.pythia6_205.2_3gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) 3-4 GeV 6/30/05
rcf1232 none pyth.dat pp200.pythia6_205.3_4gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) 4-5 GeV 6/30/05
rcf1233 none pyth.dat pp200.pythia6_205.4_5gev.cdf_a.y2004y.gheisha_on
# Pythia Special 6/30/05
rcf1234 none pyth.dat pp200.pythia6_205.low_energy.cdf_a.y2004y.gheisha_on
# Pythia min bias 9/06/05
rcf1235 none pyth.dat pp200.pythia6_205.min_bias.cdf_a.y2004y.gheisha_on
# Herwig 5-7 GeV 9/07/05
rcf1236 pp200/herwig6507/pt_5_7 evgen.*.nt pp200.herwig6507.5_7gev.special1.y2004y.gheisha_on
# 62.4 GeV minbias copper
rcf1237 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.minbias.y2005c.gheisha_on
# 100 pi0 per event SVT in
rcf1238 none run1238.kumac pi0.100per_event.200mev_15gev.svtt_on.y2005x.gheisha_on
# 100 pi0 per event SVT out
rcf1239 none run1239.kumac pi0.100per_event.200mev_15gev.svtt_off.y2005x.gheisha_on
# 10 J/psi per event SVT in
rcf1240 none run1240.kumac jpsi.10per_event.500mev_3gev.svtt_on.y2005x.gheisha_on
# 10 J/psi per event SVT out
rcf1241 none run1241.kumac jpsi.10per_event.500mev_3gev.svtt_off.y2005x.gheisha_on
# 62.4 GeV minbias copper low EM cut
rcf1242 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.low_em.y2005c.gheisha_on
# 62.4 GeV minbias copper low EM and keep tracks
rcf1243 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.low_em_keep.y2005c.gheisha_on
# Herwig 9-11 GeV 10/13/05
rcf1244 pp200/herwig6507/pt_9_11 evgen.*.nt pp200.herwig6507.9_11gev.special1.y2004y.gheisha_on
# Herwig 11-15 GeV 10/13/05
rcf1245 pp200/herwig6507/pt_11_15 evgen.*.nt pp200.herwig6507.11_15gev.special1.y2004y.gheisha_on
# Herwig 15-25 GeV 10/13/05
rcf1246 pp200/herwig6507/pt_15_25 evgen.*.nt pp200.herwig6507.15_25gev.special1.y2004y.gheisha_on
# Herwig 25-35 GeV 10/13/05
rcf1247 pp200/herwig6507/pt_25_35 evgen.*.nt pp200.herwig6507.25_35gev.special1.y2004y.gheisha_on
# Herwig 35-45 GeV 10/13/05
rcf1248 pp200/herwig6507/pt_35_45 evgen.*.nt pp200.herwig6507.35_45gev.special1.y2004y.gheisha_on
# 200 GeV minbias
rcf1249 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.y2005d.gheisha_on
#
# New Herwig Wave
#
# Herwig 9-11 GeV new header 12/14/05
rcf1250 pp200/herwig6507/pt_9_11 evgen.*.nt pp200.herwig6507.9_11gev.special3.y2004y.gheisha_on
# Herwig 11-15 GeV new header 11/10/05
rcf1251 pp200/herwig6507/pt_11_15 evgen.*.nt pp200.herwig6507.11_15gev.special3.y2004y.gheisha_on
# Herwig 15-25 GeV new header 12/19/05
rcf1252 pp200/herwig6507/pt_15_25 evgen.*.nt pp200.herwig6507.15_25gev.special3.y2004y.gheisha_on
# Herwig 25-35 GeV new header 12/19/05
rcf1253 pp200/herwig6507/pt_25_35 evgen.*.nt pp200.herwig6507.25_35gev.special3.y2004y.gheisha_on
# Herwig 35-100 GeV new header 12/19/05
rcf1254 pp200/herwig6507/pt_35_100 evgen.*.nt pp200.herwig6507.35_100gev.special3.y2004y.gheisha_on
# Herwig 2-3 GeV new header 12/14/05
rcf1255 pp200/herwig6507/pt_2_3 evgen.*.nt pp200.herwig6507.2_3gev.special3.y2004y.gheisha_on
# Herwig 3-4 GeV new header 12/14/05
rcf1256 pp200/herwig6507/pt_3_4 evgen.*.nt pp200.herwig6507.3_4gev.special3.y2004y.gheisha_on
# Herwig 4-5 GeV new header 12/21/05
rcf1257 pp200/herwig6507/pt_4_5 evgen.*.nt pp200.herwig6507.4_5gev.special3.y2004y.gheisha_on
# Herwig 5-7 GeV new header 12/21/05
rcf1258 pp200/herwig6507/pt_5_7 evgen.*.nt pp200.herwig6507.5_7gev.special3.y2004y.gheisha_on
# Herwig 7-9 GeV new header 12/21/05
rcf1259 pp200/herwig6507/pt_7_9 evgen.*.nt pp200.herwig6507.7_9gev.special3.y2004y.gheisha_on
#
# Heavy flavor embedding with full calorimeter
rcf1260 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.y2004y.gheisha_on
# 200 GeV minbias copper
rcf1261 cucu200/hijing_382/b0_14/minbias evgen.*.nt cucu200.hijing_382.b0_14.minbias.y2006.gheisha_on
# 62.4 GeV minbias copper
rcf1262 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.minbias.y2006.gheisha_on
#
#
# Specialized tracking studies
#
# 62.4 GeV minbias copper low EM and keep tracks
rcf1263 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.low_em_keep.y2005d.gheisha_on
# Same as prev, distortion
rcf1264 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.distort.y2005d.gheisha_on
# Same as prev, distortion with clams
rcf1265 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.clamdist.y2005d.gheisha_on
# Same as prev, clams and two ladders offset
rcf1266 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.clamlad.y2005d.gheisha_on
# Individual ladder offsets
rcf1267 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.indilad.dev2005.gheisha_on
# Global ladder tilts
rcf1268 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.ladtilt.dev2005.gheisha_on
# Individual ladder tilts
rcf1269 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.indtilt.dev2005.gheisha_on
#
#
# Spin PWG requests:
# Pythia Special 2 (CDF A) 45-55 GeV 5/09/06
rcf1270 none pyth.dat pp200.pythia6_205.45_55gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) 55-65 GeV 5/10/06
rcf1271 none pyth.dat pp200.pythia6_205.55_65gev.cdf_a.y2004y.gheisha_on
#
rcf1272 cucu200/hijing_382/b0_14/minbias evgen.*.nt cucu200.hijing_382.b0_14.D0minbias.y2006.gheisha_on
#
# Pythia Special 2 (CDF A) 0-2 GeV 7/20/06
rcf1273 none pyth.dat pp200.pythia6_205.0_2gev.cdf_a.y2004y.gheisha_on
# UPGR02 eta+-1.5
rcf1274 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr02.gheisha_on
#
# Pythia min bias 7/27/06
rcf1275 none pyth.dat pp200.pythia6_205.minbias.cdf_a.y2006.gheisha_on
#
# UPGR05
rcf1276 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr05.gheisha_on
#
# UPGR05 wide diamond (60,300)
rcf1277 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.wide.upgr05.gheisha_on
# UPGR07 wide diamond (60,300)
rcf1278 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.wide.upgr07.gheisha_on
# UPGR07
rcf1279 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr07.gheisha_on
# UPGR01
rcf1280 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr01.gheisha_on
# UPGR08
rcf1281 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr08.gheisha_on
# UPGR06
rcf1282 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr06.gheisha_on
# UPGR09
rcf1283 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr09.gheisha_on
# UPGR09 central
rcf1284 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.upgr09.gheisha_on
# UPGR10
rcf1285 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr10.gheisha_on
# UPGR10 central
rcf1286 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.upgr10.gheisha_on
# UPGR11
rcf1287 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr11.gheisha_on
# UPGR11 central
rcf1288 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.upgr11.gheisha_on
# UPGR06 central
rcf1289 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.upgr06.gheisha_on
# UPGR07
rcf1290 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.upgr07.gheisha_on
Here is the actual version of the file used in the 2007 runs:
e w en b jq geom
# UPGR07 01/17/2007 ISUB = 11, 12, 13, 28, 53, 68
rcf1291 none pyth.dat pp200.pythia6_205.special.diamond10.upgr07.gheisha_on
# UPGR07 01/17/2007 ISUB = 11, 12, 13, 28, 53, 68
rcf1292 none pyth.dat pp500.pythia6_205.special.diamond10.upgr07.gheisha_on
# UPGR07 01/17/2007 ISUB = 11, 12, 13, 28, 53, 68
rcf1293 none pyth.dat pp200.pythia6_205.special.diamond30.upgr07.gheisha_on
# UPGR07 01/17/2007 ISUB = 11, 12, 13, 28, 53, 68
rcf1294 none pyth.dat pp500.pythia6_205.special.diamond30.upgr07.gheisha_on
# Min bias gold, pilot run for 2007
rcf1295 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.y2007.gheisha_on
# Central auau200 + B-mixing Central auau200 + Upsilon (S1,S2,S3) mixing
rcf1296 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.y2007.gheisha_on
# Minbias for TUP (wide vertex)
rcf1297 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr13.gheisha_on
#
#
# Central auau200 + D0-mixing, UPGR13
rcf1298 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.upgr13.gheisha_on
# Min bias Pythia
rcf1299 none pyth.dat pp200.pythia6_205.minbias.cdf_a.y2005.gheisha_on
# Pythia, UPGR13
rcf1300 none pyth.dat pp200.pythia6_205.charm.cdf_a.upgr13.gheisha_on
# Pythia wide diamond
rcf1301 none pyth.dat pp200.pythia6_205.minbias.wide.upgr13.gheisha_on
# Pythia
rcf1302 none pyth.dat pp200.pythia6_410.45_55gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1303 none pyth.dat pp200.pythia6_410.35_45gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1304 none pyth.dat pp200.pythia6_410.55_65gev.cdf_a.y2006c.gheisha_on
# Placeholder XXXXXXXXXXX
rcf1305 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.y2007.gheisha_on
# Pythia
rcf1306 none pyth.dat pp200.pythia6_410.25_35gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1307 none pyth.dat pp200.pythia6_410.15_25gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1308 none pyth.dat pp200.pythia6_410.11_15gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1309 none pyth.dat pp200.pythia6_410.9_11gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1310 none pyth.dat pp200.pythia6_410.7_9gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1311 none pyth.dat pp200.pythia6_410.5_7gev.cdf_a.y2006c.gheisha_on
# Pythia CKIN(3)=7, CKIN(4)=9, CKIN(7)=0.0, CKIN(8)=1.0, CKIN(27)=-0.4, CKIN(28)=0.4
rcf1312 none pyth.dat pp200.pythia6_410.7_9gev.bin1.y2004y.gheisha_on
# Pythia CKIN(3)=9, CKIN(4)=11, CKIN(7)=-0.4, CKIN(8)=1.4, CKIN(27)=-0.5, CKIN(28)=0.6
rcf1313 none pyth.dat pp200.pythia6_410.9_11gev.bin2.y2004y.gheisha_on
# Pythia CKIN(3)=11, CKIN(4)=15, CKIN(7)=-0.2, CKIN(8)=1.2, CKIN(27)=-0.6, CKIN(28)=-0.3
rcf1314 none pyth.dat pp200.pythia6_410.11_15gev.bin3.y2004y.gheisha_on
# Pythia CKIN(3)=11, CKIN(4)=15, CKIN(7)=-0.5, CKIN(8)=1.5, CKIN(27)=-0.3, CKIN(28)=0.4
rcf1315 none pyth.dat pp200.pythia6_410.11_15gev.bin4.y2004y.gheisha_on
# Pythia CKIN(3)=11, CKIN(4)=15, CKIN(7)=0.0, CKIN(8)=1.0, CKIN(27)=0.4, CKIN(28)=0.7
rcf1316 none pyth.dat pp200.pythia6_410.11_15gev.bin5.y2004y.gheisha_on
# Pythia
rcf1317 none pyth.dat pp200.pythia6_410.4_5gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1318 none pyth.dat pp200.pythia6_410.3_4gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1319 none pyth.dat pp200.pythia6_410.minbias.cdf_a.y2006c.gheisha_on
# Pythia
rcf1320 none pyth.dat pp62.pythia6_410.4_5gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1321 none pyth.dat pp62.pythia6_410.3_4gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1322 none pyth.dat pp62.pythia6_410.5_7gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1323 none pyth.dat pp62.pythia6_410.7_9gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1324 none pyth.dat pp62.pythia6_410.9_11gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1325 none pyth.dat pp62.pythia6_410.11_15gev.cdf_a.y2006c.gheisha_on
# Pythia Special 2 (CDF A) 2-3 GeV 6/29/05
pds1231 none pyth.dat pp200.pythia6_205.2_3gev.cdf_a.y2004y.gheisha_on
When submitting jobs on the Grid, most of the functionality in alljobs is redundant. The simplified scripts can be found in the "Grid-Friendly" section of this site.
#! /usr/local/bin/tcsh -f
#
# remove the old list of files
if( -e process.list ) then
rm process.list
endif
#
if( -e filter.kumac ) then
rm filter.kumac
endif
# build the list of numeric run indices from the gstar.<N>.fz filenames
ls gstar.*.fz | sed -e 's/[gstar.|.fz]//g' | sort -n > process.list
#
# clean the trash bin before the next run, re-create
rm -fr trash
mkdir trash
echo `du --block-size=1000K -s | cut -f1` MB in the current directory
echo `df --block-size=1000K . | tail -1 | sed -e 's/\ *[0-9]*\ *[0-9]*\ *//' | sed -e 's/\ .*//g'` MB available on disk
cat<<EOF>>filter.kumac
macro filter name
input='gstar'
mess Start with filenames [input].*.fz, converting to [name]
ag/version batch
option stat
option date
option nbox
filecase keep
pwd =\$shell('pwd');
nfiles=\$shell('cat process.list | wc -l | sed -e "s/[\ ]*//g"');
message Starting to process [nfiles]
* trace on
ve/cr runs([nfiles]) I
ve/read runs process.list
ve/pri runs
if (\$Len([name]).eq.0) then
message cannot define current directory in [pwd]
exit
endif
namz=[name]
out =\$env('OUTDIR')
if ([out].ne.'') then
namz = [out]/[name]/[name]
endif
lenb = 1000
message reading
ve/cr id(3) I
* ve/read id N
message reading complete
nt=[nfiles] | total number of files to process
n1=runs(1) | first input file
n2=runs([nfiles]) | last input file
mm = 0 | number of output files
nn = 0 | number of processed files
cnt = 0 | total number of events in this job
cno = 0 | number of events when output has been opened
nev = 0 | number of events in this output
ii = 0 | input active flag
io = 0 | output active flag
len0= 1200 | minimum output file len
len1= [len0]+200 | average output file len - stop at end-of-file
len2= [len1]+200 | maximum output file len - stop always
ni = [n1] | first input file
no = 0 | skip up to this file
nd = [n1] | file to delete
ntrig = 10
*
if (\$fexist(nn).gt.0) then
ve/read id nn
na=id(1); message [na] input files already done
no=id(2); message first input files up to gstar.[no]
mm=id(3); message first output files up to [name].[mm]
mm=[mm]-1;
endif
*
hist = [name].his
if (\$fexist([hist]).gt.0) then
shell mv [hist] old.his
* call HRGET(0,\$quote([hist]),' ')
endif
ghist [hist]
cdir //pawc
mdir cont
if (\$fexist(old.his).gt.0) then
call HRGET(0,\$quote(old.his),' ')
endif
gfile p gstar.[n1].fz
mode control prin 1 hist 0 | simu 2
gexec ../.lib/control.sl
gexec ../.lib/index.sl
message loaded libs
title=merging runs [n1]-[n2] in [name]
fort/file 66 [name].ps; meta 66 -111
next; dcut cave x .1 10 10 .03 .03
Set DMOD 1; Igset TXFP -60; Igset CHHE .35
ITX 5 19.5 \$quote([title])
ITX .5 .1 \$quote([pwd])
*
* do ni = [ni],[n2]
frst=1
ag/version interactive
do iev=1,1000000000000
* new input file ?
if ([ii].eq.0) then
do nfoo=[frst],[nfiles]
ni = runs([nfoo])
file = [input].[ni].fz
filz = [input].[ni].fz.gz
hist = [input].[ni].his
message processing index [nfoo] out of [nfiles]
ve/print runs([nfoo])
*
if (\$fexist([file]).gt.0) then
message loop with [file]
gfile p [file]
if (\$iquest(1).eq.0) then
ii = 1
nn = [nn]+1
if (\$fexist([hist]).gt.0) then
if (\$hexist(-1).eq.0) then
call HRGET(0,\$quote([hist]),' ')
else
call HRGET(0,\$quote([hist]),'A')
endif
endif
call indmes(\$quote([file]))
goto nextf
* iquest:
endif
* fexist:
endif
enddo
goto nexto
endif
nextf:
* new output file ?
if ([io].eq.0) then
mm = [mm]+1
if ([mm].lt.10) then
output=[namz]_0[mm]
else
output=[namz]_[mm]
endif
io = 1
cno = [cnt]
gfile o [output].fzt
iname = [name]_[mm].fzt
call indmes(\$quote([iname]))
endif
* processing next event
call rzcdir('//SLUGRZ',' ')
trig [ntrig]
evt = \$iquest(99)
if (\$iquest(1).ne.0) then
ni = [ni]+1
frst=[frst]+1
ii = 0
endif
if ([ii].eq.0) goto nexto
* get output file length in MB:
cmd = ls -s [output].fzt
len = \$word(\$shell([cmd]))
len = [len]/[lenb]
* mess wrquest len=[len] ii=[ii] evt=[evt]
if ([len].lt.[len0]) goto nextev
if ([len].lt.[len1] .and. [ii].gt.0) goto nextev
if ([len].lt.[len2] .and. [ii].gt.0 .and. [evt].eq.0) goto nextev
* output file done
nexto:
cnt = \$iquest(100)
if ([cnt]<0) then
cnt = 0
endif
nev = [cnt]-[cno]
io = 0
*
if ([nev].gt.0) then
if ([nev].lt.199999) then
* terminate last event, clear memory
call guout
call gtrigc
gfile o
* rename temp file into the final one:
cmv = mv [output].fzt [output]_[nev]evts.fzd
i = \$shell([cmv])
endif
endif
message files inp = [ni] out = [mm] cnt = [cnt] done
*
if ([ii].eq.0) then
nj = [ni] - 1 | this file was finished, ni is NEXT to read
mj = [mm] + 1 | this is next to start write after the BP
message writing breakpoint [nn] [ni] [mj]
ve/inp id [nn] [ni] [mj]
ve/write id nn i6
ntrig = 10
************************************
* moving files to TRASH
while ([nd].lt.[ni]) do
filed = [input].[nd].fz
alrun = *.[nd].*
if (\$fexist([filed]).gt.0) then
shell mv [alrun] trash/
endif
nd = [nd] + 1
endwhile
************************************
else
ntrig = [ntrig] + 1
endif
if ([ni].gt.[n2]) goto alldone
nextev:
enddo
* control histogram
alldone:
if ([nn].eq.[nt]) then
shell touch filter.done
endif
cdir //pawc
tit = files [n1] - [n2] in set [name]
title_global \$quote([tit])
next; size 20.5 26; zone 2 4;
hi/pl 11; hi/pl 12; hi/pl 13; hi/pl 14
if (\$hexist(1).gt.1) then
n/pl 1.ntrack; n/pl 1.Nvertx; n/pl 1.NtpcHit; n/pl 1.Ntr10
endif
swn 111 0 20 0 20; selnt 111
ITX 2.0 0.1 \$quote([pwd])
close 66; meta 0
physi
exit
return
EOF
echo ------------------------------------------------------------------
echo Activating starsim for dataset $1
$STAR_BIN/starsim -w 1 -g 40 -b ./filter.kumac $1
# cleanup
rm ZEBRA.O process.list nn index paw.metafile *.his *.ps filter.done filter.kumac
$ cvs co StRoot/StarGenerator/macros
$ cp StRoot/StarGenerator/macros/starsim.kinematics.C .

Running the macro is straightforward. To generate 100 events, simply do...

$ root4star
root [0] .L starsim.kinematics.C
root [1] int nevents = 100
root [2] starsim(nevents)

This will create an "fzd" file, which can be analyzed with the bfc.C macro as you normally would.
geometry("y2012");
command("gkine -4 0");
command("gfile o pythia6.starsim.fzd");
If you're familiar with the starsim interface, you probably recognize the arguments to the command function. These are KUIP commands used to steer the starsim application. You can use the gfile command to set the name of the output file, for example. The "gkine -4 0" command tells starsim how it should get the particles from the event generator (this should not be changed). Finally, the geometry function defined in the macro allows you to set the geometry tag you wish to simulate; it is the equivalent of the "DETP geom" command in starsim, so you may also pass the magnetic field, switch hadronic interactions on/off, etc. Any command which can be executed in starsim can be executed using the "command" function. This enables full control of the physics model, as well as the ability to print out hits, materials, and so on.
Let's take a quick look at the "KINE" event generator and how to configure it. StarKinematics is a special event generator, allowing us to inject particles into the simulation on an event-by-event basis during a simulation run. The "trig" function in this macro loops over a requested number of events, and pushes particles. Let's take a look at this function.
void trig( Int_t n=1 )
{
  for ( Int_t i=0; i<n; i++ ) {
    // Clear the chain from the previous event
    chain->Clear();
    // Generate 1 muon in the FGT range
    kinematics->Kine( 1, "mu-", 10.0, 50.0, 1.0, 2.0 );
    // Generate 4 muons flat in pT and eta
    kinematics->Kine( 4, "mu+", 0., 5., -0.8, +0.8 );
    // Generate 4 muons according to a PT and ETA distribution
    kinematics->Dist( 4, "mu-", ptDist, etaDist );
    // Generate the event
    chain->Make();
    // Print the event
    primary->event()->Print();
  }
}
The "kinematics" is a pointer to a StarKinematics object. There are three functions of interest to us:
[ 0| 0| -1] id= 0 Rootino stat=-201 p=( 0.000, 0.000, 0.000, 0.000; 510.000) v=( 0.0000, 0.0000, 0.000) [0 0] [0 0]
[ 1| 1| 1] id= 13 mu- stat=01 p=( 36.421, -7.940, 53.950, 65.576; 0.106) v=( 0.0181, -0.0905, 26.381) [0 0] [0 0]
[ 2| 2| 2] id= -13 mu+ stat=01 p=( -2.836, 3.258, 0.225, 4.326; 0.106) v=( 0.0181, -0.0905, 26.381) [0 0] [0 0]
[ 3| 3| 3] id= -13 mu+ stat=01 p=( -1.159, -4.437, -2.044, 5.022; 0.106) v=( 0.0181, -0.0905, 26.381) [0 0] [0 0]
[ 4| 4| 4] id= -13 mu+ stat=01 p=( -0.091, 1.695, -0.131, 1.706; 0.106) v=( 0.0181, -0.0905, 26.381) [0 0] [0 0]
[ 5| 5| 5] id= -13 mu+ stat=01 p=( 1.844, -0.444, 0.345, 1.931; 0.106) v=( 0.0181, -0.0905, 26.381) [0 0] [0 0]
[ 6| 6| 6] id= 13 mu- stat=01 p=( 4.228, -4.467, -3.474, 7.065; 0.106) v=( 0.0181, -0.0905, 26.381) [0 0] [0 0]
[ 7| 7| 7] id= 13 mu- stat=01 p=( -0.432, -0.657, 0.611, 1.002; 0.106) v=( 0.0181, -0.0905, 26.381) [0 0] [0 0]
[ 8| 8| 8] id= 13 mu- stat=01 p=( -0.633, -0.295, -0.017, 0.706; 0.106) v=( 0.0181, -0.0905, 26.381) [0 0] [0 0]
[ 9| 9| 9] id= 13 mu- stat=01 p=( 2.767, 0.517, 1.126, 3.034; 0.106) v=( 0.0181, -0.0905, 26.381) [0 0] [0 0]
The printout above illustrates the STAR event record. Each row denotes a particle in the simulation. The 0th entry (and any entry with a status code -201) is used to carry summary information about the configuration of the event generator. Multiple event generators can be run in the same simulation, and a Rootino is introduced into the event record to summarize their configuration. The three columns at the left hold the primary event id, the generator event id, and the idtruth id. The next column shows the PDG id of the particle, followed by the particle's name. The particle's status code is next, followed by the 4-momentum and mass, and the particle's start vertex. Finally, the last four columns denote the primary ids of the 1st and last mother particle and the 1st and last daughter particle.
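Purely as an illustration of the column layout (this is not STAR code, and in practice this information is read from the generator TTree rather than the printout), the three bracketed indices and the PDG id of one record line can be unpacked like this:

```cpp
#include <cstdio>

// Illustrative sketch only: unpack the leading columns of one event-record
// printout line, e.g. "[ 1| 1| 1] id= 13 mu- stat=01 ...".
// The fields are the primary event id, the generator event id,
// the idtruth id, and the particle's PDG id.
struct RecordIds { int primaryId, generatorId, idTruth, pdg; };

bool parseRecordIds(const char* line, RecordIds& r) {
    return std::sscanf(line, " [ %d| %d| %d] id= %d",
                       &r.primaryId, &r.generatorId, &r.idTruth, &r.pdg) == 4;
}
```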
The STAR event record is saved in a ROOT file at the end of the simulation run, allowing you to read back both particle-wise and event-wise information stored from the event generator and compare with reconstructed events. Here, the idtruth ID of the particle is useful, as it allows you to compare reconstructed tracks and hits with the particle which generated them.
Starsim
STARSIM is the legacy simulation package in STAR, implemented in Fortran and Mortran and utilizing GEANT3 as the concrete MC package. For documentation on how to run simulations using STARSIM, see
The simulation group is evolving the framework towards using Virtual Monte Carlo. As a first step, we have implemented a new event generator framework which will be compatible with the future VMC application. The new framework allows us to run jobs within root4star. In order to run simulations in the new framework, see
$ cvs co StRoot/StarGenerator/macros/starsim.kinematics.C            # check out the macro
$ ln -s StRoot/StarGenerator/macros/starsim.kinematics.C starsim.C   # create a link named starsim.C
$ root4star starsim.C                                                # run the code using STAR's version of ROOT

You're going to see a lot of output here, but in the end you'll have two output files: starsim.kinematics.root and starsim.kinematics.fzd
$ ls starsim.kinematics.fzd starsim.kinematics.root starsim.C StRoot/
These two files are the so-called "zebra" file (.fzd), containing the Monte Carlo hits, geometry, and other associated event information, and the event generator record (.root), containing a ROOT TTree which saves all of the particles generated by the primary event generator.
Once we have the output files, it's time to run them through the reconstruction chain. STAR's reconstruction code is steered using the "big full chain" macro bfc.C. For most jobs you'll want to provide BFC with three arguments: the number of events to produce, the set of chain options to run, and an input file. For more complicated tasks you're encouraged to ask questions on the STAR software list.
$ emacs runBfc.C   # Feel free to use your favorite editor instead of emacs

void runBfc() {
  gROOT->LoadMacro("bfc.C");                 // Load in BFC macro
  TString _file = "kinematics.starsim.fzd";  // This is our input file
  TString _chain;                            // We'll build this up
  _chain += "ry2012a ";        // Start by specifying the geometry tag (note the trailing space...)
  _chain += "AgML USExgeom ";  // Tells BFC which geometry package to use.  When in doubt, use agml.
  _chain += "fzin ";           // Tells BFC that we'll be reading in a zebra file.
  _chain += "TpcFastSim ";     // Runs TPC fast simulation
  _chain += "sti ittf ";       // Runs track finding and reconstruction using the "sti" tracker
  _chain += "cmudst ";         // Creates the MuDst file for output
  _chain += "geantout ";       // Saves the "geant.root" file
  bfc(10, _chain, _file );     // Runs the simulation chain
}

ctrl-x ctrl-s ctrl-x ctrl-q   # i.e. save and quit
$ root4star runBfc.C          # run the reconstruction job
$ ls -l
...
If all has gone well, you now have several files in your directory including the MuDst which you'll use in your analysis.
$ ls -1 *.root
kinematics.geant.root
kinematics.hist.root
kinematics.MuDst.root
kinematics.runco.root
kinematics.starsim.root
C. idTruth and qaTruth
During the first phase of the simulation job we had full access to the state of the simulated particles at every step as they propagated through the STAR detector. As particles propagate through active layers, the simulation package can register "hits" in those sensitive layers. These hits tell us how much energy was deposited, in which layer and at what location. They also save the association between the particle which deposited the energy and the resulting hit. This association is saved as the "idTruth" of the hit. It corresponds to the unique id (primary key) assigned to the particle by the simulation package. This idTruth value is exceedingly useful, as it allows us to compare important information between reconstructed objects and the particles which are responsible for them.
Global and Primary tracks contain two truth variables: idTruth and qaTruth. idTruth tells us which Monte Carlo track was the dominant contributor (i.e. provided the most TPC hits) to the track, while qaTruth tells us the percentage of hits which that particle provided. With idTruth you can look up the corresponding Monte Carlo track in the StMuMcTrack branch of the MuDst. In the event that idTruth is zero, no MC particle was responsible for the hits on the track.
With the MC track, you can compare the thrown and reconstructed kinematics of the track (pT, eta, phi, etc...).
The primary vertex also contains an idTruth, which can be used to access the corresponding Monte Carlo vertex in the StMuMcVertex branch of the MuDst.
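The dominant-contributor bookkeeping described above can be sketched in a few lines of standalone C++ (an illustration with made-up names, not the actual StMuDst code): count the hit idTruth values on a track, pick the MC track supplying the most hits, and report its share as a percentage.

```cpp
#include <map>
#include <vector>

// Hypothetical sketch of the idTruth/qaTruth bookkeeping, not STAR code:
// given the idTruth of every hit on a reconstructed track, the dominant
// contributor is the MC track supplying the most hits, and qaTruth is the
// percentage of the hits it supplied.  idTruth == 0 means no MC match.
struct TrackTruth { int idTruth; int qaTruth; };

TrackTruth dominantContributor(const std::vector<int>& hitIdTruth) {
    std::map<int, int> nHits;                 // MC track id -> hit count
    for (int id : hitIdTruth) ++nHits[id];
    TrackTruth best{0, 0};
    int bestN = 0;
    for (const auto& kv : nHits)
        if (kv.second > bestN) { best.idTruth = kv.first; bestN = kv.second; }
    if (!hitIdTruth.empty())
        best.qaTruth = 100 * bestN / static_cast<int>(hitIdTruth.size());
    return best;
}
```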
D. Taking it further
In starsim.kinematics.C we use the StarKinematics event generator, which allows you to push particles onto the simulation stack on an event-by-event basis. You can throw them flat in phase space, or sample them from a pT and eta distribution. These methods are illustrated in the macro, which throws muons and pions into the simulation. You can modify this to suit your needs, throwing whatever particles you want according to your own distributions. The list of available particles can be obtained from StarParticleData.
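As a rough standalone analogue of sampling from a pT distribution (an assumption-laden sketch, not the StarKinematics implementation, which samples whatever distributions the user supplies), one can draw pT from an assumed exponential spectrum by inverse transform:

```cpp
#include <cmath>
#include <random>

// Sketch only: draw a pT value from an assumed exponential spectrum
// dN/dpT ~ exp(-pT/T) by inverse-transform sampling.  The slope parameter
// T (GeV/c) is a made-up illustration value, not a STAR default.
double samplePt(std::mt19937& gen, double T = 0.35) {
    std::uniform_real_distribution<double> u(0.0, 1.0);
    return -T * std::log(1.0 - u(gen));   // inverse CDF of the exponential
}
```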
$ root4star starsim.C\(0\)
root [0] StarParticleData &data = StarParticleData::instance();
root [1] data.GetParticles().Print()
Additionally, you can define your own particles. See starsim.addparticle.C.
Primary Event Generation
The StMc package creates the StMcEvent structure and fills it with Monte Carlo information. It then creates the StMiniMcEvent structure, which contains both Monte Carlo and reconstruction information, and provides the matching between the two. This allows the user to estimate the quality of reconstruction and the reconstruction efficiency for different physics processes.
Strictly speaking, StMcEvent is redundant and exists for historical reasons.
StMc consists of:
Archive of old Simulation pages.
Decays
To use Hijing for simulation purposes, one must first run Hijing proper and generate event files, then feed these data to starsim to complete the GEANT part.
The Hijing event generator code and makefile can be found in the STAR code repository at the following location: $STAR/pams/gen/hijing_382. Once built, the executable is named hijjet.x. The input file is called hijev.inp and should be modified as per the user's needs. When the executable is run multiple times in the same directory, a series of files will be produced with names like evgen.XXX.nt, where XXX is an integer. The format of the files is a PAW Ntuple, which the starsim application is equipped to read as explained below. If a large number of events is needed, a request should be made to the STAR simulation leader or any member of the S&C.
Listed below is the KUMAC macro that can be used to run your own GEANT simulation with pre-fabricated Hijing events. Unlike the Pythia simulation, events aren't generated on the fly but are read from an external file instead. Look at the comments embedded in the code. Additional notes:
gfile o my_hijing_file.fz
detp geom y2006
make geometry
gclose all
* define a beam with 100um transverse sigma and 60cm sigma in Z
vsig 0.01 60.0
* introduce a cut on eta to avoid having to handle massive showers caused by spectators
gkine -1 0 0 100 -6.3 6.3 0 6.3 -30.0 30.0
gexec $STAR_LIB/gstar.so
us/inp hijing evgen.1.nt
* seed the random generator
rndm 13 17
* trigger - change to trigger the desired number of times
trig 10
gfile o my_pythia_file.fz
detp geom y2006
make geometry
gclose all
* define a beam with 100um transverse sigma and 60cm sigma in Z
vsig 0.01 60.0
* Cut on eta (+-6.3) to avoid having to handle massive showers caused by the spectators
* Cut on vertex Z (+-30 cm)
gkine -1 0 0 100 -6.3 6.3 0 6.29 -30.0 30.0
* load pythia
gexec $STAR_LIB/apythia.so
* specify parameters
ENER 200.0      ! Collision energy
MSEL 1          ! Collision type
MSUB (11)=1     ! Subprocess choice
MSUB (12)=1
MSUB (13)=1
MSUB (28)=1
MSUB (53)=1
MSUB (68)=1
*
* Make the following stable:
*
MDCY (102,1)=0  ! PI0 111
MDCY (106,1)=0  ! PI+ 211
* MDCY (109,1)=0 ! ETA 221
* MDCY (116,1)=0 ! K+ 321
* MDCY (112,1)=0 ! K_SHORT 310
MDCY (105,1)=0  ! K_LONG 130
*
* MDCY (164,1)=0 ! LAMBDA0 3122
* MDCY (167,1)=0 ! SIGMA0 3212
MDCY (162,1)=0  ! SIGMA- 3112
MDCY (169,1)=0  ! SIGMA+ 3222
MDCY (172,1)=0  ! Xi- 3312
MDCY (174,1)=0  ! Xi0 3322
MDCY (176,1)=0  ! OMEGA- 3334
* seed the random generator
rndm 13 19
* trigger - change to trigger the desired number of times
trig 10
gexec $STAR_LIB/apythia.so

With:

gexec $STAR_LIB/libpythia_6410.so
gexec $STAR_LIB/bpythia.so
It is possible to simulate the production and propagation of magnetic monopoles in the STAR experiment, using a few modifications to the code base of GEANT 3.21, and in particular to our GEANT-derived application, starsim. Our work is based on a few papers, including:
The flow of the GEANT code execution is illustrated by the following diagrams from the above publication:
Now, let's take a look at a minimum bias gold-gold event that contains a pair of magnetic monopoles:
Salient features can already be seen in these graphics: large dE/dx losses and a characteristic limit on the maximum radius of the recorded monopole track (due to the fact that the trajectory of the monopole is not helix-like, but rather parabola-like). Now, let's take a look at the phi distribution of the hits, for central and peripheral gold-gold events containing monopoles:
Again, the rather intuitive feature (large peaks in phi due to a very large dE/dx produced by the monopoles) is obviously borne out in the simulation.
This is work in progress and this page is subject to updates.
#!/usr/bin/ksh
echo commencing the simulation
export STAR=.
echo STAR=$STAR
#
run=$1
geom=Y2006C
ntrig=2000
diamond=60
z=120
# >> run.$run.log 2>&1
node=`uname -n`
echo run:$run geom:$geom ntrig:$ntrig diamond:$diamond z:$z node:$node pid:$$
./starsim -w 0 -g 40 -c trace on .<<EOF
trace on
RUNG $run 1 $$
RNDM $$ $run
gfile o gstar.$run.fz
detp geom $geom
vsig 0.01 $diamond
gexec $STAR/geometry.so
gexec $STAR/libpythia_6410.so
gexec $STAR/bpythia.so
gclose all
gkine -1 0 0 100 -6.3 6.3 0 6.28318 -$z $z
ENER 200.0
MSEL 1
CKIN 3=4.0
CKIN 4=5.0
MSTP (51)=7
MSTP (81)=1
MSTP (82)=4
PARP (82)=2.0
PARP (83)=0.5
PARP (84)=0.4
PARP (85)=0.9
PARP (86)=0.95
PARP (89)=1800
PARP (90)=0.25
PARP (91)=1.0
PARP (67)=4.0
MDCY (102,1)=0 ! PI0 111
MDCY (106,1)=0 ! PI+ 211
MDCY (109,1)=0 ! ETA 221
MDCY (116,1)=0 ! K+ 321
MDCY (112,1)=0 ! K_SHORT 310
MDCY (105,1)=0 ! K_LONG 130
MDCY (164,1)=0 ! LAMBDA0 3122
MDCY (167,1)=0 ! SIGMA0 3212
MDCY (162,1)=0 ! SIGMA- 3112
MDCY (169,1)=0 ! SIGMA+ 3222
MDCY (172,1)=0 ! Xi- 3312
MDCY (174,1)=0 ! Xi0 3322
MDCY (176,1)=0 ! OMEGA- 3334
trig $ntrig
exit
EOF
143575 2007-05-31 18:02:47 agetof
65743 2007-05-31 18:02:39 agetof.def
44591 2007-05-31 19:05:34 bpythia.so
5595692 2007-05-31 18:03:10 geometry.so
183148 2007-05-31 18:03:15 gstar.so
4170153 2007-05-31 19:05:27 libpythia_6410.so
0 2007-05-31 18:00:06 StarDb/
0 2007-05-31 18:00:59 StarDb/StMagF/
51229 2007-05-31 18:00:57 StarDb/StMagF/bfield_full_negative_2D.dat
2775652 2007-05-31 18:00:57 StarDb/StMagF/bfield_full_negative_3D.dat
51227 2007-05-31 18:00:57 StarDb/StMagF/bfield_full_positive_2D.dat
2775650 2007-05-31 18:00:58 StarDb/StMagF/bfield_full_positive_3D.dat
51227 2007-05-31 18:00:58 StarDb/StMagF/bfield_half_positive_2D.dat
2775650 2007-05-31 18:00:58 StarDb/StMagF/bfield_half_positive_3D.dat
1530566 2007-05-31 18:00:59 StarDb/StMagF/boundary_13_efield.dat
51231 2007-05-31 18:00:59 StarDb/StMagF/const_full_positive_2D.dat
1585050 2007-05-31 18:00:59 StarDb/StMagF/endcap_efield.dat
1530393 2007-05-31 18:00:59 StarDb/StMagF/membrane_efield.dat
15663993 2007-05-31 18:03:31 starsim
36600 2007-05-31 18:03:37 starsim.bank
1848 2007-05-31 18:03:42 starsim.logon.kumac
21551 2007-05-31 18:03:48 starsim.makefile
As of spring 2007, the Monte Carlo production is being run on three different platforms:
StarVMC/StarVMCApplication:
Example of setting the input file via StBFChain::ProcessLine:
((StVMCMaker *) 0xaeab6f0)->SetInputFile("/star/simu/simu/gstardata/evgenRoot/evgen.3.root");
In general, StBFChain sets various attributes of the makers.
New chain options must be added in BigFullChain.h