Simulation

Welcome to the Simulation Pages!

Please note that most of the material posted before October 2006 is located at the older web site, which we'll keep for reference for the time being. See the sections below for the most recent additions and information.

For making a new simulation production request, please consult the STAR Simulations Requests interface.

 

 

Adding a New Detector to STAR

The STAR Geometry Model

The STAR geometry is implemented in GEANT 3, which provides the geometry description to STAR's Monte Carlo application, starsim.   The GEANT 3 model is
implemented using the Advanced Geometry Interface for GSTAR (AGI) language.   AGI provides a flexible and robust framework in which detector
geometries can be quickly implemented.  STAR is currently migrating from the AGI language to a related framework called AgML.  AgML stands for
"Another Geometry Modelling Language."  It is based on XML, and is the preferred language in which new geometries should be implemented.   AgML
provides backwards compatibility with the AGI language, in order to continue supporting the starsim application as we transition to a new STAR virtual
Monte Carlo application. 

Geometry Definition

Users wishing to develop and integrate new detector models into the STAR framework will be interested in the following links:

Tracking Interface (Stv)

Exporting detector hits

  1. Implement a hit class based on StEvent/StHit (see the sketch after this list)
  2. Implement a hit collection
  3. Implement an iterator over your hit collection based on StEventUtilities/StHitIter
  4. Add your hit iterator to the StEventUtilities/StEventHitIter
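The sketch below illustrates step 1: a new hit class derived from StHit.  This is a hypothetical sketch only; StMyHit is a placeholder name, and the StHit base-class constructor signature should be verified against StEvent/StHit.h.

// Hypothetical hit class for a new detector, derived from StEvent/StHit.
// StMyHit is a placeholder name, not an existing STAR class.
#include "StEvent/StHit.h"

class StMyHit : public StHit {
public:
    StMyHit() {}
    StMyHit(const StThreeVectorF& position,  // global position of the hit
            const StThreeVectorF& error,     // position errors
            unsigned int hwPosition,         // packed hardware (volume) id
            float charge)                    // collected charge / energy
      : StHit(position, error, hwPosition, charge) {}

    ClassDef(StMyHit,1)
};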
     

 

Implementing a custom seed finder

ID Truth

ID truth is an identifier which enables us to determine which simulated particle was principally responsible for the creation of a hit in a detector, and eventually for the physics objects (clusters, points, tracks) which are formed from them.  The StHit class has a member function which takes two arguments:

  • idTru -- the primary key of the simulated particle, i.e. "track_p" in the g2t hit structure
  • qaTru -- the quality of the truth value, defined as the dominant contributor to the hit, cluster, point or track.
Implementation of ID truth begins in the slow simulator.  Here, every hit which is created should have an ID truth value assigned. 

When hits are built up into clusters, the clustering algorithm should set the idTruth value for the cluster based on the dominant contributor of the hits which make up the cluster.

When clusters are associated into space points, the point-finding algorithm should set the idTruth value for the point.  In the event that two clusters with different idTruth values are combined, you should set idTruth = 0.
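As an illustration, a dominant-contributor calculation might look like the following sketch.  The container of contributing hits is hypothetical, and the idTruth() and charge() accessors are assumed to follow StEvent/StHit.

// Sketch: choose idTruth/qaTruth for a cluster from its contributing hits.
// Assumes StHit provides idTruth() and charge(); container names are illustrative.
#include <map>
#include <vector>
#include "StEvent/StHit.h"

void setClusterTruth(const std::vector<StHit*>& hits, int& idTruth, int& qaTruth)
{
    std::map<int,float> charge;   // summed charge per contributing simulated track
    float total = 0;
    for (size_t i = 0; i < hits.size(); i++) {
        charge[ hits[i]->idTruth() ] += hits[i]->charge();
        total += hits[i]->charge();
    }
    idTruth = 0;
    qaTruth = 0;
    float best = 0;
    for (std::map<int,float>::const_iterator it = charge.begin(); it != charge.end(); ++it) {
        if (it->second > best) { best = it->second; idTruth = it->first; }
    }
    if (total > 0) qaTruth = int(100*best/total);   // quality: dominant fraction in percent
}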


Interface to Starsim

The interface between starsim and reconstruction is briefly outlined here.


Information about geometries used in production and which geometries to use in simulations may be found in the following links:

  • Existing Geometry Tags used in Production
  • The STAR Geometry in simulation & reconstruction contains useful information on the detector configurations associated with a unique geometry tag.  Production geometry tags state the year for which the tag is valid, and a letter indicating the revision level of the geometry.  For example, "y2009c" indicates the third revision of the 2009 configuration of the STAR detector.  Users wishing to run starsim in their private areas are encouraged to use the most recent revision for the year in which they want to compare to data.

Comparisons between the original AgSTAR model and the new AgML model of the detector may be found here:

AgML Project Overview and Readiness for 2012




HOWTO Use Geometries defined in AgML in STARSIM
AgML geometries are available for use in simulation using the "eval" libraries. 
$ starver eval
The geometries themselves are available in a special library, which is set up for backwards compatibility with starsim.  To use the geometries, load the "xgeometry.so" library in a starsim session, either interactively or in a macro:
starsim> detp geom y2012

starsim> gexe $STAR_LIB/xgeometry.so
starsim> gclos all
 
HOWTO Use Geometries defined in AgML in the Big Full Chain
AgML geometries may also be used in reconstruction.  To access them, the "agml" flag should be provided in the chain being run, e.g.
 
root [0] .L bfc.C
root [1] bfc(nevents,"y2012 agml ...", inputFile);

 

Geometry in Preparation: y2012

Major changes:

1. Support cone, FTPC, SSD and PMD removed
2. Inner Detector Support Module (IDSM) added
3. Forward GEM Tracker (FGTD) added
 
Use of AgML geometries within starsim:
 
$ starver eval
$ starsim
starsim> detp geom y2012
starsim> gexe $STAR_LIB/xgeometry.so
starsim> gclos all
 
Use of AgML geometries within the big full chain:
$ root4star
root [0] .L bfc.C
root [1] bfc(0,"y2012 agml ...",inputFile);
 

Current (10/24/2011) configuration of the IDSM with FGT inside --

 

 

Page maintained by Jason Webb <jwebb@bnl.gov>

 

AgML Example: The Beam Beam Counters

<Document file="StarVMC/Geometry/BbcmGeo/BbcmGeo.xml">
<!--
 Every AgML document begins with a Document tag, which takes a single "file"
 attribute as its argument.
 -->

<Module name="BbcmGeo" comment=" is the Beam Beam Counter Modules GEOmetry "  >
<!--
 The Module tag declares an AgML module.  The name should consist of a four
 letter acronym, followed by the word "geo" and possibly a version number.

 e.g. BbcmGeo, EcalGeo6, TpceGeo3a, etc...

 A mandatory comment attribute provides a short description of which detector
 is implemented by the module.
 -->

  <Created date="15 march 2002"   />
  <Author  name="Yiqun Wang"      />
  <!-- The Created and Author tags accept a free-form date and author, for the
      purposes of documentation. -->

  <CDE>AGECOM,GCONST,GCUNIT</CDE>
  <!-- The CDE tag provides some backwards compatibility features with starsim.
      AGECOM, GCONST and GCUNIT are fine for most modules. -->

  <Content>BBCM,BBCA,THXM,SHXT,BPOL,CLAD</Content>
  <!-- The Content tag should declare the names of all volumes which are
      declared in the detector module.  A comma-separated list.  -->

  <Structure name="BBCG"  >
    <var name="version"  />
    <var name="onoff(3)" />
    <var name="zdis(2)"  />
  </Structure>
  <!-- The Structure tag declares an AgML structure.  It is similar to a C
      struct, but has some important differences which will be illustrated
      later.  The members of a Structure are declared using the var tag.  By
      default, the type of a var will be a float.

      Arrays are declared by enclosing the dimensions of the array in
      parentheses.  Only 1D and 2D arrays are supported.  e.g.

      <var name="x(3)"     />   allowed
      <var name="y(3,3)"   />   allowed
      <var name="z(4,4,4)" />   not allowed

      Types may be declared explicitly using the type parameter as below.
      Valid types are int, float and char.  char variables should be limited
      to four-character strings for backwards compatibility with starsim.
      Arrays of chars are allowed, in which case you may treat the variable
      as a string of length Nx4, where N is the dimension of the array.
      -->

  <Structure name="HEXG">
    <var name="type"    type="float"  />
    <var name="irad"    type="float"  />
    <var name="clad"    type="float"  />
    <var name="thick"   type="float"  />
    <var name="zoffset" type="float"  />
    <var name="xoffset" type="float"  />
    <var name="yoffset" type="float"  />
  </Structure>

  <varlist type="float">
     actr,srad,lrad,ztotal,x0,y0,theta0,phi0,xtrip,ytrip,rtrip,thetrip,rsing,thesing
  </varlist>
  <!-- The varlist tag allows you to declare a list of variables of a stated type.
      The variables will be in scope for all volumes declared in the module.

      Variables may be initialized using the syntax
           var1/value1/ , var2/value2/, var3, var4/value4/ ...

      Arrays of 1 or 2 dimensions may also be declared.  The Assign tag may
      be used to assign values to the arrays:

      <Assign var="ARRAY" value="{1,2,3,4}" />
      -->

  <varlist type="int">I_trip/0/,J_sing/0/</varlist>

  <Fill  name="BBCG"    comment="BBC geometry">
    <var name="Version" value="1.0"              comment=" Geometry version "  />
    <var name="Onoff"   value="{3,3,3}"          comment=" 0 off, 1 west on, 2 east on, 3 both on: for BBC,Small tiles,Large tiles "  />
    <var name="zdis"    value="{374.24,-374.24}" comment=" z-coord from center in STAR (715/2+6*2.54+1=373.8) "  />
  </Fill>
  <!-- The members of a structure are filled inside of a Fill block.  The Fill
      tag specifies the name of the structure being filled, and accepts a
      mandatory comment for documentation purposes.

      The var tag is used to fill the members of the structure.  In this
      context, it accepts three arguments:  the name of the structure member,
      the value which should be filled, and a mandatory comment for
      documentation purposes.

      The names of variables, structures and structure members are case-
      insensitive.

      1D arrays are filled using a comma-separated list of values contained in
      curly brackets...

      e.g. value="{1,2,3,4,5}"

      2D arrays are filled using a comma- and semicolon-separated list of values

      e.g. value="{11,12,13,14,15;        This fills an array dimensioned
                   21,22,23,24,25;        as A(3,5)
                   31,32,33,34,35;}"
      -->

  <Fill name="HEXG" comment="hexagon tile geometry"  >
    <var name="Type"    value="1"     comment="1 for small hex tile, 2 for large tile "  />
    <var name="irad"    value="4.174" comment="inscribing circle radius =9.64/2*sin(60)=4.174 "  />
    <var name="clad"    value="0.1"   comment="cladding thickness "  />
    <var name="thick"   value="1.0"   comment="thickness of tile "  />
    <var name="zoffset" value="1.5"   comment="z-offset from center of BBCW (1), or BBCE (2) "  />
    <var name="xoffset" value="0.0"   comment="x-offset center from beam for BBCW (1), or BBCE (2) "  />
    <var name="yoffset" value="0.0"   comment="y-offset center from beam for BBCW (1), or BBCE (2) "  />
  </Fill>

  <Fill name="HEXG" comment="hexagon tile geometry"  >
    <var name="Type"    value="2"      comment="1 for small hex tile, 2 for large tile "  />
    <var name="irad"    value="16.697" comment="inscribing circle radius (4x that of small one) "  />
    <var name="clad"    value="0.1"    comment="cladding of tile "  />
    <var name="thick"   value="1.0"    comment="thickness of tile "  />
    <var name="zoffset" value="-1.5"   comment="z-offset from center of BBCW (1), or BBCE (2) "  />
    <var name="xoffset" value="0.0"    comment="x-offset center from beam for BBCW (1), or BBCE (2) "  />
    <var name="yoffset" value="0.0"    comment="y-offset center from beam for BBCW (1), or BBCE (2) "  />
  </Fill>

  <Use struct="BBCG"/>
  <!-- An important difference between AgML structures and C structs is that
      only one instance of an AgML structure is allowed in a geometry module,
      and there is no need for the user to create it... it is automatically
      generated.  The Fill blocks store multiple versions of this structure
      in an external name space.  In order to access the different versions
      of a structure, the Use tag is invoked.

      Use takes one mandatory attribute: the name of the structure to use.
      By default, the first set of values declared in the Fill block will
      be loaded, as above.

      The Use tag may also be used to select the version of the structure
      which is loaded.

      Example:
         <Use struct="hexg" select="type" value="2" />

      The above example loads the second version of the HEXG structure
      declared above.

      NOTE: The behavior of a structure is not well defined before the
            Use operator is applied.
      -->

  <Print level="1" fmt="'BBCMGEO version ', F4.2"  >
    bbcg_version
  </Print>
  <!-- The Print statement takes a print "level" and a format descriptor "fmt".  The
      format descriptor follows the Fortran formatting convention.

      (n.b. Print statements have not been implemented in ROOT export,
            as they utilize Fortran format descriptors)
   -->

  <!-- small kludge x10000 because ROOT will cast these to (int) before computing properties -->
  <Mixture name="ALKAP" dens="1.432"  >
    <Component name="C5" a="12" z="6"  w="5      *10000"  />
    <Component name="H4" a="1"  z="1"  w="4      *10000"  />
    <Component name="O2" a="16" z="8"  w="2      *10000"  />
    <Component name="Al" a="27" z="13" w="0.2302 *10000"  />
  </Mixture>
  <!-- Mixtures and Materials may be declared within the module... this one is not
      a good example, as there is a workaround being used to avoid some issues
      with ROOT vs GEANT compatibility. -->

  <Use struct="HEXG" select="type" value="1 "  />
     srad   = hexg_irad*6.0;
     ztotal = hexg_thick+2*abs(hexg_zoffset);

  <Use struct="HEXG" select="type" value="2 "  />
     lrad   = hexg_irad*6.0;
     ztotal = ztotal+hexg_thick+2*abs(hexg_zoffset);  <!-- hexg_zoffset is negative for Large (type=2) -->

  <!-- AgML has limited support for expressions, in the sense that anything which
      is not an XML tag is passed (with minimal parsing) directly to the C++
      or Mortran compiler.  A few things are notable in the above lines.

      (1) Lines may be optionally terminated by a ";", but...
      (2) There is no mechanism to break long lines across multiple lines.
      (3) The members of a structure are accessed using an "_", i.e.

          hexg_irad above refers to the IRAD member of the HEXG structure
          loaded by the Use tag.

      (4) Several intrinsic functions are available: abs, cos, sin, etc...
      -->

  <Create block="BBCM"  />
  <!-- The Create operator creates the volume specified in the "block"
      parameter.  When the Create operator is invoked, execution branches
      to the block of code for the specified volume.  In this case, the
      Volume named BBCM below. -->

  <If expr="bbcg_OnOff(1)==1|bbcg_OnOff(1)==3">

    <Placement block="BBCM" in="CAVE"
              x="0"
              y="0"
              z="bbcg_zdis(1)"/>
    <!-- After the volume has been Created, it is positioned within another
        volume in the STAR detector.  The mother volume may be specified
        explicitly with the "in" attribute.

        The position of the volume is specified using x, y and z attributes.

        An additional attribute, konly, is used to indicate whether or
        not the volume is expected to overlap another volume at the same
        level in the geometry tree.  konly="ONLY" indicates no overlap and
        is the default value.  konly="MANY" indicates overlap is possible.

        For more info on ONLY vs MANY, consult the geant 3 manual.
        -->

  </If>

  <If expr="bbcg_OnOff(1)==2|bbcg_OnOff(1)==3"  >
    <Placement block="BBCM" in="CAVE"
              x="0"
              y="0"
              z="bbcg_zdis(2)">
      <Rotation alphay="180"  />
    </Placement>
    <!-- Rotations are specified as additional tags contained within a
        Placement block of code.  The translation of the volume will
        be performed first, followed by any rotations, evaluated in
        the order given. -->

  </If>

  <Print level="1" fmt="'BBCMGEO finished'"></Print>

<!--
 Volumes are the basic building blocks in AgML.  They represent the un-
 positioned elements of a detector setup.  They are characterized by
 a material, medium, a set of attributes, and a shape.
 -->

<!--                      === V o l u m e  B B C M ===                      -->
<Volume name="BBCM" comment="is one BBC East or West module">

  <Material  name="Air" />
  <Medium    name="standard"  />
  <Attribute for="BBCM" seen="0" colo="7"  />
  <!-- The material, medium and attributes should be specified first.  If
      omitted, the volume will inherit the properties of the volume which
      created it.

      NOTE: Be careful when you reorganize a detector module.  If you change
            where a volume is created, you potentially change the properties
            which that volume inherits.
  -->

  <Shape type="tube"
     rmin="0"
     rmax="lrad"
     dz="ztotal/2" />
  <!-- After specifying the material, medium and/or attributes of a volume,
      the shape is specified.  The Shape is the only property of a volume
      which *must* be declared.  Further, it must be declared *after* the
      material, medium and attributes.

      Shapes may be any one of the basic 16 shapes in geant 3.  A future
      release will add extrusions and composite shapes to AgML.

      The actual volume (geant3, geant4, TGeo, etc...) will be created at
      this point.
      -->

  <Use struct="HEXG" select="type" value="1 "  />

  <If expr="bbcg_OnOff(2)==1|bbcg_OnOff(2)==3"  >
    <Create    block="BBCA"  />
    <Placement block="BBCA" in="BBCM"
           x="hexg_xoffset"
           y="hexg_yoffset"
           z="hexg_zoffset"/>
  </If>

  <Use struct="HEXG" select="type" value="2 "  />

  <If expr="bbcg_OnOff(3)==1|bbcg_OnOff(3)==3"  >

    <Create block="BBCA"/>
    <Placement block="BBCA" in="BBCM"
           x="hexg_xoffset"
           y="hexg_yoffset"
           z="hexg_zoffset"/>

  </If>

</Volume>

<!--                      === V o l u m e  B B C A ===                      -->
<Volume name="BBCA" comment="is one BBC Annulus module"  >
  <Material name="Air"  />
  <Medium name="standard"  />
  <Attribute for="BBCA" seen="0" colo="3"  />
  <Shape type="tube" dz="hexg_thick/2" rmin="hexg_irad" rmax="hexg_irad*6.0"  />

  x0=hexg_irad*tan(pi/6.0)
  y0=hexg_irad*3.0
  rtrip = sqrt(x0*x0+y0*y0)
  theta0 = atan(y0/x0)

  <Do var="I_trip" from="0" to="5"  >

    phi0 = I_trip*60
    thetrip = theta0+I_trip*pi/3.0
    xtrip = rtrip*cos(thetrip)
    ytrip = rtrip*sin(thetrip)

    <Create block="THXM"  />
    <Placement in="BBCA" y="ytrip" x="xtrip" z="0" konly="'MANY'" block="THXM"  >
      <Rotation thetaz="0" thetax="90" phiz="0" phiy="90+phi0" phix="phi0"  />
    </Placement>

  </Do>

</Volume>

<!--                      === V o l u m e  T H X M ===                      -->
<Volume name="THXM" comment="is one Triple HeXagonal Module"  >
  <Material name="Air"  />
  <Medium name="standard"  />
  <Attribute for="THXM" seen="0" colo="2"  />
  <Shape type="tube" dz="hexg_thick/2" rmin="0" rmax="hexg_irad*2.0/sin(pi/3.0)"  />

  <Do var="J_sing" from="0" to="2"  >

    rsing=hexg_irad/sin(pi/3.0)
    thesing=J_sing*pi*2.0/3.0
    <Create block="SHXT"  />
    <Placement y="rsing*sin(thesing)" x="rsing*cos(thesing)" z="0" block="SHXT" in="THXM"  >
    </Placement>

  </Do>

</Volume>

<!--                      === V o l u m e  S H X T ===                      -->
<Volume name="SHXT" comment="is one Single HeXagonal Tile"  >
  <Material name="Air"  />
  <Medium name="standard"  />
  <Attribute for="SHXT" seen="1" colo="6"  />
  <Shape type="PGON" phi1="0" rmn="{0,0}" rmx="{hexg_irad,hexg_irad}" nz="2" npdiv="6" dphi="360" zi="{-hexg_thick/2,hexg_thick/2}"  />

  actr = hexg_irad-hexg_clad

  <Create block="CLAD"  />
  <Placement y="0" x="0" z="0" block="CLAD" in="SHXT"  >
  </Placement>

  <Create block="BPOL"  />
  <Placement y="0" x="0" z="0" block="BPOL" in="SHXT"  >
  </Placement>

</Volume>

<!--                      === V o l u m e  C L A D ===                      -->
<Volume name="CLAD" comment="is one CLADding of BPOL active region"  >
  <Material name="ALKAP"  />
  <Attribute for="CLAD" seen="1" colo="3"  />
  <Shape type="PGON" phi1="0" rmn="{actr,actr}" rmx="{hexg_irad,hexg_irad}" nz="2" npdiv="6" dphi="360" zi="{-hexg_thick/2,hexg_thick/2}"  />

</Volume>

<!--                      === V o l u m e  B P O L ===                      -->
<Volume name="BPOL" comment="is one Bbc POLystyrene active scintillator layer"  >

  <Material name="POLYSTYREN"  />
  <!-- Reference the predefined material polystyrene -->

  <Material name="Cpolystyren" isvol="1"  />
  <!-- By specifying isvol="1", polystyrene is copied into a new material
      named Cpolystyren.  A new material is introduced here in order to
      force the creation of a new medium, which we change with the parameters
      below. -->

  <Attribute for="BPOL" seen="1" colo="4"  />
  <Shape type="PGON" phi1="0" rmn="{0,0}" rmx="{actr,actr}" nz="2" npdiv="6" dphi="360" zi="{-hexg_thick/2,hexg_thick/2}"  />

  <Par name="CUTGAM" value="0.00008"  />
  <Par name="CUTELE" value="0.001"  />
  <Par name="BCUTE"  value="0.0001"  />
  <Par name="CUTNEU" value="0.001"  />
  <Par name="CUTHAD" value="0.001"  />
  <Par name="CUTMUO" value="0.001"  />
  <Par name="BIRK1"  value="1.000"  />
  <Par name="BIRK2"  value="0.013"  />
  <Par name="BIRK3"  value="9.6E-6"  />
  <!--
   Parameters are the Geant3 parameters which may be set via a call to
   GSTPar.
   -->

  <Instrument block="BPOL">
    <Hit meas="tof"  nbits="16" opts="C" min="0" max="1.0E-6" />
    <Hit meas="birk" nbits="0"  opts="C" min="0" max="10"     />
  </Instrument>
  <!-- The Instrument block indicates what information should be saved
      for this volume, and how the information should be packed. -->

</Volume>

</Module>
</Document>

AgML Tutorials

Getting started developing geometries for the STAR experiment with AgML.

Setting up your local environment

You need to check out several directories and compile in this order:
$ cvs co StarVMC/Geometry
$ cvs co StarVMC/StarGeometry
$ cvs co StarVMC/xgeometry
$ cvs co pams/geometry
$ cons +StarVMC/Geometry
$ cons


This will take a while to compile, during which time you can get a cup of coffee, or do your laundry, etc...

If you only want to visualize the STAR detector, you can check out:

$ cvs co StarVMC/Geometry/macros

Once this is done you can visualize STAR geometries using the viewStarGeometry.C macro in AgML 1, and the loadAgML.C macro in AgML 2.0.

$ root.exe
root [0] .L StarVMC/Geometry/macros/viewStarGeometry.C
root [1] nocache=true
root [2] viewall=true
root [3] viewStarGeometry("y2012")
root [0] .L StarVMC/Geometry/macros/loadAgML.C
root [1] loadAgML("y2016")
root [2] TGeoVolume *cave = gGeoManager->FindVolumeFast("CAVE");
root [3] cave -> Draw("ogl");              // ogl uses open GL viewer


Tutorial #1 -- Creating and Placing Volumes

Start by firing up your favorite text editor... preferably something which does syntax highlighting and checking on XML documents.  Edit the first tutorial geometry located in StarVMC/Geometry/TutrGeo ...

$ emacs StarVMC/Geometry/TutrGeo/TutrGeo1.xml

This module illustrates how to create a new detector module, how to create and place a simple volume, and how to create and place multiple copies of that volume.  Next, we need to attach this module to a geometry model in order to visualize it.  Geometry models (or "tags") are defined in the StarGeo.xml file. 
 

$ emacs StarVMC/Geometry/StarGeo.xml

There is a simple geometry, which only defines the CAVE.  It's the first geometry tag called "black hole".  You can add your detector here...
 



$ root.exe

root [0] .L StarVMC/Geometry/macros/viewStarGeometry.C
root [1] nocache=true
root [2] viewStarGeometry("test","TutrGeo1");

The "test" geometry tag is a very simple geometry, implementing only the wide angle hall and the cave.  All detectors, beam pipes, magnets, etc... have been removed.  The second arguement to viewStarGeometry specifies which geometry module(s) are to be built and added to the test geometry.  In this case we add only TutrGeo1.  (A comma-separated list of geometry modules could be provided, if more than one geometry module was to be built).

Now you can try modifying TutrGeo1.  Feel free to add as many boxes in as many positions as you would like.  Once you have done this, recompile in two steps

$ cons +StarVMC/Geometry
$ cons

Tutorial #2 -- A few simple shapes, rotations and reflections

The second tutorial geometry is in StarVMC/Geometry/TutrGeo/TutrGeo2.xml.  Again, view it using viewStarGeometry.C

$ root.exe
root [0] .L viewStarGeometry.C
root [1] nocache=true
root [2] viewStarGeometry("test","TutrGeo2")

What does the nocache=true statement do?  It instructs viewStarGeometry.C to recreate the geometry, rather than load it from a ROOT file created the last time you ran the geometry.  By default, if the macro finds a file named "test.root", it will load the geometry from that file to save time.  You don't want this, since you know that you've changed the geometry. 

The second tutorial illustrates a couple more simple shapes:  cones and tubes.  It also illustrates how to create reflections.  Play around with the code a bit, recompile in the normal manner, then try viewing the geometry again.

Tutorial #3 -- Variables and Structures

AgML provides variables and structures.  The third tutorial is in StarVMC/Geometry/TutrGeo/TutrGeo3.xml.  Open this up in a text editor and let's look at it.   We define three variables: boxDX, boxDY and boxDZ to hold the dimensions of the box we want to create.  AgML is case-insensitive, so you can write these as boxdx, BoxDY and BOXDZ if you so choose.  In general, choose what looks best and helps you keep track of the code you're writing.

Next check out the volume "ABOX".  Note how the shape's dx, dy and dz arguments now reference the variables boxDX, boxDY and boxDZ.  This allows us to create multiple versions of the volume ABOX.  Let's view the geometry and see.

$ root.exe
root [0] .L StarVMC/Geometry/macros/viewStarGeometry.C
root [1] nocache=true
root [2] viewStarGeometry("test","TutrGeo3")

Launch a new TBrowser and open the "test" geometry.  Double-click test --> Master Volume --> CAVE --> TUTR.  You now see all of the concrete volumes which have been created by ROOT.  It should look like what you see at the right.  We have "ABOX", but we also have ABO1 and ABO2.  This demonstrates an important concept in AgML.  Each <Volume ...> block actually defines a volume "factory".  It allows you to create multiple versions of a volume, each differing by the shape of the volume.  When the shape is changed, a new volume is created with a nickname, where the last letter in the volume name is replaced by [1 2 3 ... 0 a b c ... z] (then the second-to-last letter, then the third...). 

Structures provide an alternate means to define variables.  In order to populate the members of a structure with values, you use the Fill statement.  Multiple fill statements for a given structure may be defined, providing multiple sets of values.  In order to select a given set of values, the <Use ...> operator is invoked.  In TutrGeo3, we create and place 5 different tubes, using the data stored in the Fill statements.

However, you might notice in the browser that there are only two concrete instances of the tube being created.  What is going on here?  This is another feature of AgML.  When the shape is changed, AgML will look for another concrete volume with exactly the same shape.  If it finds it, it will use that volume.  If it doesn't, then a new volume is created.

There's a lot going on in this tutorial, so play around a bit with it. 

 

Tutorial #4 -- Some more shapes

 

AgML vs AgSTAR Comparison

Abstract: We compare the AgML and AgSTAR descriptions of recent revisions of the STAR Y2005 through Y2011 geometry models.  We are specifically interested in the suitability of the AgML model for tracking.  We therefore plot the material contained in the TPC vs pseudorapidity for (a) all detectors, (b) the time projection chamber, and (c) the sensitive volumes of the time projection chamber.  We also plot (d) the material found in front of the TPC active volumes. 

Issues with PhmdGeo.xml
  • Two cases where the AgSTAR parser truncates a line in the phmdgeo.g file.  These should be fixed, but this needs to be verified.

Description of the Plots

Below you will find four columns of plots, for the highest revision of each geometry from y2005 to the present.  The columns from left to right show comparisons of the material budget for STAR and its daughter volumes, the material budget for the TPC and its immediate daughter volumes, the material budget for the active volumes in the TPC, and the material in front of the active volume of the TPC.  In the context of tracking, the right-most column is the most important.

Each column contains three plots.  The top plot shows the material budget in the AgML model.  The middle plot, the material budget in the AgSTAR model.  The bottom plot shows the difference divided by the AgSTAR model.  The y-axis on the difference plot extends between -2.5% and +2.5%.

 --------------------------------


Attached you will find a much more comprehensive set of plots, contained in a TAR file.  PDFs for every subsystem in each of the following geometry tags are provided.  They show the material budget comparing AgML to AgSTAR for every volume in the subsystem.  They also show a difference plot, equal to the difference divided by the average.  There is a color-coding scheme: a volume is coded green if the largest difference is below 1%, and red if it exceeds 1% over an extended range.  Yellow indicates a missing (mis-named) volume in one or the other geometry, and orange indicates a 1% difference over a small area (likely the result of round-off error in the alignments of the geometries).


 

 

STAR Y2011 Geometry Tag

Issues with TpceGeo3a.xml
  • TPA1 (tpc inner padrow) differences caused by overlap between TPA1 and a thin layer of G10 in the TPC. 
  • The padrow overlap is in the "prompt hits" region.  We do not (yet) use prompt hits in tracking.
Issues with PhmdGeo.xml
  • Two cases where the AgSTAR parser truncates a line in the phmdgeo.g file.  These should be fixed, but this needs to be verified.
(a) Material in STAR Detector and daughters (b) Material in TPC and daughters (c) Material in TPC active volumes (d) Material in front of TPC active volumes

STAR Y2010c Geometry Tag

Issues with TpceGeo3a.xml
  • TPA1 (tpc inner padrow) differences caused by overlap between TPA1 and a thin layer of G10 in the TPC. 
  • The padrow overlap is in the "prompt hits" region.  We do not (yet) use prompt hits in tracking.
Issues with PhmdGeo.xml
  • Two cases where the AgSTAR parser truncates a line in the phmdgeo.g file.  These should be fixed, but this needs to be verified.

 


(a) Material in STAR Detector and daughters (b) Material in TPC and daughters (c) Material in TPC active volumes (d) Material in front of TPC active volumes

STAR Y2009c Geometry Tag

Issues with TpceGeo3a.xml
  • TPA1 (tpc inner padrow) differences caused by overlap between TPA1 and a thin layer of G10 in the TPC. 
  • The padrow overlap is in the "prompt hits" region.  We do not (yet) use prompt hits in tracking.
Issues with PhmdGeo.xml
  • Two cases where the AgSTAR parser truncates a line in the phmdgeo.g file.  These should be fixed, but this needs to be verified.
(a) Material in STAR Detector and daughters (b) Material in TPC and daughters (c) Material in TPC active volumes (d) Material in front of TPC active volumes

STAR Y2008e Geometry Tag

Global Issues
  • Upstream areas not included in AgML steering routine.
Issues with TpceGeo3a.xml
  • TPA1 (tpc inner padrow) differences caused by overlap between TPA1 and a thin layer of G10 in the TPC. 
  • The padrow overlap is in the "prompt hits" region.  We do not (yet) use prompt hits in tracking.
Issues with PhmdGeo.xml
  • Two cases where the AgSTAR parser truncates a line in the phmdgeo.g file.  These should be fixed, but this needs to be verified.
(a) Material in STAR Detector and daughters (b) Material in TPC and daughters (c) Material in TPC active volumes (d) Material in front of TPC active volumes

STAR Y2007h Geometry Tag

Global Issues
  • Upstream areas not included in AgML steering routine.
Issues with TpceGeo3a.xml
  • TPA1 (tpc inner padrow) differences caused by overlap between TPA1 and a thin layer of G10 in the TPC. 
  • The padrow overlap is in the "prompt hits" region.  We do not (yet) use prompt hits in tracking.
Issues with PhmdGeo.xml
  • Two cases where the AgSTAR parser truncates a line in the phmdgeo.g file.  These should be fixed, but this needs to be verified.

Issues with SVT.

(a) Material in STAR Detector and daughters (b) Material in TPC and daughters (c) Material in TPC active volumes (d) Material in front of TPC active volumes

STAR Y2006g Geometry Tag

Global Issues
  • Upstream areas not included in AgML steering routine.

Note: TpceGeo2.xml does not suffer from the overlap issue in TpceGeo3a.xml

(a) Material in STAR Detector and daughters (b) Material in TPC and daughters (c) Material in TPC active volumes (d) Material in front of TPC active volumes

STAR Y2005i Geometry Tag

Global Issues
  • Upstream areas not included in AgML steering routine.
Issues with TpceGeo3a.xml
  • TPA1 (tpc inner padrow) differences caused by overlap between TPA1 and a thin layer of G10 in the TPC. 
  • The padrow overlap is in the "prompt hits" region.  We do not (yet) use prompt hits in tracking.
Issues with PhmdGeo.xml
  • Two cases where the AgSTAR parser truncates a line in the phmdgeo.g file.  These should be fixed, but this needs to be verified.

 

(a) Material in STAR Detector and daughters (b) Material in TPC and daughters (c) Material in TPC active volumes (d) Material in front of TPC active volumes

AgML vs AgSTAR tracking comparison

Attached is a comparison of track reconstruction using the Sti tracker, with AgI and AgML geometries as input.

Interfacing the New Detector with the STAR Big Full Chain

As STAR gradually comes to the end of its AA heavy-ion program and more focus is placed on polarized pp/pA physics and the future ep/eA project in the eRHIC era, many upgrades are foreseen to strengthen the detector capability in the forward region. These include both near-term upgrades for the polarized pp program, e.g. the FMS/FSC/FHC calorimeters and FGT/VFGT tracking, and upgrades for eSTAR in about 5 to 10 years. Different detector concepts exist, and optimization is needed to fit them into the STAR physics program and the current STAR detector system. To reach a proper solution, a lot of Monte Carlo (MC) work will be carried out, especially in the STAR simulation framework, chosen for its flexibility, robustness and proven performance over the last decade.

 
During the last 9 months, I have worked with colleagues at BNL on developing a new detector concept for the eSTAR east-side endcap upgrade, to reliably identify the recoil electrons in ep/eA collisions. This detector design consists of several parts, and its functionality needs to be evaluated in the STAR simulation framework. This procedure, which includes implementing a new detector in the STAR system and then generating/analyzing the MC data, requires quite a bit of effort and collaboration with software experts, at least for a “rookie” (like me, who usually analyzes data without much code development). To better organize my own understanding of this, and to provide a guide for people who will do similar jobs in the future, I have written this note based on my experience. Many software experts helped me greatly in dealing with all kinds of problems I met in this work. I hope this guide can also relieve their burden, to some extent, of being frequently interrupted by new code developers like me (remember, they are already over-occupied maintaining the STAR software environment). Since I’m still not a veteran, this simple note will not contain all the pieces, only the necessary parts, and is likely most suitable for beginners.
 
- Ming Shao (USTC)
 
 
Assume you will work on RCF (RACF), because this is the best-maintained place for computing work at STAR. Normally you should work in the evaluation version, since you'll add and tune new detector models which are not part of the current STAR experiment. So please remember to type ‘starver eval’ before you start. You should also create a new directory for this work.
 
> mkdir [your work path]
> cd [your work path]
> starver eval
 
First you should get the STAR detector geometry description, and then add or modify your new detector. STAR has switched to a new XML-based language (Extensible Markup Language), called AgML, to describe its detector system. This is done by
 
cvs co StarVMC/Geometry
cvs co StarVMC/StarGeometry
cvs co StarVMC/xgeometry
 
Then you should create a new directory in StarVMC/Geometry, like the other detectors have, with a name representing your new detector (usually ‘XXXXGeo’). For example, I added a new detector geometry named ‘EiddGeo’ to this directory, which is intended to identify electrons (Electron ID Detector). In this directory (XXXXGeo), you can create and further modify your new detector geometry in the AgML language. You can find an example in StarVMC/Geometry/BbcmGeo/BbcmGeo.xml, which contains a very detailed in-line explanation of the language (reproduced above). You can copy and modify it for your new detector, following its style. More details about AgML can be found at Jason’s webpage (Jason Webb is the AgML expert).
 
After you finish the modeling of your new detector, try to compile it by typing ‘cons’ TWICE in your base work directory (the directory containing the StarVMC directory).
 
> cons
> cons
 
Debug your code if compilation fails, until it succeeds all the way through to creating the geometry libraries in the .sl53_gcc432 directory. Then you can check whether the geometry of the new detector satisfies your expectations by plotting it. This can be done by modifying the macros in StarVMC/Geometry/macros and executing them in ROOT.
 
> root.exe (in the base work directory)
> .L StarVMC/Geometry/macros/viewStarGeometry.C
> viewStarGeometry("XXXX")
 
XXXX is a geometry tag. An example is shown on the right. The geometry of a new detector - EIDD - is plotted along with the STAR TPC. Other detectors and the magnet system are omitted.
 
Jason can provide help with the related macro modifications. Note: Jason must be made aware of your work so that he can modify the geometry tag or create a new one for you.
 
Once the new detector geometry is fine, you may want to run some simulation events, based on GEANT. Before you can really do MC simulation via STARSIM (formerly known as GSTAR), you should make sure you have instrumented sensitive detectors and hit digitization in your detector geometry. Then you can start STARSIM by typing ‘starsim’ in your work directory. In STARSIM you can execute a kumac macro to generate some MC events. A kumac is a PAW-based macro; for a simple test run (e.g. with file name ‘testrun.kumac’) it may look like this:
 
MACRO testrun
  DETP geom devE                          | devE is a tag for eSTAR simulation
  GEXE .$STAR_HOST_SYS/lib/xgeometry.so   | use local geometry
  GCLO all
  MODE All SIMU 1                         | 0: no secondary; 1: default secondary; 2: save all secondaries
  GKIN 1 6 2.0 2.0 -1.5 -1.5
  GFILE o test1.fzd
  TRIG 1
RETURN
 
This macro will generate 1 negative muon from the center of the STAR detector with a transverse momentum of 2 GeV/c, at pseudorapidity -1.5. The azimuthal angle is chosen randomly in the range from 0 to 360 degrees. The simulated event, with all hits created in the detector system, is then saved into the file ‘test1.fzd’. Before you exit STARSIM, you can print out information from this simulation to check your new detector. For example, you can print all hits created in the detector system by
 
> gprin hits
 
The STARSIM manual can be found at this URL. The built-in help command in STARSIM can also help with some details: just type ‘help’ in STARSIM and follow the instructions. Sections 14 (GEANT-related commands), 15 (GSTAR user commands) and 16 (advanced GEANT user interface) of the help are especially important to read.
 
If you find the generated hits reasonable and want to go forward, you need to check out and modify the necessary code in the 'pams' directory, which contains the code that transfers GEANT hits from STARSIM into STAR-compatible hit types. From your work directory, do
 
> cvs co pams
 
The code contained in 'pams/sim' is especially useful for simulation purposes. Several files at different locations are then to be added or changed. You need to create your own hit type in the pams/sim/idl directory, where all the hit types are defined. The file name is usually g2t_XXX_hit.idl (XXX is a three-character name representing your detector). If your hit type is similar to one that already exists, you can also just use that hit type (so no new hit type is needed). For example, a new hit type ‘g2t_etr_hit.idl’ was created when I added a new detector (EiddGeo) to STAR.
 
struct g2t_etr_hit {          /* G2t_etr_hit */
       long   id;             /* primary key */
       long   next_tr_hit_p;  /* Id of next hit on same track */
       long   track_p;        /* Id of parent track */
       long   volume_id;      /* STAR volume identification */
       float  de;             /* energy deposition at hit */
       float  ds;             /* path length within padrow */
       float  p[3];           /* local momentum */
       float  tof;            /* time of flight */
       float  x[3];           /* coordinate (Cartesian) */
       float  lgam;           /* ALOG10(GEKin/AMass) */
       float  length;         /* track length up to this hit */
       float  adc;            /* signal in ADC after digitization */
       float  pad;            /* hit pad position used in digitization */
       float  timebucket;     /* hit time position -"- */
};
 
This hit type is basically the same as the TPC hit type, since their functionality is similar. If you want this hit to be associated with a track, you also need to modify g2t_track.idl in the same directory. Just add two lines to the g2t_track struct:
 
long      hit_XXX_p; /* Id of first XXX hit on track linked list */
and
long      n_XXX_hit; /* Nhits on XXX */
 
Several other files that need to be changed are located in another directory, pams/sim/g2t. These files are g2t_XXX.idl, g2t_XXX.F and g2t_volume_id.g (XXX is a three-character name representing your detector). g2t_XXX.idl connects g2t_track and your hit type (g2t_XXX_hit.idl) to your detector, and g2t_XXX.F implements the actual function. You can refer to the code of detectors with similar functionality when writing your own. For example, g2t_tpc.F is for a tracking-type detector with energy loss along the track, g2t_tof.F is for timing, and g2t_emc.F deals with the properties of calorimeter-type detectors. You should basically follow the style of these example files and just make the necessary changes (mostly detector names) for your detector. For g2t_XXX.idl, an example is
 
#include "PAM.idl"
#include "g2t_track.idl"
#include "g2t_XXX_hit.idl"
interface g2t_XXX : amiModule{ STAFCV_T call (inout g2t_track g2t_track,
                                                                                     out g2t_XXX_hit g2t_XXX_hit ); };
 
For g2t_XXX.F, there is one important line in this file, ‘call G2R_GET_SYS ('XXXX','YYYY',Iprin,Idigi)’, where XXXX is your detector name as it appears in your geometry description ‘XXXXGeo.xml’, and YYYY is the sensitive volume name in your geometry. For example, it is ‘call G2R_GET_SYS ('EIDD','TABD',Iprin,Idigi)’ in g2t_etr.F, since the corresponding sensitive volume is ‘TABD’ in ‘EiddGeo.xml’.
 
The g2t_volume_id.g file must also be modified to identify the new detector's sensitive volumes in an unambiguous way. You need to provide a unique volume_id for each sensitive volume, based on the hit volume id in GEANT, contained in an array numbv. In GEANT, a touchable volume can be uniquely found from its volume hierarchy. This hierarchy is compactly stored in the array numbv; unnecessary parts of the hierarchy are omitted, provided the sensitive volume can still be located unambiguously. Assume a volume hierarchy in which A contains B, B contains C, and C contains D1 and D2, both of which are sensitive. Then numbv only stores the volume id of D1 or D2, i.e. only one number, since A, B and C are the same for D1/D2. However, if B contains another sensitive volume D3 (parallel to C), numbv will contain two numbers for a hit, so that D1/D2/D3 can be uniquely identified. You should use the numbers in numbv to form a final volume_id value, e.g.
 
elseif (Csys=='etr') then
   sector    = MOD( (numbv(1)-1), 12 );   "Sectors count from 0 - 11"
   layer     =      (numbv(1)-1)/ 12;     "Layers count from 0 - 2"
   section   = numbv(2) - 1;              "Sections count from 0 - 29"
   volume_id = section + 100*layer + 10000*sector
 
Please note that numbv array elements start from 1, not 0. Refer to the other volume_id’s in the g2t_volume_id.g file.
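To make the packing concrete, here is a small C++ check that inverts the etr encoding above (the example value is illustrative):

// Decode the etr volume_id packed as
//   volume_id = section + 100*layer + 10000*sector   (see above)
int volume_id = 10203;                     // example: sector 1, layer 2, section 3
int sector    =  volume_id / 10000;        // sectors count from 0 - 11
int layer     = (volume_id % 10000) / 100; // layers count from 0 - 2
int section   =  volume_id % 100;          // sections count from 0 - 29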
 
When these modifications are successfully done, you need to re-compile once again by simply typing
 
> cons
 
in your work directory. WARNING: when you compile pams with your new detector items for the first time, you’re likely to get errors like “.sl53_gcc432/obj/pams/sim/g2t/St_g2t_XXX_Module.cxx:2:31: error: St_g2t_XXX_Module.h: No such file or directory” (XXX is your new detector). If you try “cons” again, the errors may disappear and the compilation appears to finish successfully. However, there might be hidden problems in the compiled code, which can sometimes cause failures at execution time. So here is a trick: when you meet such errors, clean your previous compilation first, then do the following in the correct order.
 
> cons +pams/sim
> cons +pams
> cons +StarVMC/Geometry
> cons
 
In this way you can get correctly compiled code.
  
From now on, when you generate your simulation events in STARSIM, the correct type of hits in your new detector will be saved in the GEANT output file. However, the GEANT file is not plainly readable. The code in the STAR software framework that reads these data is St_geant_Maker, which is contained in the StRoot module. You should check it out from the repository:
 
> cvs co StRoot/St_geant_Maker
 
The major file you need to modify is St_geant_Maker.cxx. Its header file St_geant_Maker.h can often be left as it is. The following changes are necessary.
 
Add a line ‘#include "g2t/St_g2t_XXX_Module.h"’ to the header part of the file (XXX represents your detector abbreviation, as defined in pams/sim/g2t). You needn’t worry about this header file, since it is automatically generated when you compile pams.
 
Add a piece of code to read the hits from your detector into the STAR TDataSet. An example adding ‘etr’ hits is shown below.
 
nhits = 0;
geant3->Gfnhit("EIDH","TABD",nhits);
if ( nhits > 0 )
{
    St_g2t_etr_hit *g2t_etr_hit = new St_g2t_etr_hit("g2t_etr_hit",nhits);
    m_DataSet->Add(g2t_etr_hit);
    iRes = g2t_etr( g2t_track, g2t_etr_hit );
    if ( Debug() > 1 ) g2t_etr_hit->Print(0,10);
}
 
Just replace ‘etr’ with your hit type. Attention must be paid to the line ‘geant3->Gfnhit("EIDH","TABD",nhits)’. Here EIDH and TABD represent the names of the detector and sensitive volume. However, the actual detector name EIDD is changed to EIDH. This is the protocol: replace the last character of the detector name with ‘H’ (this also means the first 3 characters of your detector name should differ from those of all other detectors, to avoid ambiguity).
 
The data retrieved from the GEANT files should now be written out for further processing. As a first step, the simulation events are usually stored in StMcEvent, a STAR class dedicated to recording MC events. You should start by checking out this class to your work directory.
 
> cvs co StRoot/StMcEvent
> cvs co StRoot/StMcEventMaker
 
The second class, ‘StMcEventMaker’, as its name suggests, is intended to write out all necessary information from the GEANT simulation to StMcEvent.
 
There are several classes in the directory StRoot/StMcEvent for you to add and modify. The first two classes are your detector hit definition class and collection class, usually with names like StMcXXXHit and StMcXXXHitCollection, where XXX represents your detector. You can refer to other classes in StRoot/StMcEvent with similar function to your detector, or even just use an existing one if you feel it already contains all the information you need. For my part, I created 4 new classes (StMcEtrHit, StMcEtrHitCollection, StMcEtfHit and StMcEtfHitCollection) and at the same time used 2 existing classes (StMcCalorimeterHit and StMcEmcHitCollection) to implement my EIDD detector.
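For orientation, a minimal sketch of such a hit-class/collection pair is shown below. All XXX names are the placeholders of this section, StMcHit is assumed to provide a default constructor, and real classes such as StMcTpcHit and its collection are the authoritative pattern.

// Hypothetical MC hit and collection classes for a new detector "XXX".
#include "StMcEvent/StMcHit.hh"
#include <vector>

class StMcXXXHit : public StMcHit {
public:
    StMcXXXHit() : mVolumeId(0) {}
    long volumeId() const     { return mVolumeId; }   // packed volume id from g2t
    void setVolumeId(long id) { mVolumeId = id; }
private:
    long mVolumeId;
};

class StMcXXXHitCollection {
public:
    bool addHit(StMcXXXHit* hit)
    {
        if (!hit) return false;
        mHits.push_back(hit);
        return true;
    }
    unsigned long numberOfHits() const { return mHits.size(); }
    std::vector<StMcXXXHit*>& hits()   { return mHits; }
private:
    std::vector<StMcXXXHit*> mHits;   // owns the hits; delete them in a real destructor
};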
 
Then you should add your hit collection to the StMcEvent class. In the StMcEvent.hh file, add a line
 
class StMcXXXHitCollection;
 
to the header part of this file. Then add your hit collection as a protected member of the StMcEvent class,
 
StMcXXXHitCollection* mXXXHits;
 
and the corresponding ‘Get’ and ‘Set’ methods as public members, respectively:
 
StMcXXXHitCollection* XXXHitCollection() { return mXXXHits; } 
const StMcXXXHitCollection* XXXHitCollection() const { return mXXXHits; } 
void setXXXHitCollection(StMcXXXHitCollection*);
 
Again, here XXX stands for your new detector name. Next, in the StMcEvent.cc file, add the include files
 
#include "StMcXXXHitCollection.hh"
#include "StMcXXXHit.hh"
 
to the header part. Add an initialization call
 
mXXXHits = new StMcXXXHitCollection();
 
in the function ‘void StMcEvent::makeColls()’. Implement the ‘Set’ method declared in the StMcEvent.hh file:
 
void StMcEvent::setXXXHitCollection(StMcXXXHitCollection* val) 
{
    if (mXXXHits && mXXXHits!= val) delete mXXXHits;  
    mXXXHits = val;
}
 
You may also want to print out your hits in some cases, to check whether they are OK. So in the
 
void StMcEvent::Print(Option_t *option) const
 
function, you need a line or more to do this job. Usually it looks like
 
PrintHeader(Name,name);
PrintHitCollection(Name,name);
 
where PrintHeader and PrintHitCollection are C++ macros defined in the StMcEvent.cc file, and Name/name represent your detector name. There is more than one such macro, so you can choose the one that best suits your case.
The detector hits caused by charged particles are usually related to a track class. For simulation events, this is StMcTrack. You may also want to modify this class to include your detector hits. Similar to the StMcEvent class, you need to add your hit member in the StMcTrack.hh file, as well as ‘Get’, ‘Set’, ‘Add’ and ‘Remove’ methods.
 
StPtrVecMcXXXHit  mXXXHits;
StPtrVecMcXXXHit& XXXHits() { return mXXXHits; }
const StPtrVecMcXXXHit& XXXHits() const { return mXXXHits; }
void setXXXHits(StPtrVecMcXXXHit&);
void addXXXHit(StMcXXXHit*);
void removeXXXHit(StMcXXXHit*);
 
Then implement them in the StMcTrack.cc file. This is quite straightforward: just refer to another detector hit type in this file to see how to do it (a sketch is given below).
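A sketch of those implementations, following the pattern used for the other hit types in StMcTrack.cc:

#include <algorithm>   // for std::find

void StMcTrack::setXXXHits(StPtrVecMcXXXHit& val) { mXXXHits = val; }

void StMcTrack::addXXXHit(StMcXXXHit* hit) { mXXXHits.push_back(hit); }

void StMcTrack::removeXXXHit(StMcXXXHit* hit)
{
    // erase the first matching pointer from the container
    StMcXXXHitIterator iter = std::find(mXXXHits.begin(), mXXXHits.end(), hit);
    if (iter != mXXXHits.end()) mXXXHits.erase(iter);
}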
Besides, don’t forget to clear the hits in the destructor StMcTrack::~StMcTrack() by adding a line
 
mXXXHits.clear();
 
Other methods you may want to implement are:
 
ostream& operator<<(ostream& os, const StMcTrack& t),
void StMcTrack::Print(Option_t *option) const
const StPtrVecMcHit *StMcTrack::Hits(StDetectorId Id) const
const StPtrVecMcCalorimeterHit *StMcTrack::CalorimeterHits(StDetectorId Id) const
 
None of them is difficult.
You may notice that a vector class type is used in the code above which has not been declared yet: StPtrVecMcXXXHit. This is done in another header, StMcContainer.hh. You should add several lines to this file, in the following order.
 
class StMcXXXHit;
typedef vector<StMcXXXHit*> StSPtrVecMcXXXHit;
typedef vector<StMcXXXHit*> StPtrVecMcXXXHit;
typedef StPtrVecMcXXXHit::iterator StMcXXXHitIterator;
typedef StPtrVecMcXXXHit::const_iterator StMcXXXHitConstIterator;
 
Two other files containing necessary definitions and links that you should modify are StMcEventTypes.hh and StMcEventLinkDef.h. In StMcEventTypes.hh, add two lines:
 
#include "StMcXXXHit.hh"
#include "StMcXXXHitCollection.hh"
 
In StMcEventLinkDef.h, the following lines are to be added.
 
#pragma link C++ function operator<<(ostream&, const StMcXXXHit&);
#pragma link C++ typedef StSPtrVecMcXXXHit;
#pragma link C++ typedef StPtrVecMcXXXHit;
#pragma link C++ typedef StMcXXXHitIterator;
#pragma link C++ typedef StMcXXXHitConstIterator;
#pragma link C++ class vector<StMcXXXHit*>+;
 
Now the basic structure for adding your new detector hits into the StMcEvent class is complete. It’s time to modify the StMcEventMaker class for your new detector. In the header file you need to add a Boolean member
 
Bool_t doUseXXX;              //!  
 
If you are adding a new detector of calorimeter type, you’re likely to add a method
 
void fillXXX(St_g2t_emc_hit*);  (if you just use emc hit type)
or   
void fillXXX(St_g2t_XXX_hit*);  (if you use your own calorimeter hit type)
 
Then in StMcEventMaker.cxx the following places are to be changed.
Initialize doUseXXX to kTRUE in the class constructor.
In the StMcEventMaker::Make() function, add
 
St_g2t_YYY_hit *g2t_XXX_hitTablePointer = (St_g2t_YYY_hit *) geantDstI("g2t_XXX_hit");
 
Pay attention to the type St_g2t_YYY_hit. Here YYY is the hit type (g2t_YYY_hit.idl) you used in pams/sim/g2t_XXX.idl. YYY is not necessarily the same as XXX, since you can use an existing hit type (YYY) for your new detector (XXX).
 
Then retrieve the hits by
 
// XXX Hit Tables
g2t_YYY_hit_st *XXXHitTable = 0;
if (g2t_XXX_hitTablePointer) {
    XXXHitTable = g2t_XXX_hitTablePointer->GetTable();
    if (Debug()) cerr << "Table g2t_XXX_hit found in Dataset " << geantDstI.Pwd()->GetName() << endl;
}
else {
    if (Debug()) cerr << "Table g2t_XXX_hit Not found in Dataset " << geantDstI.Pwd()->GetName() << endl;
}
 
Then fill the hits, either with the AddHits(XXX,XXX,XXX) macro, or with your own method fillXXX(St_g2t_emc_hit*) or fillXXX(St_g2t_XXX_hit*). Read the methods carefully if you use the existing StMcCalorimeterHit hit type.
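A sketch of such a fillXXX() method is shown below, for the case where you defined your own hit type. The member mCurrentMcEvent and the XXX/YYY names are the placeholders used throughout this section; consult StMcEventMaker.cxx for the real member names and the track-association code.

// Hypothetical sketch of a fillXXX() method.
void StMcEventMaker::fillXXX(St_g2t_YYY_hit* hitTablePointer)
{
    if (!hitTablePointer) return;
    g2t_yyy_hit_st* hitTable = hitTablePointer->GetTable();
    for (long i = 0; i < hitTablePointer->GetNRows(); i++) {
        StMcXXXHit* h = new StMcXXXHit();                 // build the MC hit
        h->setVolumeId(hitTable[i].volume_id);            // copy fields from the g2t row
        mCurrentMcEvent->XXXHitCollection()->addHit(h);   // store it in StMcEvent
        // Look up the parent track from hitTable[i].track_p (the g2t primary
        // key) and call track->addXXXHit(h), as is done for the other detectors.
    }
}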
Now you can try to compile StRoot by
 
> cons +StRoot
 
After all the code above compiles successfully, you can proceed to run your GEANT data through the BFC chain to generate STAR data, such as a .McEvent.root file. The BFC options to choose depend on which detectors you want to include. A simple example option string is
 
Debug,devE,agml,McEvent,NoSvtIt,NoSsdIt,Idst,Tree,logger,genvtx,tags,IdTruth,geantout,big,fzin,McEvOut
 
However, if you need a full simulation of the STAR TPC, you must add many more options, such as tpcrs and the related database options.
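For example, to reconstruct 100 events from a previously generated fzd file with the option string above (the input file name here is only illustrative):
 
> root4star -b -q bfc.C\(100,"Debug,devE,agml,McEvent,NoSvtIt,NoSsdIt,Idst,Tree,logger,genvtx,tags,IdTruth,geantout,big,fzin,McEvOut","mysimu.fzd"\)
 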
 
Further processing of McEvent is similar to any other MC-data-based analysis, but with your own detector's features added. One example you may refer to is StMiniMcMaker; you can check it out from the STAR class library and modify it to suit your work. As an example, I plot the hit points generated by the TRD (in the EIDD) and the TPC for a MC negative muon track at fixed momentum and direction; it is shown on the right.
 
All of the work described above is based on simulation. If you want to go further and implement the “real” data type, StEvent, you need to add or change classes in StRoot/StEvent. StEvent data should generally be based on a real experiment, such as a beam test of your new detector prototype. Still, there may be a need for this, since the functionality of some STAR classes relies on StEvent rather than StMcEvent.
 
Similar to what you did for StMcEvent, you need to add StXXXHit and StXXXHitCollection classes and attach them to StEvent and other relevant classes. Auxiliary classes such as StContainers, StEnumerations and StDetectorDefinitions need modification as well. All of these classes are contained in the StRoot/StEvent directory.
 
You also need to add your own detector maker under the StRoot directory. A recent example is StEtrFastSimMaker, a simple maker that adds the endcap TRD hits to StEvent for further processing. It is a fast simulation maker, since more realistic makers should be based on experimental data. Victor added this maker just to test Stv track finding with the endcap TRD - more complicated work, for experts only.
 

 

List of Default AgML Materials

List of default AgML materials and mixtures.  To get a complete list of all materials defined in a geometry, execute AgMaterial::List() in ROOT once the geometry has been created.
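For example, in an interactive root4star session (the geometry-building macro named here is purely illustrative; use whatever macro you normally create the geometry with):
 
root [0] .x myBuildGeometry.C("y2009c")   // illustrative: any macro that instantiates the geometry
root [1] AgMaterial::List()               // prints all materials known to the current geometry
 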

[-]             Hydrogen:  a=     1.01 z=        1 dens=    0.071 radl=      865 absl=      790 isvol= <unset>  nelem=        1
[-]            Deuterium:  a=     2.01 z=        1 dens=    0.162 radl=      757 absl=      342 isvol= <unset>  nelem=        1
[-]               Helium:  a=        4 z=        2 dens=    0.125 radl=      755 absl=      478 isvol= <unset>  nelem=        1
[-]              Lithium:  a=     6.94 z=        3 dens=    0.534 radl=      155 absl=      121 isvol= <unset>  nelem=        1
[-]            Berillium:  a=     9.01 z=        4 dens=    1.848 radl=     35.3 absl=     36.7 isvol= <unset>  nelem=        1
[-]               Carbon:  a=    12.01 z=        6 dens=    2.265 radl=     18.8 absl=     49.9 isvol= <unset>  nelem=        1
[-]             Nitrogen:  a=    14.01 z=        7 dens=    0.808 radl=     44.5 absl=     99.4 isvol= <unset>  nelem=        1
[-]                 Neon:  a=    20.18 z=       10 dens=    1.207 radl=       24 absl=     74.9 isvol= <unset>  nelem=        1
[-]            Aluminium:  a=    26.98 z=       13 dens=      2.7 radl=      8.9 absl=     37.2 isvol= <unset>  nelem=        1
[-]                 Iron:  a=    55.85 z=       26 dens=     7.87 radl=     1.76 absl=     17.1 isvol= <unset>  nelem=        1
[-]               Copper:  a=    63.54 z=       29 dens=     8.96 radl=     1.43 absl=     14.8 isvol= <unset>  nelem=        1
[-]             Tungsten:  a=   183.85 z=       74 dens=     19.3 radl=     0.35 absl=     10.3 isvol= <unset>  nelem=        1
[-]                 Lead:  a=   207.19 z=       82 dens=    11.35 radl=     0.56 absl=     18.5 isvol= <unset>  nelem=        1
[-]              Uranium:  a=   238.03 z=       92 dens=    18.95 radl=     0.32 absl=       12 isvol= <unset>  nelem=        1
[-]                  Air:  a=    14.61 z=      7.3 dens= 0.001205 radl=    30400 absl=    67500 isvol= <unset>  nelem=        1
[-]               Vacuum:  a=    14.61 z=      7.3 dens=    1e-06 radl= 3.04e+07 absl= 6.75e+07 isvol= <unset>  nelem=        1
[-]              Silicon:  a=    28.09 z=       14 dens=     2.33 radl=     9.36 absl=     45.5 isvol= <unset>  nelem=        1
[-]            Argon_gas:  a=    39.95 z=       18 dens=    0.002 radl=    11800 absl=    70700 isvol= <unset>  nelem=        1
[-]         Nitrogen_gas:  a=    14.01 z=        7 dens=    0.001 radl=    32600 absl=    75400 isvol= <unset>  nelem=        1
[-]           Oxygen_gas:  a=       16 z=        8 dens=    0.001 radl=    23900 absl=    67500 isvol= <unset>  nelem=        1
[-]           Polystyren:  a=   11.153 z=    5.615 dens=    1.032 radl= <unset>  absl= <unset>  isvol= <unset>  nelem=        2
                                                                                           A           Z         W
                                                                                    C   12.000      6.000     0.923
                                                                                    H    1.000      1.000     0.077
[-]         Polyethylene:  a=   10.427 z=    5.285 dens=     0.93 radl= <unset>  absl= <unset>  isvol= <unset>  nelem=        2
                                                                                           A           Z         W
                                                                                    C   12.000      6.000     0.857
                                                                                    H    1.000      1.000     0.143
[-]                Mylar:  a=    12.87 z=    6.456 dens=     1.39 radl= <unset>  absl= <unset>  isvol= <unset>  nelem=        3
                                                                                           A           Z         W
                                                                                    C   12.000      6.000     0.625
                                                                                    H    1.000      1.000     0.042
                                                                                    O   16.000      8.000     0.333

Production Geometry Tags

This page was merged with STAR Geometry in simulation & reconstruction and is maintained by STAR's librarian.

 

 

Attic

Retired simulation pages are kept here.

Action Items

Immediate action items:

  •  Y2008 tag
    • find out about the status of the FTPC (can't locate the relevant e-mail now)
    • find out about the status of PMD in 2008 (open/closed)
    • ask Akio about possible updates of the FMS code, get the final version
    • based on Dave's records, add a small amount of material to the beampipe
    • review the tech drawings from Bill and Will and others and start coding the support structure
    • extract information from TOF people about the likely configuration
    • when ready, produce the material profile plots for Y2008 in slices in Z
  • TUP tags
    • work with Jim Thomas, Gerrit and primarily Spiros on the definition of geometry for the next TUP wave
    • coordinate with Spiros, Jim and Yuri a possible repass of the recent TUP MC data without the IST
  • Older tags
    • check the more recent correction to the SVT code (carbon instead of Be used in the water channels)
    • provide code for the correction for 3 layers of mylar on the beampipe as referred to above in Y2008
    • check with Dave about the dimensions of the water channels (likely incorrect in GEANT)
    • determine which years we will choose to retrofit with improved SVT (ask STAR members)
  • MTD
    • Establish a new UPGRXX tag for the MTD simulation
    • supervise and help Lijuan in extending the field map
    • provide facility for reading a separate map in starsim and root4star (with Yuri)
  • Misc
    • collect feedback on possible simulation plans for the fall'07
    • revisit the codes for event pre-selection ("hooks")
    • revisit the event mixing scripts
  • Development
    • create a schema to store MC run catalog data with a view to automate job definition (Michael has promised help)

 

Beampipe support geometry and other news

Documentation for the beampipe support geometry description development

After the completion of the 2007 run, the SVT and the SSD were removed from the STAR detector along with their utility lines. The support structure for the beampipe remained, however.

The following drawings describe the structure of the beampipe support as it exists in late 2007 and probably throughout 2008.

Further corrections to the SVT geometry model
 
In the course of the recent discussion of the beampipe support and shield material, Dave Lynn found that even though, according to the plans, the material of the cooling water channels in the SVT was specified as Be, in reality a carbon composite material was used for that purpose. Below are material vs. pseudorapidity plots for the "old" and "new" codes.
 
 
 
It can be seen that the difference is of the order of 0.4% of a radiation length on top of the existing (roughly) 6%. This is sufficient grounds for cutting a new version of the geometry, and we shall create a geometry tag Y2007A to reflect this change.
 

Datasets

Here we present information about our datasets.

2005

Statistics are listed in thousands of events.

Name     Description                                    Statistics  Status    HPSS  Comment
rcf1259  Herwig 6.507, Y2004Y                           225         Finished  Yes   7<Pt<9 GeV
rcf1258  Herwig 6.507, Y2004Y                           248         Finished  Yes   5<Pt<7 GeV
rcf1257  Herwig 6.507, Y2004Y                           367         Finished  Yes   4<Pt<5 GeV
rcf1256  Herwig 6.507, Y2004Y                           424         Finished  Yes   3<Pt<4 GeV
rcf1255  Herwig 6.507, Y2004Y                           407         Finished  Yes   2<Pt<3 GeV
rcf1254  Herwig 6.507, Y2004Y                           225         Finished  Yes   35<Pt<100 GeV
rcf1253  Herwig 6.507, Y2004Y                           263         Finished  Yes   25<Pt<35 GeV
rcf1252  Herwig 6.507, Y2004Y                           263         Finished  Yes   15<Pt<25 GeV
rcf1251  Herwig 6.507, Y2004Y                           225         Finished  Yes   11<Pt<15 GeV
rcf1250  Herwig 6.507, Y2004Y                           300         Finished  Yes   9<Pt<11 GeV
rcf1249  Hijing 1.382 AuAu 200 GeV minbias, 0<b<20 fm   24          Finished  Yes   Tracking, new SVT geo, diamond: 60, +-30 cm, Y2005D
rcf1248  Herwig 6.507, Y2004Y                           15          Finished  Yes   35<Pt<45 GeV
rcf1247  Herwig 6.507, Y2004Y                           25          Finished  Yes   25<Pt<35 GeV
rcf1246  Herwig 6.507, Y2004Y                           50          Finished  Yes   15<Pt<25 GeV
rcf1245  Herwig 6.507, Y2004Y                           100         Finished  Yes   11<Pt<15 GeV
rcf1244  Herwig 6.507, Y2004Y                           200         Finished  Yes   9<Pt<11 GeV
rcf1243  CuCu 62.4 GeV, Y2005C                          5           Finished  No    Same as rcf1242 + keep low-energy tracks
rcf1242  CuCu 62.4 GeV, Y2005C                          5           Finished  No    SVT tracking test, 10 keV e/m process cut (cf. rcf1237)
rcf1241  10 J/Psi, Y2005X, SVT out                      30          Finished  No    Study of the SVT material effect
rcf1240  10 J/Psi, Y2005X, SVT in                       30          Finished  No    Study of the SVT material effect
rcf1239  100 pi0, Y2005X, SVT out                       18          Finished  No    Study of the SVT material effect
rcf1238  100 pi0, Y2005X, SVT in                        20          Finished  No    Study of the SVT material effect
rcf1237  CuCu 62.4 GeV, Y2005C                          5           Finished  No    SVT tracking test, pilot run
rcf1236  Herwig 6.507, Y2004Y                           8           Finished  No    Test run for initial comparison with Pythia, 5<Pt<7 GeV
rcf1235  Pythia, Y2004Y                                 100         Finished  No    MSEL=2, min bias
rcf1234  Pythia, Y2004Y                                 90          Finished  No    MSEL=0, CKIN(3)=0, MSUB=91,92,93,94,95
rcf1233  Pythia, Y2004Y, sp.2 (CDF tune A)              308         Finished  Yes   4<Pt<5, MSEL=1, GHEISHA
rcf1232  Pythia, Y2004Y, sp.2 (CDF tune A)              400         Finished  Yes   3<Pt<4, MSEL=1, GHEISHA
rcf1231  Pythia, Y2004Y, sp.2 (CDF tune A)              504         Finished  Yes   2<Pt<3, MSEL=1, GHEISHA
rcf1230  Pythia, Y2004Y, sp.2 (CDF tune A)              104         Finished  Yes   35<Pt, MSEL=1, GHEISHA
rcf1229  Pythia, Y2004Y, sp.2 (CDF tune A)              208         Finished  Yes   25<Pt<35, MSEL=1, GHEISHA
rcf1228  Pythia, Y2004Y, sp.2 (CDF tune A)              216         Finished  Yes   15<Pt<25, MSEL=1, GHEISHA
rcf1227  Pythia, Y2004Y, sp.2 (CDF tune A)              216         Finished  Yes   11<Pt<15, MSEL=1, GHEISHA
rcf1226  Pythia, Y2004Y, sp.2 (CDF tune A)              216         Finished  Yes   9<Pt<11, MSEL=1, GHEISHA
rcf1225  Pythia, Y2004Y, sp.2 (CDF tune A)              216         Finished  Yes   7<Pt<9, MSEL=1, GHEISHA
rcf1224  Pythia, Y2004Y, sp.2 (CDF tune A)              216         Finished  Yes   5<Pt<7, MSEL=1, GHEISHA
rcf1223  Pythia special tune 2, Y2004Y, GCALOR          100         Finished  Yes   4<Pt<5, GCALOR
rcf1222  Pythia special tune 2, Y2004Y, GHEISHA         100         Finished  Yes   4<Pt<5, GHEISHA
rcf1221  Pythia special run 3, Y2004C                   100         Finished  Yes   ENER 200.0, MSEL 2, MSTP(51)=7, MSTP(81)=1, MSTP(82)=1, PARP(82)=1.9, PARP(83)=0.5, PARP(84)=0.2, PARP(85)=0.33, PARP(86)=0.66, PARP(89)=1000, PARP(90)=0.16, PARP(91)=1.0, PARP(67)=1.0
rcf1220  Pythia special run 2, Y2004C (CDF tune A)      100         Finished  Yes   ENER 200.0, MSEL 2, MSTP(51)=7, MSTP(81)=1, MSTP(82)=4, PARP(82)=2.0, PARP(83)=0.5, PARP(84)=0.4, PARP(85)=0.9, PARP(86)=0.95, PARP(89)=1800, PARP(90)=0.25, PARP(91)=1.0, PARP(67)=4.0
rcf1219  Pythia special run 1, Y2004C                   100         Finished  Yes   ENER 200.0, MSEL 2, MSTP(51)=7, MSTP(81)=1, MSTP(82)=1, PARP(82)=1.9, PARP(83)=0.5, PARP(84)=0.2, PARP(85)=0.33, PARP(86)=0.66, PARP(89)=1000, PARP(90)=0.16, PARP(91)=1.5, PARP(67)=1.0
rcf1218  Hijing 1.382 AuAu 200 GeV central, 0<b<3 fm    50          Finished  Yes   Statistics enhancement of rcf1209 with a smaller diamond: 60, +-30 cm, Y2004a
rcf1216  Hijing 1.382 CuCu 200 GeV minbias, 0<b<14 fm   52          Finished  Yes   Geometry: Y2005x
rcf1215  Hijing 1.382 AuAu 200 GeV minbias, 0<b<20 fm   100         Finished  Yes   Geometry: Y2004a, special D decays

2006

Statistics are listed in thousands of events.

Name     Description                              Statistics  Status    HPSS  Comment
rcf1289  AuAu 200 GeV central                     1           Finished  No    upgr06: Hijing, D0 and superposition
rcf1288  AuAu 200 GeV central                     0.8         Finished  No    upgr11: Hijing, D0 and superposition
rcf1287  AuAu 200 GeV min bias                    5           Finished  No    upgr11: Hijing, D0 and superposition
rcf1286  AuAu 200 GeV central                     1           Finished  No    upgr10: Hijing, D0 and superposition
rcf1285  AuAu 200 GeV min bias                    6           Finished  No    upgr10: Hijing, D0 and superposition
rcf1284  AuAu 200 GeV central                     1           Finished  No    upgr09: Hijing, D0 and superposition
rcf1283  AuAu 200 GeV min bias                    6           Finished  No    upgr09: Hijing, D0 and superposition
rcf1282  AuAu 200 GeV min bias                    38          Finished  No    upgr06: Hijing, D0 and superposition
rcf1281  AuAu 200 GeV min bias                    38          Finished  Yes   upgr08: Hijing, D0 and superposition
rcf1280  AuAu 200 GeV min bias                    38          Finished  Yes   upgr01: Hijing, D0 and superposition
rcf1279  AuAu 200 GeV min bias                    38          Finished  Yes   upgr07: Hijing, D0 and superposition
rcf1278  Extension of rcf1276: D0 superposition   5           Finished  No    upgr07: Z cut=+-300 cm
rcf1277  AuAu 200 GeV min bias                    -           Finished  No    upgr05: Z cut=+-300 cm
rcf1276  AuAu 200 GeV min bias                    35          Finished  No    upgr05: Hijing, D0 and superposition
rcf1275  Pythia 200 GeV + HF                      23*4        Finished  No    J/Psi and Upsilon(1S,2S,3S) mix for embedding
rcf1274  AuAu 200 GeV min bias                    10          Finished  No    upgr02 geo tag, |eta|<1.5 (tracking upgrade request)
rcf1273  Pythia 200 GeV                           600         Finished  Yes   Pt<2 (completing the rcf1224-1233 series)
rcf1272  CuCu 200 GeV min bias + D0 mix           50+2*50*8   Finished  Yes   Combinatorial boost of rcf1261, sigma: 60, +-30
rcf1233  Pythia 200 GeV                           300         Finished  Yes   4<Pt<5 (rcf1233 extension)
pds1232  Pythia 200 GeV                           200         Finished  Yes   3<Pt<4 (rcf1232 clone)
pds1231  Pythia 200 GeV                           240         Finished  Yes   2<Pt<3 (rcf1231 clone)
rcf1229  Pythia 200 GeV                           200         Finished  Yes   25<Pt<35 (rcf1229 extension)
rcf1228  Pythia 200 GeV                           200         Finished  Yes   15<Pt<25 (rcf1228 extension)
rcf1227  Pythia 200 GeV                           208         Finished  Yes   11<Pt<15 (rcf1227 extension)
rcf1226  Pythia 200 GeV                           200         Finished  Yes   9<Pt<11 (rcf1226 extension)
rcf1225  Pythia 200 GeV                           200         Finished  Yes   7<Pt<9 (rcf1225 extension)
rcf1224  Pythia 200 GeV                           212         Finished  Yes   5<Pt<7 (rcf1224 extension)
rcf1271  Pythia 200 GeV, Y2004Y, CDF_A            120         Finished  Yes   55<Pt<65
rcf1270  Pythia 200 GeV, Y2004A, CDF_A            120         Finished  Yes   45<Pt<55
rcf1266  CuCu 200 GeV min bias                    10          Finished  Yes   SVT study: clams and two ladders
rcf1265  CuCu 200 GeV min bias                    10          Finished  Yes   SVT study: clams displaced
rcf1264  CuCu 200 GeV min bias                    10          Finished  Yes   SVT study: rotation of the barrel
rcf1262  CuCu 62.4 GeV min bias + D0 mix          50*3        Finished  Yes   3 subsets: Hijing, single D0, and the mix
rcf1261  CuCu 200 GeV min bias + D0 mix           50*3        Finished  No    3 subsets: Hijing, single D0, and the mix
rcf1260  1 J/Psi over 200 GeV minbias AuAu        10          Finished  No    J/Psi mixed with 200 GeV AuAu Hijing, Y2004Y, 60/35 vertex

2007

Unless stated otherwise, all pp collisions are modeled with Pythia, and all AA collisions with Hijing. Statistics are listed in thousands of events. The multiplication factor in some of the records reflects the fact that event mixing was done for a few types of particles on the same base of original event files.

Name System/Energy Statistics Status HPSS Comment Site
rcf1290 AuAu200 0<b<3fm, Zcut=5cm 32*5 Done Yes Hijing+D0+Lac2+D0_mix+Lac2_mix rcas
rcf1291 pp200/UPGR07/Zcut=10cm 10 Done Yes ISUB = 11, 12, 13, 28, 53, 68 rcas
rcf1292 pp500/UPGR07/Zcut=10cm 10 Done Yes ISUB = 11, 12, 13, 28, 53, 68 rcas
rcf1293 pp200/UPGR07/Zcut=30cm 205 Done Yes ISUB = 11, 12, 13, 28, 53, 68 rcas
rcf1294 pp500/UPGR07/Zcut=30cm 10 Done Yes ISUB = 11, 12, 13, 28, 53, 68 rcas
rcf1295 AuAu200 0<b<20fm, Zcut=30cm 20 Done Yes QA run for the Y2007 tag rcas
rcf1296 AuAu200 0<b<3fm, Zcut=10cm 100*5 Done Yes Hijing,B0,B+,B0_mix,B+_mix, Y2007 rcas
rcf1297 AuAu200 0<b<20fm, Zcut=300cm 40 Done Yes Pile-up simulation in the TUP studies, UPGR13 rcas
rcf1298 AuAu200 0<b<3fm, Zcut=15cm 100*5 Done Part Hijing,D0,Lac2,D0_mix,Lac2_mix, UPGR13 rcas
rcf1299 pp200/Y2005/Zcut=50cm 800 Done Yes Pythia, photon mix, pi0 mix rcas
rcf1300 pp200/UPGR13/Zcut=15cm 100 Done No Pythia, MSEL=4 (charm) rcas
rcf1301 pp200/UPGR13/Zcut=300cm 84 Done No Pythia, MSEL=1, wide vertex rcas
rcf1302 pp200 Y2006C 120 Done No Pythia for Spin PWG, Pt(45,55)GeV rcas
rcf1303 pp200 Y2006C 120 Done No Pythia for Spin PWG, Pt(35,45)GeV rcas
rcf1304 pp200 Y2006C 120 Done No Pythia for Spin PWG, Pt(55,65)GeV rcas
rcf1296 Upsilon S1,S2,S3 + Hijing 15*3 Done No Muon Telescope Detector, ext.of 1296 rcas
rcf1306 pp200 Y2006C 400 Done Yes Pythia for Spin PWG, Pt(25,35)GeV rcas
rcf1307 pp200 Y2006C 400 Done Yes Pythia for Spin PWG, Pt(15,25)GeV rcas
rcf1308 pp200 Y2006C 420 Done Yes Pythia for Spin PWG, Pt(11,15)GeV rcas
rcf1309 pp200 Y2006C 420 Done Yes Pythia for Spin PWG, Pt(9,11)GeV rcas
rcf1310 pp200 Y2006C 420 Done Yes Pythia for Spin PWG, Pt(7,9)GeV rcas
rcf1311 pp200 Y2006C 400 Done Yes Pythia for Spin PWG, Pt(5,7)GeV rcas
rcf1312 pp200 Y2004Y 544 Done No Di-jet CKIN(3,4,7,8,27,28)=7,9,0.0,1.0,-0.4,0.4 rcas
rcf1313 pp200 Y2004Y 760 Done No Di-jet CKIN(3,4,7,8,27,28)=9,11,-0.4,1.4,-0.5,0.6 rcas
rcf1314 pp200 Y2004Y 112 Done No Di-jet CKIN(3,4,7,8,27,28)=11,15,-0.2,1.2,-0.6,-0.3 Grid
rcf1315 pp200 Y2004Y 396 Done No Di-jet CKIN(3,4,7,8,27,28)=11,15,-0.5,1.5,-0.3,0.4 Grid
rcf1316 pp200 Y2004Y 132 Done No Di-jet CKIN(3,4,7,8,27,28)=11,15,0.0,1.0,0.4,0.7 Grid
rcf1317 pp200 Y2006C 600 Done Yes Pythia for Spin PWG, Pt(4,5)GeV Grid
rcf1318 pp200 Y2006C 690 Done Yes Pythia for Spin PWG, Pt(3,4)GeV Grid
rcf1319 pp200 Y2006C 690 Done Yes Pythia for Spin PWG, Minbias Grid
rcf1320 pp62.4 Y2006C 400 Done No Pythia for Spin PWG, Pt(4,5)GeV Grid
rcf1321 pp62.4 Y2006C 250 Done No Pythia for Spin PWG, Pt(3,4)GeV Grid
rcf1322 pp62.4 Y2006C 220 Done No Pythia for Spin PWG, Pt(5,7)GeV Grid
rcf1323 pp62.4 Y2006C 220 Done No Pythia for Spin PWG, Pt(7,9)GeV Grid
rcf1324 pp62.4 Y2006C 220 Done No Pythia for Spin PWG, Pt(9,11)GeV Grid
rcf1325 pp62.4 Y2006C 220 Done No Pythia for Spin PWG, Pt(11,15)GeV Grid
rcf1326 pp62.4 Y2006C 200 Running No Pythia for Spin PWG, Pt(15,25)GeV Grid
rcf1327 pp62.4 Y2006C 200 Running No Pythia for Spin PWG, Pt(25,35)GeV Grid
rcf1328 pp62.4 Y2006C 50 Running No Pythia for Spin PWG, Pt(35,45)GeV Grid

2009

 

Name     System/Energy    Range     Statistics  Comment
rcf9001  pp200, y2007g    03_04gev  690k        Jet study AuAu200(PP200) JLC PWG
rcf9002  pp200, y2007g    04_05gev  686k
rcf9003  pp200, y2007g    05_07gev  398k
rcf9004  pp200, y2007g    07_09gev  420k
rcf9005  pp200, y2007g    09_11gev  412k
rcf9006  pp200, y2007g    11_15gev  420k
rcf9007  pp200, y2007g    15_25gev  397k
rcf9008  pp200, y2007g    25_35gev  400k
rcf9009  pp200, y2007g    35_45gev  120k
rcf9010  pp200, y2007g    45_55gev  118k
rcf9011  pp200, y2007g    55_65gev  120k

Name     System/Energy    Range      Statistics  Comment
rcf9021  pp200, y2008     03_04 GeV  690k        Jet study AuD200(PP200) JLC PWG
rcf9022  pp200, y2008     04_05 GeV  686k
rcf9023  pp200, y2008     05_07 GeV  398k
rcf9024  pp200, y2008     07_09 GeV  420k
rcf9025  pp200, y2008     09_11 GeV  412k
rcf9026  pp200, y2008     11_15 GeV  420k
rcf9027  pp200, y2008     15_25 GeV  397k
rcf9028  pp200, y2008     25_35 GeV  400k
rcf9029  pp200, y2008     35_45 GeV  120k
rcf9030  pp200, y2008     45_55 GeV  118k
rcf9031  pp200, y2008     55_99 GeV  120k

Name     System/Energy    Range     Statistics  Comment
rcf9041  pp500, Y2009     03_04gev  500k        Spin study PP500, Spin group (Matt, Jim, Jan), 2.3M events total
rcf9042  pp500, Y2009     04_05gev  500k
rcf9043  pp500, Y2009     05_07gev  300k
rcf9044  pp500, Y2009     07_09gev  250k
rcf9045  pp500, Y2009     09_11gev  200k
rcf9046  pp500, Y2009     11_15gev  100k
rcf9047  pp500, Y2009     15_25gev  100k
rcf9048  pp500, Y2009     25_35gev  100k
rcf9049  pp500, Y2009     35_45gev  100k
rcf9050  pp500, Y2009     45_55gev  25k
rcf9051  pp500, Y2009     55_99gev  25k

Name     System/Energy    Range  Statistics  Comment
rcf9061  CuCu200, y2005h  B0_14  200k        CuCu200 radiation length budget, Y. Fisyak, KyungEon Choi
rcf9062  AuAu200, y2007h  B0_14  150k        AuAu200 radiation length budget, Y. Fisyak, KyungEon Choi

 

2010

Information on Monte Carlo Data Samples

 
 
Geometry y2009a
Library SL09g
Generator Pythia 6.4.22
Tune 320
Field -5.0
ETA -10 < η < +10
PHI -π < φ < +π
vertex 0, 0, -2
width 0.015, 0.015, 42.0
  
Sample    Channel                           Events
rcf10000  W+ → e+ nu                        10k
rcf10001  W- → e- nu                        6k
rcf10002  W+ → tau+ nu / W- → tau- nu       10k
rcf10003  pp → W+/- + jet                   10k
rcf10004  Z e+e-, no Z/gamma interference   4k
rcf10005  Z all but e+e-                    10k
rcf10006  QCD w/ partonic pT > 35 GeV       100k
 

Geometry Tag Options

 This page documents the options in geometry.g which define each of the production tags.

Geometry Tag Options II

The attached spreadsheets document the production tags in STARSIM on 11/30/2009.  At that time the y2006h and y2010 tags were in development and not ready for production.

Material Balance Histograms

Material balance histograms for selected geometry tags; for each tag there is a full-detector set (*Star) and a TPC-only set (*Tpce).

Y2008a: y2008aStar, y2008aTpce (full and TPC-only material histograms)
y2005g: y2005gStar, y2005gTpce
y2008yf: y2008yfStar, y2008yfTpce
y2009: y2009Star, y2009Tpce

STAR AgML Geometry Comparison with STARSIM/AgSTAR

STAR Geometry Comparison: AgML vs AgSTAR

At the left is a general status for each geometry tag which compiles in AgML.  All volumes are tested recursively except for the "IBEM" and similar support structures for the VPD, and the Endcap SMD strips.  (The ESMD planes are tested as a unit, rather than testing all 2*12*288 SMD strips.)

Color codes:

Green: No differences larger than 1%

Yellow: The volume did not appear in AgSTAR geometry

Orange: The difference was larger than 1%, but the absolute difference is negligible.

Red: A difference larger than 1% was detected for a significant amount of material; or a negligible but widespread difference was detected. 

At the right is a PDF file for each geometry tag. For each volume we show two plots. The top plot shows the absolute number of radiation lengths which a geantino encounters traversing the geometry, starting at the origin and following a straight line at the given pseudorapidity; we average over all phi. The left (right) hashes show the AgML (AgSTAR) geometry. The difference (expressed as a fractional value) between the two histograms is shown in the lower plot. Frequently the differences are small, e.g. 10^-6, and ROOT rescales the plots accordingly. Since it is difficult to read the scales of so many plots at once, we have color coded the plots (the coding seems to fail in the generation of some histograms). The meaning of the color coding is summarized above.

<?php
/********************************************************************** START OF PHP */

/* =======================================================
   Helper function to show the status_yXXXX.png
   ======================================================= */
function showImage( $tag, $dir ) {
   echo "<img src=\"$dir/status_$tag.png\" />";
}

/* =======================================================
   Helper function to show the PDF file
   ======================================================= */
function showGoogle( $tag, $dir ) {
   echo "<iframe frameborder=\"0\" style=\"width: 562px; height: 705px;\" src=\"http://docs.google.com/gview?url=$dir/$tag.pdf&amp;embedded=true\"></iframe>";
}

/* =======================================================
   First some PHP input... find the date of the comparison
   ======================================================= */
$YEAR="2011";
$DATE="06-15-2011";
$DIR="http://www.star.bnl.gov/~jwebb/".$YEAR."/".$DATE."/AgML-Comparison/";
$TAGS=$DIR."TAGS";

/* =======================================================
   Output header for this page
   ======================================================= */
echo "<h3>STAR AgML vs AgSTAR Comparison on ".$DATE."</h3>";

/* =======================================================
   Read in each line in the TAGS file
   ======================================================= */
$handle = @fopen("$TAGS", "r");
if ($handle) {
    while (($buffer = fgets($handle, 4096)) !== false) {

        /* Trim the whitespace out of the string */
        $buffer = trim($buffer);

        /* Draw an HRULE and specify which geometry tag we are using */
        echo "<hr><p>STAR Geometry Tag $buffer</p>";

        /* Now build a 2-entry table with the status PNG on the left
           and the summary PDF ala google docs on the right */
        showImage( $buffer, $DIR );
        showGoogle( $buffer, $DIR );
    }
    if (!feof($handle)) {
        echo "Error: unexpected fgets() fail\n";
    }
    fclose($handle);
}
/************************************************************************ END OF PHP */
?>

STAR AgML Language Reference

STAR Geometry Page

R&D Tags

The R&D conducted for the inner tracking upgrade required that a few specialized geometry tags be created. For a complete set of geometry tags, please visit the STAR Geometry in simulation & reconstruction page. The below serves as additional documentation and details.

Taxonomy:

  • SSD: Silicon strip detector
  • IST: Inner Silicon Tracker
  • HFT: Heavy Flavor Tracker
  • IGT: Inner GEM Tracker
  • HPD: Hybrid Pixel Detector

The TPC is present in all configurations listed below; the SVT is in none.

A "+" indicates that the detector is included in the tag.

Tag     SSD  IST  HFT  IGT  HPD  Contact Person  Comment
UPGR01   +         +
UPGR02        +    +
UPGR03        +    +    +
UPGR04   +                   +   Sevil           retired
UPGR05   +    +    +    +    +   Everybody       retired
UPGR06   +         +         +   Sevil           retired
UPGR07   +    +    +    +        Maxim
UPGR08        +    +    +    +   Maxim
UPGR09        +    +         +   Gerrit          retired; outer IST layer only
UPGR10   +    +    +             Gerrit          Inner IST@9.5cm
UPGR11   +    +    +             Gerrit          IST @9.5 & @17.0
UPGR12   +    +    +    +    +   Ross Corliss    retired; UPGR05 * diff. IGT radii
UPGR13   +    +    +    +        Gerrit          UPGR07 * (new 6-disk FGT) * corrected SSD * (no West Cone)
UPGR14   +         +    +        Gerrit          UPGR13 - IST
UPGR15   +    +    +             Gerrit          Simple geometry for testing: single IST@14cm, hermetic/polygon Pixel/IST geometry, only inner beam pipe 0.5 mm Be, Pixel 300 um Si, IST 1236 um Si
UPGR20   +                       Lijuan          Y2007 + one TOF
UPGR21   +                       Lijuan          UPGR20 + full TOF

Eta coverage of the SSD and HFT at different vertex spreads:

Z cut, cm   eta SSD   eta HFT
 5          1.63      2.00
10          1.72      2.10
20          1.87      2.30
30          2.00      2.55

Material balance studies for the upgrade: presented below are the usual radiation length plots (as a function of rapidity).

Full UPGR05:

Forward region: the FST and the IGT only:

Below, we plot the material for each individual detector, excluding the forward region to reduce ambiguity.

SSD:

IST:

HPD:

HFT:

 

Event Filtering

The attached PDF describes event filtering in the STAR framework.

Event Generators

Event Generator Framework
Example macros for running event generators + starsim in ROOT:
$ cvs co StRoot/StarGenerator/macros
  • starsim.pythia6.C
  • starsim.pythia8.C
  • starsim.hijing.C
  • starsim.herwig.C
  • starsim.pepsi.C
  • starsim.starlight.C
  • starsim.kinematics.C
To run an example macro and generate 100 events:
$ ln -s StRoot/StarGenerator/macros/starsim.pythia8.C starsim.C
$ root4star -q -b starsim.C\(100\)
This will generate two files: a standard "fzd" file, which can be reconstructed using the big "full" chain (bfc.C), and a ROOT file containing a TTree expressing the event record for the generated events.

The new Event Record

The event-wise and particle-wise information from event generators is saved in a ROOT TTree.  The TTree can be read in sync with the MuDst when you perform your analysis.  The ID truth values of the reconstructed tracks in the MuDst can be compared to the primary keys of the tracks in the event record to identify generator tracks which were reconstructed by the tracker.

The event record can be browsed using the standard ROOT ttree viewer.  Example for pythia 8:
root [0] TFile::Open("pythia8.starsim.root")
root [1] genevents->StartViewer()
root [2] genevents->Draw("mMass","mStatus>0")
The event record contains both particle-wise and event-wise information.  For the definitions of different quantities, see the documentation provided in the StarGenEvent links above.



Adding a new Generator

Event generators are responsible for creating the final state particles which are fed out to GEANT for simulation.  They can be as simple as particle guns, shooting individual particles along well defined trajectories, or complex hydrodynamical models of heavy-ion collisions.  Regardless of the complexities of the underlying physical model, the job of an event generator in the STAR framework is to add particles to an event record.  In this document we will describe the steps needed to add a new event generator to the STAR framework.  The document will be divided into three sections: (1)  An overview, presenting the general steps which are required; (2) A FORtran-specific HOWTO, providing guidance specific to the problem of interfacing FORtran with a C++ application; and (3) A document describing the STAR event record.

Contents:
1.0 Integrating Event Generators
2.0 Integrating FORtran Event Generators
3.0 The STAR Event Record


1.0 Integrating Event Generators

The STAR Event Generator Framework implements several C++ classes which facilitate the integration of FORtran and C++ event generators with the STAR simulation code.  The code is available in the CVS repository and can be checked out as
$ cvs co StRoot/StarGenerator
After checking out the generator area you will note that the code is organized into several directories, containing both CORE packages and concrete event generators.  Specifically:

StarGenerator/BASE  -- contains the classes implementing the STAR interface to event generators
StarGenerator/EVENT -- contains the classes implementing the STAR event record
StarGenerator/UTIL  -- contains random number generator base class and particle data
StarGenerator/TEST  -- contains test makers used for validating the event generators


The concrete event generators (at the time this document was assembled) include

StarGenerator/Hijing1_383
StarGenerator/Pepsi
StarGenerator/Pythia6_4_23
StarGenerator/Pythia8_1_62


1.1 Compiling your Generator

Your first task in integrating a new event generator is to create a directory for it under StarGenerator, and get your code to compile.   You should select a name for your directory which includes the name and version of your event generator.  It should also be CamelCased...  MyGenerator1_2_3, for example.  (Do not select a name which is ALL CAPS, as this has a special meaning for compilation).  Once you have your directory, you can begin moving your source files into the build area.  In general, we would like to minimize the number of edits to the source code to make it compile.  But you may find that you need to reorganize the directory structure of your code to get it to compile under cons.  (For certain, if your FORtran source ends in ".f" you will need to rename the file to ".F", and if your C++ files end in ".cpp" or ".cc", you may need to rename to ".cxx".)

1.2 Creating your Interface

Ok.  So the code compiles.  Now we need to interface the event generation machinery with the STAR framework.  This entails several things.  First, we need to expose the configuration of the event generator so that the end user can generate the desired event sample.  We must then initialize the concrete event generator at the start of the run, and then exercise the event generation machinery on each and every event.  Finally, we need to loop over all of the particles which were created by the event generator and push them onto the event record so that they are persistent (i.e. the full event can be analyzed at a later date) and so that the particles are made available to the Monte Carlo application for simulation.

The base class for all event generator interfaces is  StarGenerator

Taking a quick look at the code, we see that there are several "standard" methods defined for configuring an event generator.  These methods have been defined in order to establish a common interface amongst all event generators in STAR; they set variables defined within the class, from which you will initialize your concrete event generator.
You may need to implement additional methods in order to expose the configuration of your event generator.  You should, of course, do this.

The two methods which StarGenerator requires you to implement are Init() and Generate().  These methods will respectively be called at the start of each run, and during each event.

Init() is responsible for initializing the event generator.  In this method, you should pass any of the configuration information on to your concrete event generator.  This may be through calls to subroutines in your event generator, or by setting values in common blocks.  However this is done, this is the place to do it.

Generate() will be called on every single event during the run.   This is where you should exercise the generation machinery of your event generator.  Every event generator handles this differently, so you will need to consult your manual to figure out the details.

Once Generate() has been called, you are ready to fill the event record.  The event record consists of two parts: (1) the particle record, and (2) the event-wise information describing the physical interaction being simulated.  At a minimum, you will need to fill the particle-wise information.  For more details, see The STAR Event Record  below.
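Putting these pieces together, a concrete interface class has roughly the following shape. This is only a sketch: the header path, constructor signature and return values are assumptions and should be checked against StarGenerator/BASE itself.

#include "StarGenerator/BASE/StarGenerator.h"   // header path assumed

class StarMyGenerator : public StarGenerator
{
public:
  StarMyGenerator(const char* name = "MyGenerator1_2_3") : StarGenerator(name) { }

  // Called at the start of the run: push the stored configuration
  // (frame, beam species, energy, ...) down to the concrete generator.
  Int_t Init()     { /* call your generator's setup routines here */ return kStOK; }

  // Called on every event: run the generation machinery, then loop
  // over the final-state particles and push them onto the event record.
  Int_t Generate() { /* generate one event and fill the record */ return kStOK; }
};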


2.0 Integrating FORtran Event Generators

Interfacing a FORtran event generator with ROOT involves three steps:

1. Interface the event generator's common blocks (at least the ones which we need to use) to C++
2. Map those common blocks onto C++ structures
3. Expose the C++ structures representing the common blocks to ROOT so that users may modify / access their contents

Let's look at the pythia event generator for a concrete example. 

If you examine the code in StRoot/StarGenerator/Pythia6_4_23/ there is a FORtran file named address.F.  Open that up in your favorite editor and have a look... You'll see several functions defined.  The first one is address_of_pyjets.  In it we declare the PYJETS common block, essentially just cutting and pasting the declaration from the pythia source code in the same directory.
We use the intrinsic function LOC to return the address (i.e. pointer) to the first variable in the common block.  We have just created a FORtran function which returns a pointer to the data stored in this common block.  The remaining functions in address.F simply expose the remaining common blocks in pythia which we want access to.  By calling this function on the C++ side, we will obtain a pointer to the memory address where the common block resides.

Next we need to describe the memory layout to C++.  This is done in the file Pythia6.h.  Each common block setup in address.F has a corresponding structure defined in this header file.  So, let's take a look at the setup for the PyJets common block:
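A sketch of that setup, reconstructed from the standard Pythia 6 declaration COMMON/PYJETS/N,NPAD,K(4000,5),P(4000,5),V(4000,5); the accessor methods are illustrative of the FORtran-style indexing discussed under "What about the underscore?" below:

#define address_of_pyjets F77_NAME( address_of_pyjets, ADDRESS_OF_PYJETS )

struct PyJets_t {
  Int_t    n;              // number of entries in the event record
  Int_t    npad;
  Int_t    _k[5][4000];    // INTEGER K(4000,5) -- note the reversed indices
  Double_t _p[5][4000];    // DOUBLE PRECISION P(4000,5)
  Double_t _v[5][4000];    // DOUBLE PRECISION V(4000,5)
  // Accessors emulating FORtran indexing (count from 1, original index order):
  Int_t&    k(Int_t i, Int_t j) { return _k[j-1][i-1]; }
  Double_t& p(Int_t i, Int_t j) { return _p[j-1][i-1]; }
  Double_t& v(Int_t i, Int_t j) { return _v[j-1][i-1]; }
};
extern "C" PyJets_t* address_of_pyjets();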




First, notice the first line where we call the c-preprocessor macro "F77_NAME".  This line handles, in a portable way, the different conventions between FORtran and C++ compilers, when linking together object codes. 

Next, let's discuss "memory layout".  In this section of the code we map the FORtran memory onto a C-structure.   Every variable in the common block should be declared in the same order in the C struct as it was declared in FORtran, and with the corresponding C data type.  These are:
INTEGER          --> Int_t
REAL             --> Float_t
REAL *4          --> Float_t 
REAL *8          --> Double_t 
DOUBLE PRECISION --> Double_t
You probably noticed that there are two differences in the way we have declared the arrays.  First, the arrays were all declared with an "_" in front of their names.  This was a choice on my part, which I will explain in a moment.  The important thing to notice right now is that the indices on the arrays are reversed, compared to their declaration in FORtran.  "INTEGER K(4000,5)" in FORtran becomes "Int_t _k[5][4000]" in C++.  The reason for this is that C++ and FORtran represent arrays differently in memory.  It is important to keep these differences in mind when mapping the memory of a FORtran common block --

1) The indices in the arrays will always be reversed between FORtran and C --   A(10,20,30) in FORtran becomes A[30][20][10] in C.
2) FORtran indices (by default) start from 1, C++ (always) from 0 --  i.e. B(1) in FORtran would be B[0] in C.
3) FORtran indices may start from any value.  An array declared as D(-10:10) would be declared in C as D[21], and D(-10) in FORtran is D[0] in C.

What about the underscore?

We need to make some design choices at this point.  Specifically, how do we expose the common blocks to the end user?  Do we want the end user to deal with the differences between C++ and FORtran, or do we want to provide a mechanism by which the FORtran behavior (i.e. count from 1, preserve the order of indices) can be emulated?

My preference is to do the latter -- provide the end user with functions which emulate the behavior of the FORtran arrays, because these arrays are what is documented in the event generator's manual.  This will minimize the likelihood that the end user will make mistakes in configuring the event generator.


So we have created a c-struct which describes how the common block's memory is laid out, and we have defined a function in the FORtran library which returns the location of the memory address of the common block.  Now we need to expose that function to C++.  To do that, we need to declare a prototype of the function.  There are two things we need to do.

First, we need to define the name of the subroutine in a portable way.  This is done using a macro defined in #include "StarCallf77.h" --

#define address_of_pyjets F77_NAME( address_of_pyjets, ADDRESS_OF_PYJETS )

Next we need to declare to C++ that address_of_pyjets can be found in an external library, and will return a pointer to the PyJets_t structure

extern "C" PyJets_t *address_of_pyjets();

Now we are almost done.  We need to add a function in our generator class which returns a pointer (or reference) to the common blocks, and we need to add PyJets_t to the ROOT dictionary... In MyGeneratorLinkDef.h, add the line

#pragma link C++ struct PyJets_t+;

Finally, you need to expose the FORtran subroutines to C++.  Again, take a look at the code in Pythia6.h and Pythia6.cxx.  In the header we declare wrapper functions around the FORtran subroutines, and in the implementation file we expose the FORtran subroutines to C++.

Our first step is declaring the prototypes of the subroutines and implementing the C++ interface.  Consider the SUBROUTINE PYINIT in pythia, which initializes the event generator.  In FORtran it is declared as

      SUBROUTINE PYINIT(FRAME,BEAM,TARGET,WIN)
      IMPLICIT DOUBLE PRECISION(A-H, O-Z)
      IMPLICIT INTEGER(I-N)
...
      CHARACTER*(*) FRAME,BEAM,TARGET


So the variables FRAME, BEAM and TARGET are declared as character variables, and WIN is implicitly a double precision variable.

There are several webpages which show how to interface fortran and c++, e.g. http://www.yolinux.com/TUTORIALS/LinuxTutorialMixingFortranAndC.html

It is really a system-dependent thing.  We're keeping it simple and only supporting Linux. 

#define pyinit F77_NAME(pyinit,PYINIT) /* pythia initialization */
extern "C" void   type_of_call  pyinit( const char *frame,
                                        const char *beam,
                                        const char *targ,
                                        double *ener,
                                        int nframe,
                                        int nbeam,
                                        int ntarg );

So there are three character variables declared: frame, beam and targ.  These correspond to the character variables on the FORtran side.  There is also a double precision variable ener... this is WIN.  But then there are three integer variables nframe, nbeam and ntarg.  FORtran expects to get the sizes of the character variables when the subroutine is called, and it gets them after the last arguments in the list.

Again, we would like to hide this from the end user... so I like to define wrapper functions such as

#include <string>

void PyInit( string frame, string blue, string yellow, double energy )
{
   pyinit( frame.c_str(), blue.c_str(), yellow.c_str(), &energy,
           frame.size(),
           blue.size(),
           yellow.size() );
}
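With this wrapper in place, a call from the C++ side reads just like the FORtran documentation (the arguments shown are only an example):

PyInit( "CMS", "p", "p", 510.0 );   // equivalent to CALL PYINIT('CMS','p','p',510D0)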


3.0 The STAR Event Record

List of Event Generators

Event generators currently integrated into starsim using the root4star framework (11/29/12):

  • Pythia 6.4.23
  • Pythia 8.1.62
  • Hijing 1.383
  • Herwig 6.5.20
  • StarLight
  • Pepsi

To run, checkout StRoot/StarGenerator/macros and modify the appropriate example ROOT macro for your purposes.

Event generators currently implemented in the starsim framework (11/29/12):

  • Hijing 1.381
  • Hijing 1.382
  • Pythia 6.2.05
  • Pythia 6.2.20
  • Pythia 6.4.10
  • Pythia 6.4.22
  • Pythia 6.4.26
  • StarLight (fortran)

The STAR Event Record

Geometry Tags



Geometry Tag y2000
   CaveGeo   PipeGeo   UpstGeo   SvttGeo   tpcegeo
  BtofGeo2   CalbGeo   ZcalGeo   MagpGeo


Geometry Tag y2001
   CaveGeo   PipeGeo   UpstGeo  SvttGeo1   tpcegeo
   FtpcGeo  SupoGeo1  BtofGeo2   VpddGeo   CalbGeo
   richgeo   EcalGeo   ZcalGeo   MagpGeo


Geometry Tag y2002
   CaveGeo   PipeGeo   UpstGeo   SvttGeo   tpcegeo
   FtpcGeo   SupoGeo  BtofGeo2   VpddGeo   CalbGeo
   richgeo   BbcmGeo   fpdmgeo   ZcalGeo   MagpGeo


Geometry Tag y2003
   CaveGeo   PipeGeo   UpstGeo   SvttGeo   tpcegeo
   FtpcGeo   SupoGeo  BtofGeo2   VpddGeo   CalbGeo
   EcalGeo   BbcmGeo   fpdmgeo   ZcalGeo   MagpGeo


Geometry Tag y2003x
   CaveGeo   PipeGeo   UpstGeo  SvttGeo2   tpcegeo
   FtpcGeo  SupoGeo1  BtofGeo2   VpddGeo   CalbGeo
   EcalGeo   BbcmGeo   fpdmgeo   ZcalGeo   MagpGeo
   PhmdGeo


Geometry Tag y2004a
   CaveGeo   PipeGeo   UpstGeo  SvttGeo3   tpcegeo
   FtpcGeo  SupoGeo1  BtofGeo2   VpddGeo  CalbGeo1
   EcalGeo   BbcmGeo  FpdmGeo1   ZcalGeo   MagpGeo
   SisdGeo   PhmdGeo


Geometry Tag y2004c
   CaveGeo   PipeGeo   UpstGeo  SvttGeo4  TpceGeo1
   FtpcGeo  SupoGeo1  BtofGeo2   VpddGeo  CalbGeo1
   EcalGeo   BbcmGeo  FpdmGeo1   ZcalGeo   MagpGeo
  SisdGeo1   PhmdGeo


Geometry Tag y2004y
   CaveGeo   PipeGeo   UpstGeo  SvttGeo4  TpceGeo1
  FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo2   VpddGeo
   CalbGeo   EcalGeo   BbcmGeo  FpdmGeo1   ZcalGeo
   MagpGeo  SisdGeo2   PhmdGeo


Geometry Tag y2005
   CaveGeo   PipeGeo   UpstGeo  SvttGeo3   tpcegeo
  FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo2   VpddGeo
  CalbGeo1   EcalGeo   BbcmGeo  FpdmGeo1   ZcalGeo
   MagpGeo  SisdGeo2   PhmdGeo


Geometry Tag y2005b
   CaveGeo   PipeGeo   UpstGeo  SvttGeo4  TpceGeo1
  FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo2   VpddGeo
  CalbGeo1   EcalGeo   BbcmGeo  FpdmGeo1   ZcalGeo
   MagpGeo  SisdGeo2   PhmdGeo


Geometry Tag y2005f
   CaveGeo   PipeGeo   UpstGeo  SvttGeo6  TpceGeo1
  FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo4   VpddGeo
  CalbGeo2   EcalGeo   BbcmGeo  FpdmGeo1   ZcalGeo
   MagpGeo  SisdGeo6   PhmdGeo


Geometry Tag y2005g
   CaveGeo   PipeGeo   UpstGeo SvttGeo11  TpceGeo1
  FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo4   VpddGeo
  CalbGeo2   EcalGeo   BbcmGeo  FpdmGeo1   ZcalGeo
   MagpGeo  SisdGeo6   PhmdGeo


Geometry Tag y2005h
   CaveGeo   PipeGeo   UpstGeo SvttGeo11  tpcegeo3
  FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo4   VpddGeo
  CalbGeo2   EcalGeo   BbcmGeo  FpdmGeo1   ZcalGeo
   MagpGeo  SisdGeo6   PhmdGeo


Geometry Tag y2005i
   CaveGeo   PipeGeo   UpstGeo SvttGeo11  tpcegeo3
  FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo4   VpddGeo
  CalbGeo2  EcalGeo6   BbcmGeo  FpdmGeo1   ZcalGeo
   MagpGeo  SisdGeo6   PhmdGeo


Geometry Tag y2006
   CaveGeo   PipeGeo   UpstGeo  SvttGeo6  TpceGeo2
  FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo4   VpddGeo
  CalbGeo1   EcalGeo   BbcmGeo  FpdmGeo1   ZcalGeo
   MagpGeo  SisdGeo3   MutdGeo   PhmdGeo


Geometry Tag y2006c
   CaveGeo   PipeGeo   UpstGeo  SvttGeo6  TpceGeo2
  FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo4   VpddGeo
  CalbGeo2   EcalGeo   BbcmGeo  FpdmGeo2   ZcalGeo
   MagpGeo  SisdGeo6   MutdGeo


Geometry Tag y2006g
   CaveGeo   PipeGeo   UpstGeo SvttGeo11  TpceGeo2
  FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo4   VpddGeo
  CalbGeo2   EcalGeo   BbcmGeo  FpdmGeo2   ZcalGeo
   MagpGeo  SisdGeo6   MutdGeo


Geometry Tag y2006h
   CaveGeo   PipeGeo   UpstGeo SvttGeo11  tpcegeo3
  FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo4   VpddGeo
  CalbGeo2  EcalGeo6   BbcmGeo  FpdmGeo2   ZcalGeo
   MagpGeo  SisdGeo6   MutdGeo


Geometry Tag y2007
   CaveGeo   PipeGeo   UpstGeo  SvttGeo6  TpceGeo2
  FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo5  VpddGeo2
  CalbGeo2   EcalGeo   BbcmGeo  FpdmGeo3   ZcalGeo
   MagpGeo  SisdGeo6   MutdGeo   PhmdGeo


Geometry Tag y2007g
   CaveGeo   PipeGeo   UpstGeo SvttGeo11  TpceGeo2
  FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo5  VpddGeo2
  CalbGeo2   EcalGeo   BbcmGeo  FpdmGeo3   ZcalGeo
   MagpGeo  SisdGeo6   MutdGeo   PhmdGeo


Geometry Tag y2007h
   CaveGeo   PipeGeo   UpstGeo SvttGeo11  tpcegeo3
  FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo5  VpddGeo2
  CalbGeo2   EcalGeo   BbcmGeo  FpdmGeo3   ZcalGeo
   MagpGeo  SisdGeo6   MutdGeo   PhmdGeo


Geometry Tag y2008
   CaveGeo   PipeGeo   UpstGeo  TpceGeo2  FtpcGeo1
  SupoGeo1   FtroGeo  BtofGeo6  VpddGeo2  CalbGeo2
   EcalGeo   BbcmGeo  FpdmGeo3   ZcalGeo   MagpGeo
  MutdGeo3


Geometry Tag y2008a
   CaveGeo   PipeGeo   UpstGeo   SconGeo  TpceGeo2
  FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo6  VpddGeo2
  CalbGeo2   EcalGeo   BbcmGeo  FpdmGeo3   ZcalGeo
   MagpGeo  MutdGeo3


Geometry Tag y2008b
   CaveGeo   PipeGeo   UpstGeo   SconGeo  tpcegeo3
  FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo6  VpddGeo2
  CalbGeo2  EcalGeo6   BbcmGeo  FpdmGeo3   ZcalGeo
   MagpGeo  MutdGeo3


Geometry Tag y2008c
   CaveGeo   PipeGeo   UpstGeo   SconGeo  tpcegeo3
  FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo7  VpddGeo2
  CalbGeo2  EcalGeo6   BbcmGeo  FpdmGeo3   ZcalGeo
   MagpGeo  MutdGeo3


Geometry Tag y2008d
   CaveGeo   PipeGeo   UpstGeo   SconGeo  tpcegeo3
  FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo7  VpddGeo2
  CalbGeo2  EcalGeo6   BbcmGeo  FpdmGeo3   ZcalGeo
   MagpGeo  MutdGeo3


Geometry Tag y2008e
   CaveGeo   PipeGeo   UpstGeo   SconGeo  tpcegeo3
  FtpcGeo1  SupoGeo1   FtroGeo  BtofGeo7  VpddGeo2
  CalbGeo2  EcalGeo6   BbcmGeo  FpdmGeo3   ZcalGeo
   MagpGeo  MutdGeo3


Geometry Tag y2009
   EcalGeo   PipeGeo  FpdmGeo3  BtofGeo6  VpddGeo2
  FtpcGeo1   FtroGeo  MutdGeo3 TpceGeo3a  CalbGeo2
   SconGeo   UpstGeo   ZcalGeo   CaveGeo   MagpGeo
   BbcmGeo


Geometry Tag y2009a
  EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo6  VpddGeo2
  FtpcGeo1   FtroGeo  MutdGeo3 TpceGeo3a  CalbGeo2
   SconGeo   UpstGeo   ZcalGeo   CaveGeo   MagpGeo
   BbcmGeo


Geometry Tag y2009b
  EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo6  VpddGeo2
  FtpcGeo1   FtroGeo  MutdGeo3 TpceGeo3a  CalbGeo2
   SconGeo   UpstGeo   ZcalGeo   CaveGeo   MagpGeo
   BbcmGeo


Geometry Tag y2009c
  EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
  FtpcGeo1   FtroGeo  MutdGeo3 TpceGeo3a  CalbGeo2
   SconGeo   UpstGeo   ZcalGeo   CaveGeo   MagpGeo
   BbcmGeo


Geometry Tag y2009d
  EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
  FtpcGeo1   FtroGeo  MutdGeo3 TpceGeo3a  CalbGeo2
   SconGeo   UpstGeo   ZcalGeo   CaveGeo   MagpGeo
   BbcmGeo


Geometry Tag y2010
  EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo6  VpddGeo2
  FtpcGeo1   FtroGeo  MutdGeo3 TpceGeo3a  CalbGeo2
   SconGeo   PhmdGeo   UpstGeo   ZcalGeo   CaveGeo
   MagpGeo   BbcmGeo


Geometry Tag y2010a
  EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo6  VpddGeo2
  FtpcGeo1   FtroGeo  MutdGeo3 TpceGeo3a  CalbGeo2
   SconGeo   PhmdGeo   UpstGeo   ZcalGeo   CaveGeo
   MagpGeo   BbcmGeo


Geometry Tag y2010b
  EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
  FtpcGeo1   FtroGeo  MutdGeo3 TpceGeo3a  CalbGeo2
   SconGeo   PhmdGeo   UpstGeo   ZcalGeo   CaveGeo
   MagpGeo   BbcmGeo


Geometry Tag y2010c
  EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
  FtpcGeo1   FtroGeo  MutdGeo3 TpceGeo3a  CalbGeo2
   SconGeo   PhmdGeo   UpstGeo   ZcalGeo   CaveGeo
   MagpGeo   BbcmGeo


Geometry Tag y2011
  EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
  FtpcGeo1   FtroGeo  MutdGeo4 TpceGeo3a  CalbGeo2
   SconGeo   PhmdGeo   UpstGeo   ZcalGeo   CaveGeo
   MagpGeo   BbcmGeo


Geometry Tag y2011a
  EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
  FtpcGeo1   FtroGeo  MutdGeo4 TpceGeo3a  CalbGeo2
   SconGeo   PhmdGeo   UpstGeo   ZcalGeo   CaveGeo
   MagpGeo   BbcmGeo


Geometry Tag y2012
  EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
  MutdGeo4 TpceGeo3a  CalbGeo2   UpstGeo   ZcalGeo
   CaveGeo   MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1


Geometry Tag y2012a
  EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
  MutdGeo4 TpceGeo3a  CalbGeo2   UpstGeo   ZcalGeo
   CaveGeo   MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1


Geometry Tag y2012b
  EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
  MutdGeo4 TpceGeo3a  CalbGeo2   UpstGeo   ZcalGeo
   CaveGeo   MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1


Geometry Tag y2013
  EcalGeo6  PipeGeo2  FpdmGeo3  BtofGeo8  VpddGeo2
  MutdGeo4 TpceGeo3a  CalbGeo2   UpstGeo   ZcalGeo
   CaveGeo   MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1
  PixlGeo5  PxstGeo1  DtubGeo1


Geometry Tag y2013_1
  EcalGeo6  PipeGeo2  FpdmGeo3  BtofGeo8  VpddGeo2
  MutdGeo4 TpceGeo3a  CalbGeo2   UpstGeo   ZcalGeo
   CaveGeo   MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1
  PixlGeo5  PxstGeo1  DtubGeo1


Geometry Tag y2013_2
  EcalGeo6  PipeGeo2  FpdmGeo3  BtofGeo8  VpddGeo2
  MutdGeo4 TpceGeo3a  CalbGeo2   UpstGeo   ZcalGeo
   CaveGeo   MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1
  PxstGeo1


Geometry Tag y2013_1x
  EcalGeo6  PipeGeo2  FpdmGeo3  BtofGeo8  VpddGeo2
  MutdGeo4 TpceGeo3a  CalbGeo2   UpstGeo   ZcalGeo
  CaveGeo2   MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1
  PixlGeo5  PxstGeo1  DtubGeo1


Geometry Tag y2013x
  EcalGeo6  PipeGeo2  FpdmGeo3  BtofGeo8  VpddGeo2
  MutdGeo4 TpceGeo3a  CalbGeo2   UpstGeo   ZcalGeo
  CaveGeo2   MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1
  PixlGeo5  PxstGeo1  DtubGeo1


Geometry Tag y2013_2x
  EcalGeo6  PipeGeo2  FpdmGeo3  BtofGeo8  VpddGeo2
  MutdGeo4 TpceGeo3a  CalbGeo2   UpstGeo   ZcalGeo
  CaveGeo2   MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1
  PxstGeo1


Geometry Tag dev14
  EcalGeo6  PipeGeo1  FpdmGeo3  BtofGeo7  VpddGeo2
  MutdGeo4 TpceGeo3a  CalbGeo2   CaveGeo   BbcmGeo
  SisdGeo7  FgtdGeo3  IdsmGeo1  PixlGeo4  IstdGeo0
  PxstGeo1


Geometry Tag complete
  EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
  MutdGeo4 TpceGeo3a  CalbGeo2   PhmdGeo   UpstGeo
   ZcalGeo   CaveGeo   MagpGeo   BbcmGeo  FgtdGeo3
  IdsmGeo1


Geometry Tag devT
  EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
  MutdGeo4  CalbGeo2   UpstGeo   ZcalGeo   CaveGeo
   MagpGeo   BbcmGeo  FgtdGeo3  IdsmGeo1   FsceGeo
   EiddGeo  TpcxGeo1


Geometry Tag eStar2
  EcalGeo6   PipeGeo  FpdmGeo3  BtofGeo7  VpddGeo2
  MutdGeo4  CalbGeo2   UpstGeo   ZcalGeo   CaveGeo
   MagpGeo   BbcmGeo  FgtdGeoV  IdsmGeo1   FsceGeo
   EiddGeo  TpcxGeo2

Material Budget Y2013

Material budget in the Y2013 (X) geometry.  The top left plot shows the number of radiation lengths encountered by a straight track at the given eta, phi.  The top right (bottom left) compares the ROOT and STARSIM geometries generated by AgML plotted vs phi (eta); these are averaged over the other variable.  ROOT geometry in black, STARSIM in red.  The bottom right shows the difference (ROOT - STARSIM) vs phi and eta.  Less than 0.01 radiation lengths difference is found integrated over the entire cave.

Attached are material budget plots and differences for major subsystems.  Each PDF contains the material budget plots displaying number of radiation lengths averaged over all phi for the ROOT (left) and STARSIM (right) geometries created by AgML. The material difference plot is as described above.

Miscellaneous production scripts

This page has been created to systematize the various scripts currently used in Monte Carlo production and testing.  The contents will be updated as needed; however, the codes are presumed to be correct and working at any given time.

Jobs catalog

When running on rcas, we typically use a legacy csh script named "alljobs". It parses the job configuration file named "catalog" and dispatches a single job on the target node, which can be an interactive node if run interactively, or a batch node if submitted to a queue. The alljobs script expects to see the following directory structure: a writeable directory with the name of the dataset being produced, and directly under it, a writeable "log" directory, in which it would deposit the so-called token files, which serve two purposes:

  • help in sequential numbering of the output files
  • in case of multiple input files (such as Hijing event files), allow mapping of N files to M jobs, thus managing the workload

The catalog file is effectively a table in white-space separated format. Each line begins with the dataset name, which is a three-letter acronym of the site name (thus either rcf or pds) followed by a 4-digit serial number of the set. The alljobs script expects to find a directory named identically to the dataset under the "job directory", which in the current version of the script is hardcoded as /star/simu/simu/gstardata. This, of course, can be improved or changed.

The last field in each entry is used to construct the so-called tag, which plays an important role: it effectively defines the location of the Monte Carlo data in HPSS when the data is sunk there (this is done by a separate script). In addition, it also defines keys for the entries in the FileCatalog (reconstructed data). The alljobs script creates a file of zero length, with a name which is a period-separated catenation of the word "tag" and the contents of the last column in the line.
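Schematically, the expected layout looks like this (rcf0101 is one of the datasets from the catalog below; the exact location of the tag file should be checked in the script itself):

/star/simu/simu/gstardata/      # hardcoded job directory
    rcf0101/                    # writeable dataset directory, named as in the catalog
        log/                    # token files are deposited here
        tag.auau200.nexus.default.b0_3.year_1h.hadronic_on   # zero-length tag file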

Here are the contents of the catalog file as it existed from the late 1990s to the end of 2006.

rcf0101 auau200/nexus/default/central evgen.*.nt auau200.nexus.default.b0_3.year_1h.hadronic_on
rcf0105 auau200/nexus/default/minbias evgen.*.nt auau200.nexus.default.minbias.year_1h.hadronic_on
pds0101 auau200/vni/default/central evgen.*.nt auau200.vni.default.b0_3.year_1h.hadronic_on
pds0102 auau200/vni/default/central evgen.*.nt auau200.vni.default.b0_3.year_1h.hadronic_on
pds0103 auau200/vni/default/central evgen.*.nt auau200.vni.default.b0_3.year_1h.hadronic_on
pds0104 auau200/vni/default/central evgen.*.nt auau200.vni.default.b0_3.year_1h.hadronic_on
rcf0096 auau200/vni/default/central evgen.*.nt auau200.vni.default.b0_3.year_1h.hadronic_on

pds0105 auau200/mevsim/vanilla_trigger/central evgen.*.nt auau200.mevsim.vanilla.trigger.year_1h.hadronic_on
rcf0097 auau200/mevsim/vanilla_resonance/central evgen.*.nt auau200.mevsim.vanilla.resonance.year_1h.hadronic_on
rcf0098 auau200/mevsim/vanilla_trigger/central evgen.*.nt auau200.mevsim.vanilla.trigger.year_1h.hadronic_on
rcf0095 auau200/mevsim/vanilla_flow/central evgen.*.nt auau200.mevsim.vanilla.flow.year_1h.hadronic_on
rcf0099 auau200/mevsim/vanilla_fluct/central evgen.*.nt auau200.mevsim.vanilla.fluct.year_1h.hadronic_on
rcf0102 auau200/mevsim/vanilla/central evgen.*.nt auau200.mevsim.vanilla.central.year_1h.hadronic_on
rcf0103 auau200/mevsim/vanilla/central evgen.*.nt auau200.mevsim.vanilla.central.year_1h.hadronic_on
rcf0104 auau200/mevsim/vanilla_flow/central evgen.*.nt auau200.mevsim.vanilla.flow.year_1h.hadronic_on
rcf0100 auau200/mevsim/cascade/central evgen.*.nt auau200.mevsim.cascade.central.year_1h.hadronic_on

rcf0106 auau200/hbt/default/peripheral evgen.*.nt auau200.hbt.default.peripheral.year_1h.hadronic_on
rcf0107 auau200/hbt/default/midperipheral evgen.*.nt auau200.hbt.default.midperipheral.year_1h.hadronic_on
rcf0108 auau200/hbt/default/middle evgen.*.nt auau200.hbt.default.middle.year_1h.hadronic_on
rcf0109 auau200/hbt/default/midcentral evgen.*.nt auau200.hbt.default.midcentral.year_1h.hadronic_on
rcf0110 auau200/hbt/default/central evgen.*.nt auau200.hbt.default.central.year_1h.hadronic_on

rcf0111 none hijing.*.xdf auau200.hijing.b0_3_jetq_on.jet05.year_1h.hadronic_on
rcf0112 none hijing.*.xdf auau200.hijing.b0_3_jetq_off.jet05.year_1h.hadronic_on
rcf0113 none hijing.*.xdf auau200.hijing.b8_15_jetq_on.jet05.year_1h.hadronic_on
rcf0114 none hijing.*.xdf auau200.hijing.b8_15_jetq_off.jet05.year_1h.hadronic_on
rcf0115 none hijing.*.xdf auau200.hijing.b0_3_jetq_on.jet05.year_1h.hadronic_on
rcf0116 none hijing.*.xdf auau200.hijing.b0_3_jetq_off.jet05.year_1h.hadronic_on
rcf0117 none hijing.*.xdf auau200.hijing.b8_15_jetq_on.jet05.year_1h.hadronic_on
rcf0118 none hijing.*.xdf auau200.hijing.b8_15_jetq_off.jet05.year_1h.hadronic_on
rcf0119 none hijing.*.xdf pau200_hijing_b0_7_jet15_year_1h.hadronic_on
rcf0120 none hijing.*.xdf pau200_hijing_b0_7_gam15_year_1h_hadronic_on

rcf0121 pec/dtunuc dtu*.xdr auau200.dtunuc.two_photon.none.year_1h.hadronic_on
rcf0122 pec/starlight starlight_vmeson_*.t auau200.starlight.vmeson.none.year_1h.hadronic_on
rcf0123 pec/starlight starlight_2gamma_*.nt auau200.starlight.2gamma.none.year_1h.hadronic_on
rcf0124 pec/hemicosm events.txt auau200.hemicosm.default.none.year_1h.hadronic_on

rcf0125 pec/dtunuc dtu*.xdr auau200.dtunuc.two_photon.halffield.year_1h.hadronic_on
rcf0126 pec/starlight starlight_vmeson_*.t auau200.starlight.vmeson.halffield.year_1h.hadronic_on
rcf0127 pec/starlight starlight_2gamma_*.t auau200.starlight.2gamma.halffield.year_1h.hadronic_on

rcf0131 pec/beamgas venus.h.*.nt auau200.hijing.beamgas.hydrogen.year_1h.hadronic_on
rcf0132 pec/beamgas venus.n.*.nt auau200.hijing.beamgas.nitrogen.year_1h.hadronic_on

rcf0139 none hijev.inp auau128.hijing.b0_12.halffield.year_1e.hadronic_on
rcf0140 none hijev.inp auau128.hijing.b0_3.halffield.year_1e.hadronic_on

rcf0141 auau200/strongcp/broken/eb_400_90 evgen.*.nt auau200.strongcp.broken.eb-400_90.year_1h.hadronic_on
rcf0142 auau200/strongcp/broken/eb_400_00 evgen.*.nt auau200.strongcp.broken.eb-400_00.year_1h.hadronic_on
rcf0143 auau200/strongcp/broken/lr_eb_400_90 evgen.*.nt auau200.strongcp.broken.lr_eb_400_90.year_1h.hadronic_on

rcf0145 none hijev.inp auau130.hijing.b0_3.jet05.year_1h.halffield.hadronic_on
rcf0146 none hijev.inp auau130.hijing.b0_15.default.year_1h.halffield.hadronic_on
rcf0147 none hijev.inp auau130.hijing.b0_3.default.year_1e.halffield.hadronic_on
rcf0148 none hijev.inp auau130.hijing.b3_6.default.year_1e.halffield.hadronic_on

rcf0151 auau130/mevsim/vanilla_flow/central evgen.*.nt auau130.mevsim.vanilla_flow.central.year_1e.hadronic_on
rcf0152 auau130/mevsim/vanilla_trigger/central evgen.*.nt auau130.mevsim.vanilla_trigger.central.year_1e.hadronic_on
rcf0153 auau130/mevsim/vanilla_dynamic/central evgen.*.nt auau130.mevsim.vanilla_dynamic.central.year_1e.hadronic_on
rcf0154 auau130/mevsim/vanilla_omega/central evgen.*.nt auau130.mevsim.vanilla_omega.central.year_1e.hadronic_on
rcf0155 auau130/mevsim/vanilla/central evgen.*.nt auau130.mevsim.vanilla.central.year_1e.hadronic_on
rcf0156 auau130/nexus/default/central evgen.*.nt auau130.nexus.default.b0_3.year_1e.hadronic_on

rcf0159 rqmd auau_b0-14.*.cwn auau200.rqmd.default.b0_14.year_1h.hadronic_on
rcf0160 rqmd auau_b0-15.*.cwn auau200.rqmd.default.b0_15.year_1h.hadronic_on

rcf0161 auau130/mevsim/vanilla_flow/central evgen.*.nt auau130.mevsim.vanilla_flow.central.year_1h.hadronic_on
rcf0162 auau130/mevsim/vanilla_trigger/central evgen.*.nt auau130.mevsim.vanilla_trigger.central.year_1h.hadronic_on
rcf0163 auau130/mevsim/vanilla_dynamic/central evgen.*.nt auau130.mevsim.vanilla_dynamic.central.year_1h.hadronic_on
rcf0164 auau130/mevsim/vanilla_omega/central evgen.*.nt auau130.mevsim.vanilla_omega.central.year_1h.hadronic_on
rcf0165 auau130/mevsim/vanilla/central evgen.*.nt auau130.mevsim.vanilla.central.year_1h.hadronic_on
rcf0166 auau130/mevsim/vanilla_resonance/central evgen.*.nt auau130.mevsim.vanilla_resonance.central.year_1h.hadronic_on
pds0167 auau130/mevsim/vanilla_cocktail/central evgen.*.nt auau130.mevsim.vanilla_cocktail.central.year_1h.hadronic_on
rcf0168 auau130/mevsim/vanilla_flow/mbias evgen.*.nt auau130.mevsim.vanilla_flow.minbias.year_1h.hadronic_on
rcf0169 auau130/mevsim/vanilla_flowb/central evgen.*.nt auau130.mevsim.vanilla_flowb.central.year_1h.hadronic_on

rcf0171 auau130/mevsim/vanilla_lambda_antilambda/central evgen.*.nt auau130.mevsim.vanilla_both_lambda.central.year_1h.hadronic_on
rcf0172 auau130/mevsim/vanilla_lambda/central evgen.*.nt auau130.mevsim.vanilla_lambda.central.year_1h.hadronic_on
rcf0173 auau130/mevsim/vanilla_antilambda/central evgen.*.nt auau130.mevsim.vanilla_antilambda.central.year_1h.hadronic_on

rcf0181 auau200/mevsim/mdc4/central evgen.*.nt auau200.mevsim.mdc4_cocktail.central.year2001.hadronic_on
pds0182 auau200/mevsim/mdc4/central evgen.*.nt auau200.mevsim.mdc4_cocktail.central.year2001.hadronic_on

rcf0183 none hijev.inp auau200.hijing.b0_20.standard.year2001.hadronic_on
rcf0184 none hijev.inp auau200.hijing.b0_3.standard.year2001.hadronic_on

rcf0190 auau200/mevsim/mdc4_electrons evgen.*.nt auau200.mevsim.mdc4_electrons.year2001.hadronic_on

rcf0191 none hijev.inp auau200.hijing.b0_20.inverse.year2001.hadronic_on
rcf0192 none hijev.inp auau200.hijing.b0_3.inverse.year2001.hadronic_on
rcf0193 none hijev.inp dau200.hijing.b0_20.standard.year_2a.hadronic_on

# Maxim has arrived:
# the following two runs had the 1 6 setting for the hard scattering and energy
rcf0194 none hijev.inp dau200.hijing.b0_20.jet06.year2003.hadronic_on
pds0195 none hijev.inp dau200.hijing.b0_20.jet06.year2003.hadronic_on
# this one had 1 3
rcf0196 none hijev.inp dau200.hijing.b0_20.jet03.year2003.hadronic_on
# standard 0 2 setting
rcf0197 none hijev.inp dau200.hijing.b0_20.jet02.year2003.hadronic_on
# new numbering
rcf1197 none hijev.inp dau200.hijing.b0_20.minbias.year2003.hadronic_on
rcf1198 dau200/hijing_382/b0_20/minbias evgen.*.nt dau200.hijing_382.b0_20.minbias.year2003.gheisha_on
# dedicated wide Z run
rcf1199 dau200/hijing_382/b0_20/minbias evgen.*.nt dau200.hijing_382.b0_20.minbias_wideZ.year2003.hadronic_on
# Pythia
rcf1200 none pyth.dat pp200.pythia6_203.default.minbias.year2003.hadronic_on
# Heavy flavor embedding with full calorimeter
rcf1201 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.y2003x.gheisha_on
# Pythia hi Pt>5
rcf1202 none pyth.dat pp200.pythia6_203.default.pt5.year2003.gheisha_on
# Mevsim fitted to 200GeV AuAu
rcf1203 auau200/mevsim/v2/central_6 evgen.*.nt auau200.mevsim.v2.b0_6.year_1e.gheisha_on
# Mevsim fitted to 200GeV AuAu, different geo
rcf1204 auau200/mevsim/v2/central_6 evgen.*.nt auau200.mevsim.v2.b0_6.year2001.gheisha_on
# Pythia hi Pt>15
rcf1205 none pyth.dat pp200.pythia6_203.default.pt15.year2003.gheisha_on
# Starsim maiden voyage, with y2004, 62.4 GeV
rcf1206 auau62/hijing_382/b0_20/minbias evgen.*.nt auau62.hijing_382.b0_20.minbias.y2004.gheisha_on
# 62.4 GeV central
rcf1207 auau62/hijing_382/b0_3/central evgen.*.nt auau62.hijing_382.b0_3.central.y2004a.gheisha_on
pds1207 auau62/hijing_382/b0_3/central evgen.*.nt auau62.hijing_382.b0_3.central.y2004a.gheisha_on
# 200 GeV minbias
rcf1208 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.y2004a.gheisha_on
# 200 GeV central
rcf1209 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.y2004a.gheisha_on
# Pythia
rcf1210 none pyth.dat pp200.pythia6_203.default.minbias.y2004a.gheisha_on
# Pythia Spin group
rcf1211 none pyth.dat pp200.pythia6_203.default.minbias.y2004x.gheisha_on
# Pythia Spin group
pds1212 none pyth.dat pp200.pythia6_203.default.pt3.y2004x.gheisha_on
# Pythia Spin group
rcf1213 none pyth.dat pp200.pythia6_205.default.pt7.y2004x.gheisha_on
# Pythia Spin group
pds1214 none pyth.dat pp200.pythia6_203.default.pt15.y2004x.gheisha_on
# 200 GeV minbias special D decays
rcf1215 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.speciald.y2004a.gheisha_on
# 200 GeV minbias copper
rcf1216 cucu200/hijing_382/b0_14/minbias evgen.*.nt cucu200.hijing_382.b0_14.minbias.y2005x.gheisha_on
# 200 GeV minbias copper test
rcf1217 cucu200/hijing_382/b0_14/minbias evgen.*.nt cucu200.hijing_382.b0_14.minbias.y2004a.gheisha_on
# 200 GeV central reprise of 1209, smaller diamond
rcf1218 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.y2004a.gheisha_on
# Pythia Special 1
rcf1219 none pyth.dat pp200.pythia6_203.default.special1.y2004c.gheisha_on
# Pythia Special 2 (CDF A)
rcf1220 none pyth.dat pp200.pythia6_203.default.special2.y2004c.gheisha_on
# Pythia Special 3
rcf1221 none pyth.dat pp200.pythia6_203.default.special3.y2004c.gheisha_on
# Pythia Special 2 4<Pt<5 Gheisha
rcf1222 none pyth.dat pp200.pythia6_203.default.special2.y2004y.gheisha_on
# Pythia Special 2 4<Pt<5 GCALOR
rcf1223 none pyth.dat pp200.pythia6_203.default.special2.y2004y.gcalor_on
# Pythia Special 2 (CDF A) 5-7 GeV 6/28/05
rcf1224 none pyth.dat pp200.pythia6_205.5_7gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) 7-9 GeV 6/28/05
rcf1225 none pyth.dat pp200.pythia6_205.7_9gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) 9-11 GeV 6/28/05
rcf1226 none pyth.dat pp200.pythia6_205.9_11gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) 11-15 GeV 6/28/05
rcf1227 none pyth.dat pp200.pythia6_205.11_15gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) 15-25 GeV 6/29/05
rcf1228 none pyth.dat pp200.pythia6_205.15_25gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) 25-35 GeV 6/29/05
rcf1229 none pyth.dat pp200.pythia6_205.25_35gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) > 35 GeV 6/29/05
rcf1230 none pyth.dat pp200.pythia6_205.above_35gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) 2-3 GeV 6/29/05
rcf1231 none pyth.dat pp200.pythia6_205.2_3gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) 3-4 GeV 6/30/05
rcf1232 none pyth.dat pp200.pythia6_205.3_4gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) 4-5 GeV 6/30/05
rcf1233 none pyth.dat pp200.pythia6_205.4_5gev.cdf_a.y2004y.gheisha_on
# Pythia Special 6/30/05
rcf1234 none pyth.dat pp200.pythia6_205.low_energy.cdf_a.y2004y.gheisha_on
# Pythia min bias 9/06/05
rcf1235 none pyth.dat pp200.pythia6_205.min_bias.cdf_a.y2004y.gheisha_on
# Herwig 5-7 GeV 9/07/05
rcf1236 pp200/herwig6507/pt_5_7 evgen.*.nt pp200.herwig6507.5_7gev.special1.y2004y.gheisha_on
# 62.4 GeV minbias copper
rcf1237 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.minbias.y2005c.gheisha_on
# 100 pi0 per event SVT in
rcf1238 none run1238.kumac pi0.100per_event.200mev_15gev.svtt_on.y2005x.gheisha_on
# 100 pi0 per event SVT out
rcf1239 none run1239.kumac pi0.100per_event.200mev_15gev.svtt_off.y2005x.gheisha_on
# 10 J/psi per event SVT in
rcf1240 none run1240.kumac jpsi.10per_event.500mev_3gev.svtt_on.y2005x.gheisha_on
# 10 J/psi per event SVT out
rcf1241 none run1241.kumac jpsi.10per_event.500mev_3gev.svtt_off.y2005x.gheisha_on
# 62.4 GeV minbias copper low EM cut
rcf1242 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.low_em.y2005c.gheisha_on
# 62.4 GeV minbias copper low EM and keep tracks
rcf1243 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.low_em_keep.y2005c.gheisha_on
# Herwig 9-11 GeV 10/13/05
rcf1244 pp200/herwig6507/pt_9_11 evgen.*.nt pp200.herwig6507.9_11gev.special1.y2004y.gheisha_on
# Herwig 11-15 GeV 10/13/05
rcf1245 pp200/herwig6507/pt_11_15 evgen.*.nt pp200.herwig6507.11_15gev.special1.y2004y.gheisha_on
# Herwig 15-25 GeV 10/13/05
rcf1246 pp200/herwig6507/pt_15_25 evgen.*.nt pp200.herwig6507.15_25gev.special1.y2004y.gheisha_on
# Herwig 25-35 GeV 10/13/05
rcf1247 pp200/herwig6507/pt_25_35 evgen.*.nt pp200.herwig6507.25_35gev.special1.y2004y.gheisha_on
# Herwig 35-45 GeV 10/13/05
rcf1248 pp200/herwig6507/pt_35_45 evgen.*.nt pp200.herwig6507.35_45gev.special1.y2004y.gheisha_on
# 200 GeV minbias
rcf1249 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.y2005d.gheisha_on
#
# New Herwig Wave
#
# Herwig 9-11 GeV new header 12/14/05
rcf1250 pp200/herwig6507/pt_9_11 evgen.*.nt pp200.herwig6507.9_11gev.special3.y2004y.gheisha_on
# Herwig 11-15 GeV new header 11/10/05
rcf1251 pp200/herwig6507/pt_11_15 evgen.*.nt pp200.herwig6507.11_15gev.special3.y2004y.gheisha_on
# Herwig 15-25 GeV new header 12/19/05
rcf1252 pp200/herwig6507/pt_15_25 evgen.*.nt pp200.herwig6507.15_25gev.special3.y2004y.gheisha_on
# Herwig 25-35 GeV new header 12/19/05
rcf1253 pp200/herwig6507/pt_25_35 evgen.*.nt pp200.herwig6507.25_35gev.special3.y2004y.gheisha_on
# Herwig 35-100 GeV new header 12/19/05
rcf1254 pp200/herwig6507/pt_35_100 evgen.*.nt pp200.herwig6507.35_100gev.special3.y2004y.gheisha_on
# Herwig 2-3 GeV new header 12/14/05
rcf1255 pp200/herwig6507/pt_2_3 evgen.*.nt pp200.herwig6507.2_3gev.special3.y2004y.gheisha_on
# Herwig 3-4 GeV new header 12/14/05
rcf1256 pp200/herwig6507/pt_3_4 evgen.*.nt pp200.herwig6507.3_4gev.special3.y2004y.gheisha_on
# Herwig 4-5 GeV new header 12/21/05
rcf1257 pp200/herwig6507/pt_4_5 evgen.*.nt pp200.herwig6507.4_5gev.special3.y2004y.gheisha_on
# Herwig 5-7 GeV new header 12/21/05
rcf1258 pp200/herwig6507/pt_5_7 evgen.*.nt pp200.herwig6507.5_7gev.special3.y2004y.gheisha_on
# Herwig 7-9 GeV new header 12/21/05
rcf1259 pp200/herwig6507/pt_7_9 evgen.*.nt pp200.herwig6507.7_9gev.special3.y2004y.gheisha_on
#
# Heavy flavor embedding with full calorimeter
rcf1260 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.y2004y.gheisha_on
# 200 GeV minbias copper
rcf1261 cucu200/hijing_382/b0_14/minbias evgen.*.nt cucu200.hijing_382.b0_14.minbias.y2006.gheisha_on
# 62.4 GeV minbias copper
rcf1262 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.minbias.y2006.gheisha_on
#
#

# Specialized tracking studies
#
# 62.4 GeV minbias copper low EM and keep tracks
rcf1263 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.low_em_keep.y2005d.gheisha_on
# Same as prev, distortion
rcf1264 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.distort.y2005d.gheisha_on
# Same as prev, distortion with clams
rcf1265 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.clamdist.y2005d.gheisha_on
# Same as prev, clams and two ladders offset
rcf1266 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.clamlad.y2005d.gheisha_on
# Individual ladder offsets
rcf1267 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.indilad.dev2005.gheisha_on
# Global ladder tilts
rcf1268 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.ladtilt.dev2005.gheisha_on
# Individual ladder tilts
rcf1269 cucu62/hijing_382/b0_14/minbias evgen.*.nt cucu62.hijing_382.b0_14.indtilt.dev2005.gheisha_on
#
#
# Spin PWG requests:
# Pythia Special 2 (CDF A) 45-55 GeV 5/09/06
rcf1270 none pyth.dat pp200.pythia6_205.45_55gev.cdf_a.y2004y.gheisha_on
# Pythia Special 2 (CDF A) 55-65 GeV 5/10/06
rcf1271 none pyth.dat pp200.pythia6_205.55_65gev.cdf_a.y2004y.gheisha_on
#
rcf1272 cucu200/hijing_382/b0_14/minbias evgen.*.nt cucu200.hijing_382.b0_14.D0minbias.y2006.gheisha_on
#
# Pythia Special 2 (CDF A) 0-2 GeV 7/20/06
rcf1273 none pyth.dat pp200.pythia6_205.0_2gev.cdf_a.y2004y.gheisha_on
# UPGR02 eta+-1.5
rcf1274 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr02.gheisha_on
#
# Pythia min bias 7/27/06
rcf1275 none pyth.dat pp200.pythia6_205.minbias.cdf_a.y2006.gheisha_on
#
# UPGR05
rcf1276 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr05.gheisha_on
#
# UPGR05 wide diamond (60,300)
rcf1277 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.wide.upgr05.gheisha_on
# UPGR07 wide diamond (60,300)
rcf1278 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.wide.upgr07.gheisha_on
# UPGR07
rcf1279 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr07.gheisha_on
# UPGR01
rcf1280 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr01.gheisha_on
# UPGR08
rcf1281 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr08.gheisha_on
# UPGR06
rcf1282 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr06.gheisha_on
# UPGR09
rcf1283 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr09.gheisha_on
# UPGR09 central
rcf1284 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.upgr09.gheisha_on
# UPGR10
rcf1285 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr10.gheisha_on
# UPGR10 central
rcf1286 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.upgr10.gheisha_on
# UPGR11
rcf1287 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr11.gheisha_on
# UPGR11 central
rcf1288 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.upgr11.gheisha_on
# UPGR06 central
rcf1289 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.upgr06.gheisha_on
# UPGR07
rcf1290 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.upgr07.gheisha_on

Here is the actual version of the file used in the 2007 runs:

e w en b jq geom
# UPGR07 01/17/2007 ISUB = 11, 12, 13, 28, 53, 68
rcf1291 none pyth.dat pp200.pythia6_205.special.diamond10.upgr07.gheisha_on
# UPGR07 01/17/2007 ISUB = 11, 12, 13, 28, 53, 68
rcf1292 none pyth.dat pp500.pythia6_205.special.diamond10.upgr07.gheisha_on
# UPGR07 01/17/2007 ISUB = 11, 12, 13, 28, 53, 68
rcf1293 none pyth.dat pp200.pythia6_205.special.diamond30.upgr07.gheisha_on
# UPGR07 01/17/2007 ISUB = 11, 12, 13, 28, 53, 68
rcf1294 none pyth.dat pp500.pythia6_205.special.diamond30.upgr07.gheisha_on
# Min bias gold, pilot run for 2007
rcf1295 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.y2007.gheisha_on
# Central auau200 + B-mixing Central auau200 + Upsilon (S1,S2,S3) mixing
rcf1296 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.y2007.gheisha_on
# Minbias for TUP (wide vertex)
rcf1297 auau200/hijing_382/b0_20/minbias evgen.*.nt auau200.hijing_382.b0_20.minbias.upgr13.gheisha_on
#
#
# Central auau200 + D0-mixing, UPGR13
rcf1298 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.upgr13.gheisha_on
# Min bias Pythia
rcf1299 none pyth.dat pp200.pythia6_205.minbias.cdf_a.y2005.gheisha_on
# Pythia, UPGR13
rcf1300 none pyth.dat pp200.pythia6_205.charm.cdf_a.upgr13.gheisha_on
# Pythia wide diamond
rcf1301 none pyth.dat pp200.pythia6_205.minbias.wide.upgr13.gheisha_on
# Pythia
rcf1302 none pyth.dat pp200.pythia6_410.45_55gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1303 none pyth.dat pp200.pythia6_410.35_45gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1304 none pyth.dat pp200.pythia6_410.55_65gev.cdf_a.y2006c.gheisha_on
# Placeholder XXXXXXXXXXX
rcf1305 auau200/hijing_382/b0_3/central evgen.*.nt auau200.hijing_382.b0_3.central.y2007.gheisha_on
# Pythia
rcf1306 none pyth.dat pp200.pythia6_410.25_35gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1307 none pyth.dat pp200.pythia6_410.15_25gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1308 none pyth.dat pp200.pythia6_410.11_15gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1309 none pyth.dat pp200.pythia6_410.9_11gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1310 none pyth.dat pp200.pythia6_410.7_9gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1311 none pyth.dat pp200.pythia6_410.5_7gev.cdf_a.y2006c.gheisha_on
# Pythia CKIN(3)=7, CKIN(4)=9, CKIN(7)=0.0, CKIN(8)=1.0, CKIN(27)=-0.4, CKIN(28)=0.4
rcf1312 none pyth.dat pp200.pythia6_410.7_9gev.bin1.y2004y.gheisha_on
# Pythia CKIN(3)=9, CKIN(4)=11, CKIN(7)=-0.4, CKIN(8)=1.4, CKIN(27)=-0.5, CKIN(28)=0.6
rcf1313 none pyth.dat pp200.pythia6_410.9_11gev.bin2.y2004y.gheisha_on
# Pythia CKIN(3)=11, CKIN(4)=15, CKIN(7)=-0.2, CKIN(8)=1.2, CKIN(27)=-0.6, CKIN(28)=-0.3
rcf1314 none pyth.dat pp200.pythia6_410.11_15gev.bin3.y2004y.gheisha_on
# Pythia CKIN(3)=11, CKIN(4)=15, CKIN(7)=-0.5, CKIN(8)=1.5, CKIN(27)=-0.3, CKIN(28)=0.4
rcf1315 none pyth.dat pp200.pythia6_410.11_15gev.bin4.y2004y.gheisha_on
# Pythia CKIN(3)=11, CKIN(4)=15, CKIN(7)=0.0, CKIN(8)=1.0, CKIN(27)=0.4, CKIN(28)=0.7
rcf1316 none pyth.dat pp200.pythia6_410.11_15gev.bin5.y2004y.gheisha_on
# Pythia
rcf1317 none pyth.dat pp200.pythia6_410.4_5gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1318 none pyth.dat pp200.pythia6_410.3_4gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1319 none pyth.dat pp200.pythia6_410.minbias.cdf_a.y2006c.gheisha_on
# Pythia
rcf1320 none pyth.dat pp62.pythia6_410.4_5gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1321 none pyth.dat pp62.pythia6_410.3_4gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1322 none pyth.dat pp62.pythia6_410.5_7gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1323 none pyth.dat pp62.pythia6_410.7_9gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1324 none pyth.dat pp62.pythia6_410.9_11gev.cdf_a.y2006c.gheisha_on
# Pythia
rcf1325 none pyth.dat pp62.pythia6_410.11_15gev.cdf_a.y2006c.gheisha_on
# Pythia Special 2 (CDF A) 2-3 GeV 6/29/05
pds1231 none pyth.dat pp200.pythia6_205.2_3gev.cdf_a.y2004y.gheisha_on

When submitting jobs on the Grid, most of the functionality in alljobs is redundant. The simplified scripts can be found in the "Grid-Friendly" section of this site.

Merger/filtering script

Typically, a Starsim run results in an output file, or a series of files, with names like gstar.1.fz, gstar.2.fz, etc. Regardless of whether we run locally or on the Grid, there is a small chance that the file(s) will be truncated. To guard against feeding incorrect data to the reconstruction stage, and/or to perform a split or merger of a few files, a KUMAC script has been developed. It will, among other things, discard incomplete events and produce serially numbered files with names like rcf1319_01_100evts.fzd, which encode the name of the dataset, the serial number of the file (distinct from the numbering of the input files), and the number of events contained therein, all of which is helpful in setting up or debugging the production. It has recently been simplified (although it is still not easily readable) and wrapped into a utility shell script which does preparation work as well as cleanup. The resulting script, named "filter.tcsh", takes a single argument, which is assumed to be the name of the dataset (and which is then used in naming the output files).

#! /usr/local/bin/tcsh -f
#
# remove the old list of files
if( -e process.list ) then
rm process.list
endif
#
if( -e filter.kumac ) then
rm filter.kumac
endif
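# build a numerically sorted list of the input run numbers from the gstar.*.fz file names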
ls gstar.*.fz | sed -e 's/[gstar.|.fz]//g' | sort -n > process.list
#
# clean the trash bin before the next run, re-create
rm -fr trash
mkdir trash
echo `du --block-size=1000K -s | cut -f1` MB in the current directory
echo `df --block-size=1000K . | tail -1 | sed -e 's/\ *[0-9]*\ *[0-9]*\ *//' | sed -e 's/\ .*//g'` MB available on disk
cat<<EOF>>filter.kumac
macro filter name
input='gstar'
mess Start with filenames [input].*.fz, converting to [name]
ag/version batch
option stat
option date
option nbox
filecase keep
pwd =\$shell('pwd');
nfiles=\$shell('cat process.list | wc -l | sed -e "s/[\ ]*//g"');

message Starting to process [nfiles]
* trace on
ve/cr runs([nfiles]) I
ve/read runs process.list
ve/pri runs

if (\$Len([name]).eq.0) then
message cannot define current directory in [pwd]
exit
endif
namz=[name]
out =\$env('OUTDIR')
if ([out].ne.'') then
namz = [out]/[name]/[name]
endif

lenb = 1000
message reading
ve/cr id(3) I
* ve/read id N
message reading complete
nt=[nfiles] | total number of files to process
n1=runs(1) | first input file
n2=runs([nfiles]) | last input file
mm = 0 | number of output files
nn = 0 | number of processed files
cnt = 0 | total number of events in this job
cno = 0 | number of events when output has been opened
nev = 0 | number of events in this output
ii = 0 | input active flag
io = 0 | output active flag
len0= 1200 | minimum output file len
len1= [len0]+200 | average output file len - stop at end-of-file
len2= [len1]+200 | maximum output file len - stop always
ni = [n1] | first input file
no = 0 | skip up to this file
nd = [n1] | file to delete
ntrig = 10
*
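* resume from a breakpoint if the token file "nn" exists (left by a previous pass)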
if (\$fexist(nn).gt.0) then
ve/read id nn
na=id(1); message [na] input files already done
no=id(2); message first input files up to gstar.[no]
mm=id(3); message first output files up to [name].[mm]
mm=[mm]-1;
endif
*
hist = [name].his
if (\$fexist([hist]).gt.0) then
shell mv [hist] old.his
* call HRGET(0,\$quote([hist]),' ')
endif
ghist [hist]
cdir //pawc
mdir cont
if (\$fexist(old.his).gt.0) then
call HRGET(0,\$quote(old.his),' ')
endif

gfile p gstar.[n1].fz
mode control prin 1 hist 0 | simu 2
gexec ../.lib/control.sl
gexec ../.lib/index.sl

message loaded libs

title=merging runs [n1]-[n2] in [name]
fort/file 66 [name].ps; meta 66 -111
next; dcut cave x .1 10 10 .03 .03
Set DMOD 1; Igset TXFP -60; Igset CHHE .35
ITX 5 19.5 \$quote([title])
ITX .5 .1 \$quote([pwd])
*
* do ni = [ni],[n2]
frst=1
ag/version interactive
do iev=1,1000000000000
* new input file ?
if ([ii].eq.0) then
do nfoo=[frst],[nfiles]
ni = runs([nfoo])

file = [input].[ni].fz
filz = [input].[ni].fz.gz
hist = [input].[ni].his
message processing index [nfoo] out of [nfiles]
ve/print runs([nfoo])
*
if (\$fexist([file]).gt.0) then
message loop with [file]
gfile p [file]
if (\$iquest(1).eq.0) then
ii = 1
nn = [nn]+1
if (\$fexist([hist]).gt.0) then
if (\$hexist(-1).eq.0) then
call HRGET(0,\$quote([hist]),' ')
else
call HRGET(0,\$quote([hist]),'A')
endif
endif
call indmes(\$quote([file]))
goto nextf
* iquest:
endif
* fexist:
endif
enddo
goto nexto
endif

nextf:
* new output file ?
if ([io].eq.0) then
mm = [mm]+1
if ([mm].lt.10) then
output=[namz]_0[mm]
else
output=[namz]_[mm]
endif
io = 1
cno = [cnt]
gfile o [output].fzt
iname = [name]_[mm].fzt
call indmes(\$quote([iname]))
endif

* processing next event
call rzcdir('//SLUGRZ',' ')
trig [ntrig]
evt = \$iquest(99)

if (\$iquest(1).ne.0) then
ni = [ni]+1
frst=[frst]+1
ii = 0
endif
if ([ii].eq.0) goto nexto
* get output file length in MB:
cmd = ls -s [output].fzt
len = \$word(\$shell([cmd]))
len = [len]/[lenb]
* mess wrquest len=[len] ii=[ii] evt=[evt]
if ([len].lt.[len0]) goto nextev
if ([len].lt.[len1] .and. [ii].gt.0) goto nextev
if ([len].lt.[len2] .and. [ii].gt.0 .and. [evt].eq.0) goto nextev
* output file done
nexto:
cnt = \$iquest(100)
if ([cnt]<0) then
cnt = 0
endif
nev = [cnt]-[cno]
io = 0
*
if ([nev].gt.0) then
if ([nev].lt.199999) then
* terminate last event, clear memory
call guout
call gtrigc
gfile o
* rename temp file into the final one:
cmv = mv [output].fzt [output]_[nev]evts.fzd
i = \$shell([cmv])
endif
endif
message files inp = [ni] out = [mm] cnt = [cnt] done
*
if ([ii].eq.0) then
nj = [ni] - 1 | this file was finished, ni is NEXT to read
mj = [mm] + 1 | this is next to start write after the BP
message writing breakpoint [nn] [ni] [mj]
ve/inp id [nn] [ni] [mj]
ve/write id nn i6
ntrig = 10
************************************
* moving files to TRASH
while ([nd].lt.[ni]) do
filed = [input].[nd].fz
alrun = *.[nd].*
if (\$fexist([filed]).gt.0) then
shell mv [alrun] trash/
endif
nd = [nd] + 1
endwhile
************************************
else
ntrig = [ntrig] + 1
endif
if ([ni].gt.[n2]) goto alldone
nextev:
enddo

* control histogram
alldone:
if ([nn].eq.[nt]) then
shell touch filter.done
endif
cdir //pawc
tit = files [n1] - [n2] in set [name]
title_global \$quote([tit])
next; size 20.5 26; zone 2 4;
hi/pl 11; hi/pl 12; hi/pl 13; hi/pl 14
if (\$hexist(1).gt.1) then
n/pl 1.ntrack; n/pl 1.Nvertx; n/pl 1.NtpcHit; n/pl 1.Ntr10
endif
swn 111 0 20 0 20; selnt 111
ITX 2.0 0.1 \$quote([pwd])
close 66; meta 0
physi
exit
return
EOF
echo ------------------------------------------------------------------
echo Activating starsim for dataset $1
$STAR_BIN/starsim -w 1 -g 40 -b ./filter.kumac $1
# cleanup
rm ZEBRA.O process.list nn index paw.metafile *.his *.ps filter.done filter.kumac
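
For example, assuming the gstar.*.fz files of dataset rcf1319 are present in the current directory, the script would be invoked as ./filter.tcsh rcf1319, producing output files named rcf1319_01_100evts.fzd and so on.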

Running STARSIM within root4star

New event generators are now being implemented in a C++ framework, enabling us to run simulations within the standard STAR, ROOT-based production chain.  Running these generators requires us to migrate away from the familiar starsim interface and begin running simulations in root4star.  Several example macros have been implemented, showing how to run various event generators and how to produce samples of specific particles.  This HOWTO guide illustrates one of these examples.

First, obtain the example macro by checking it out from cvs:
$ cvs co StRoot/StarGenerator/macros
$ cp StRoot/StarGenerator/macros/starsim.kinematics.C .
Running the macro is straightforward.  To generate 100 events, simply do...
$ root4star
root [0] .L starsim.kinematics.C
root [1] int nevents = 100
root [2] starsim(nevents)
This will create an "fzd" file, which can be analyzed with the bfc.C macro as you normally would.

If you're happy with 9 muons per event, thrown with a funky pT and eta distribution and run with the y2012 geometry... then you can use the macro as is.  Otherwise, you'll want to modify things to suit your needs.  Open the macro in your favorite editor (e.g. emacs or vi).  In the "starsim" function, somewhere around line 108, you should see the following lines:
  geometry("y2012");
  command("gkine -4 0");
  command("gfile o pythia6.starsim.fzd");


If you're familiar with the starsim interface, you probably recognize the arguments to the command function.  These are KUIP commands used to steer the starsim application.  You can use the gfile command to set the name of the output file, for example.  The "gkine -4 0" command tells starsim how it should get the particles from the event generator (this shouldn't be changed).  Finally, the geometry function defined in the macro allows you to set the geometry tag you wish to simulate.  It is the equivalent of the "DETP geom" command in starsim, so you may also pass the magnetic field setting, switch hadronic interactions on or off, etc.  Any command which can be executed in starsim can be executed using the "command" function, which enables full control of the physical model, including the ability to print out hits, materials, and so on.
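
As a hedged illustration (the seed values and the output file name below are placeholders), a customized setup might read:

  geometry("y2012");                        // geometry tag, equivalent to "detp geom y2012"
  command("rndm 12 34");                    // seed the random number generator
  command("gkine -4 0");                    // take particles from the event generator (leave as is)
  command("gfile o mymuons.starsim.fzd");   // choose your own output file name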

Let's take a quick look at the "KINE" event generator and how to configure it.  StarKinematics is a special event generator which allows us to inject particles into the simulation on an event-by-event basis during a simulation run.  The "trig" function in this macro loops over a requested number of events and pushes particles.  Let's take a look at this function.
 

void trig( Int_t n=1 )
{
  for ( Int_t i=0; i<n; i++ ) {

    // Clear the chain from the previous event
    chain->Clear();

    // Generate 1 muon in the FGT range
    kinematics->Kine( 1, "mu-", 10.0, 50.0, 1.0, 2.0 );

    // Generate 4 muons flat in pT and eta 
    kinematics->Kine(4, "mu+", 0., 5., -0.8, +0.8 );

    // Generate 4 muons according to a PT and ETA distribution
    kinematics->Dist(4, "mu-", ptDist, etaDist );

    // Generate the event
    chain->Make();

    // Print the event
    primary->event()->Print();
  }
}

The "kinematics" is a pointer to a StarKinematics object. There are three functions of interest to us:

  • Kine -- Throws N particles of specified type flat in pT, eta and phi
  • Dist -- Throws N particles of specified type according to a pT and eta distribution (and optionally phi distribution) defined in a TF1.
  • AddParticle -- Creates a new particle and returns a pointer to it.  You're responsible for setting the identity and kinematics (px, py, pz, etc...) of the particle.
In the example macro, we generate a single muon thrown flat in pT from 10 to 50 GeV and 1 < eta < 2.  We add to that 4 muons thrown flat in 0 < pT < 5 GeV and |eta| < 0.8, and 4 more muons thrown according to pT and eta distributions defined elsewhere in the code.  After calling "Make" on the big full chain, we print out the resulting event.
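
The ptDist and etaDist objects passed to Dist are ordinary ROOT TF1s.  A minimal sketch (the names match the macro, but the functional forms here are purely illustrative):

  TF1 *ptDist  = new TF1("ptDist",  "x*exp(-x/0.3)", 0.0, 10.0);  // falling pT spectrum, 0-10 GeV
  TF1 *etaDist = new TF1("etaDist", "1.0", -0.8, +0.8);           // flat in eta
  kinematics->Dist( 4, "mu-", ptDist, etaDist );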

Example event record --
[   0|   0|  -1] id=         0    Rootino stat=-201 p=(   0.000,   0.000,   0.000,   0.000;  510.000) v=(  0.0000,  0.0000,   0.000) [0 0] [0 0]
[   1|   1|   1] id=        13        mu- stat=01 p=(  36.421,  -7.940,  53.950,  65.576;   0.106) v=(  0.0181, -0.0905,  26.381) [0 0] [0 0]
[   2|   2|   2] id=       -13        mu+ stat=01 p=(  -2.836,   3.258,   0.225,   4.326;   0.106) v=(  0.0181, -0.0905,  26.381) [0 0] [0 0]
[   3|   3|   3] id=       -13        mu+ stat=01 p=(  -1.159,  -4.437,  -2.044,   5.022;   0.106) v=(  0.0181, -0.0905,  26.381) [0 0] [0 0]
[   4|   4|   4] id=       -13        mu+ stat=01 p=(  -0.091,   1.695,  -0.131,   1.706;   0.106) v=(  0.0181, -0.0905,  26.381) [0 0] [0 0]
[   5|   5|   5] id=       -13        mu+ stat=01 p=(   1.844,  -0.444,   0.345,   1.931;   0.106) v=(  0.0181, -0.0905,  26.381) [0 0] [0 0]
[   6|   6|   6] id=        13        mu- stat=01 p=(   4.228,  -4.467,  -3.474,   7.065;   0.106) v=(  0.0181, -0.0905,  26.381) [0 0] [0 0]
[   7|   7|   7] id=        13        mu- stat=01 p=(  -0.432,  -0.657,   0.611,   1.002;   0.106) v=(  0.0181, -0.0905,  26.381) [0 0] [0 0]
[   8|   8|   8] id=        13        mu- stat=01 p=(  -0.633,  -0.295,  -0.017,   0.706;   0.106) v=(  0.0181, -0.0905,  26.381) [0 0] [0 0]
[   9|   9|   9] id=        13        mu- stat=01 p=(   2.767,   0.517,   1.126,   3.034;   0.106) v=(  0.0181, -0.0905,  26.381) [0 0] [0 0]

The printout above illustrates the STAR event record.  Each row denotes a particle in the simulation.  The 0th entry (and any entry with a status code -201) is used to carry summary information about the configuration of the event generator.  Multiple event generators can be run in the same simulation, and a Rootino is introduced into the event record to summarize the configuration of each.  The three columns at the left hold the primary event id, the generator event id, and the idtruth id.  The next column shows the PDG id of the particle, followed by the particle's name.  The particle's status code is next, followed by the 4-momentum and mass, and the particle's start vertex.  Finally, the last four columns denote the primary ids of the 1st and last mother particle and the 1st and last daughter particle.

The STAR event record is saved in a ROOT file at the end of the simulation run, allowing you to read back both particle-wise and event-wise information stored from the event generator and compare with reconstructed events.  Here, the idtruth ID of the particle is useful, as it allows you to compare reconstructed tracks and hits with the particle which generated them.
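
For instance, the record file can be inspected interactively (a minimal sketch; the name of the tree written by the generator framework can be discovered by listing the file contents):

$ root4star
root [0] TFile *f = TFile::Open("starsim.kinematics.root");
root [1] f->ls()   // list the stored tree(s), then Get() and loop as usual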

Simulation HOWTOS

STARSIM is the legacy simulation package in STAR, implemented in FORtran and MORtran and utilizing GEANT3 as the concrete MC package.  For documentation on how to run simulations using STARSIM, see the legacy documentation pages.

The simulation group is evolving the framework towards using Virtual Monte Carlo.  As a first step, we have implemented a new event generator framework which will be compatible with the future VMC application.  The new framework allows us to run jobs within root4star.  In order to run simulations in the new framework, see

  • Running STARSIM within root4star

StMc package


The STAR Simulation Framework

Outline

  1. Introduction
  2. Quick Start
  3. Primary Event Generation
  4. Geometry Description
  5. Particle Transport and Detector Simulation
  6. Digitization / Slow Simulation
  7. The Truth about Event Records
  8. Reconstruction

Introduction


STAR's simulation infrastructure is built around the GEANT 3 simulation package.  It is implemented in FORtran, augmented by the MORtran-based AgSTAR preprocessor language, with a ROOT and C++ interface which allows it to be integrated with our standard "big full chain" software.  The legacy documentation pages describe how to run starsim as a standalone application.  The purpose of this document is to describe how to run simulations using the modern C++ interface, and to serve as a starting point for extending the simulation infrastructure with additional geometry modules, event generators, and slow simulators.

This document assumes familiarity with:
  1. C++
  2. ROOT
  3. The STAR Framework

Quick Start

  1. Your First Simulation Job
  2. Reconstructing the Events
  3. idTruth and qaTruth
  4. Taking it Further
Running your analysis code over Monte Carlo data samples is generally a three-step process.  First, you'll generate your Monte Carlo events by running the simulation package.  Then you'll pass these events through the reconstruction software, which converts MC hits to "real" ones, performs track reconstruction, matches tracks to the TOF/MTD/etc., and creates calorimeter hits.  Finally, you will want to run your analysis code on the output, possibly even looking at the Monte Carlo truth tables in order to better understand the efficiency, purity and resolution of your reconstruction codes.

A. Your First Simulation Job


We're going to start with a simple example:  starsim.kinematics.C.  This is an example macro which runs under root4star, and generates a sample of muons and neutral pions.  Start by checking the code out from CVS. 
$ cvs co StRoot/StarGenerator/macros/starsim.kinematics.C                 # check out the macro
$ ln -s StRoot/StarGenerator/macros/starsim.kinematics.C starsim.C        # create a link named starsim.C
$ root4star starsim.C                                                     # run the code using STAR's version of ROOT
You're going to see a lot of output here, but in the end you'll have two output files:  starsim.kinematics.root and starsim.kinematics.fzd
$ ls
starsim.kinematics.fzd
starsim.kinematics.root
starsim.C
StRoot/

These two files are the so called "zebra" file (.fzd), containing the Monte Carlo hits, geometry and other associated event information, and the event generator record (.root), containing a ROOT TTree which saves all of the particles generated by the primary event generator. 

B. Reconstructing the Events


Once we have the output files, it's time to run them through the reconstruction chain.  STAR's reconstruction code is steered using the "big full chain" macro bfc.C.  For most jobs you'll want to provide BFC with three arguments:  the number of events to process, the set of chain options to run, and an input file.  For more complicated tasks you're encouraged to ask questions on the STAR software list.

Below you'll find an example macro which runs the big full chain.  We're going to run a limited set of STAR software to begin with.  Start by looking at line 12.  This is where the big full chain is called.  As I noted above, it takes three arguments.  The first is the number of events to process... coincidentally, 10 is the number of events which starsim.kinematics.C produces by default.  Then it takes two strings as input.  The first is the set of chain options we want to run.  It's a long list, so I've broken things down across several lines.

Line 5 specifies the geometry tag to use.  Generally the format is the letter "r" followed by a valid STAR Geometry in simulation & reconstruction.
Line 6 specifies the geometry model to use.  Generally you need to specify both "agml" and "UseXGeom" here.  (More on this later).  BTW, did you notice that capitalization is ignored? 
Line 7 tells the big full chain that the input will be a zebra file.  This is our standard for now.
Line 8 sets up the TPC simulator.  We perform our digitization of the geant hits in the reconstruction chain.  This is where the TPC hits are converted to ADC values used as input to the reconstruction codes.
Line 9 sets up the track finding and reconstruction codes.  We're using "sti" and "ittf" here.
Line 10 creates the MuDst file; most STAR analyses use the micro DST.
Line 11 saves the "geant.root" file, which carries the event record associating MC hits with generated tracks.  This will allow us to (much later) associate the reconstructed tracks with the true Monte Carlo particles from which they came.
$ emacs runBfc.C                    # Feel free to use your favorite editor instead of emacs
0001 | void runBfc() {
0002 |   gROOT->LoadMacro("bfc.C");                  // Load in BFC macro
0003 |   TString _file = "starsim.kinematics.fzd";   // This is our input file
0004 |   TString _chain;                             // We'll build this up
0005 |   _chain += "ry2012a ";                       // Start by specifying the geometry tag (note the trailing space...)
0006 |   _chain += "AgML USExgeom ";                 // Tells BFC which geometry package to use.  When in doubt, use agml.
0007 |   _chain += "fzin ";                          // Tells BFC that we'll be reading in a zebra file.
0008 |   _chain += "TpcFastSim ";                    // Runs TPC fast simulation
0009 |   _chain += "sti ittf ";                      // Runs track finding and reconstruction using the "sti" tracker
0010 |   _chain += "cmudst ";                        // Creates the MuDst file for output
0011 |   _chain += "geantout ";                      // Saves the "geant.root" file
0012 |   bfc(10, _chain, _file );                    // Runs the simulation chain
0013 | }
ctrl-x ctrl-s ctrl-x ctrl-c          # i.e. save and quit
$ root4star runBfc.C                 # run the reconstruction job
$ ls -l
...

If all has gone well, you now have several files in your directory including the MuDst which you'll use in your analysis.

$ ls -1 *.root
starsim.kinematics.geant.root
starsim.kinematics.hist.root
starsim.kinematics.MuDst.root
starsim.kinematics.runco.root
starsim.kinematics.root


C. idTruth and qaTruth

During the first phase of the simulation job we had full access to the state of the simulated particles at every step as they propagated through the STAR detector.  As particles propagate through active layers, the simulation package can register "hits" in those sensitive layers.  These hits tell us how much energy was deposited, in which layer and at what location.  They also save the association between the particle which deposited the energy and the resulting hit.  This association is saved as the "idTruth" of the hit.  It corresponds to the unique id (primary key) assigned to the particle by the simulation package.  This idTruth value is exceedingly useful, as it allows us to compare important information between reconstructed objects and the particles which are responsible for them.

Global and primary tracks contain two truth variables:  idTruth and qaTruth.  idTruth tells us which Monte Carlo track was the dominant contributor (i.e. provided the most TPC hits) to the track, while qaTruth tells us the percentage of hits which that particle provided.  With idTruth you can look up the corresponding Monte Carlo track in the StMuMcTrack branch of the MuDst.  In the event that idTruth is zero, no MC particle was responsible for hits on the track.
With the MC track, you can compare the thrown and reconstructed kinematics of the track (pT, eta, phi, etc...).

The primary vertex also contains an idTruth, which can be used to access the corresponding Monte Carlo vertex in the StMuMcVertex branch of the MuDst.
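
A minimal sketch of the lookup in a MuDst analysis follows; the header paths, the mcArray accessor and its index, and StMuMcTrack::Id() are assumptions to be checked against the StMuDst classes in your STAR release:

#include "TClonesArray.h"
#include "StMuDSTMaker/COMMON/StMuDst.h"
#include "StMuDSTMaker/COMMON/StMuTrack.h"
#include "StMuDSTMaker/COMMON/StMuMcTrack.h"

// Return the MC track which dominated a reconstructed track, or 0 if the
// track has no MC association (idTruth == 0).
StMuMcTrack *mcMatch( StMuDst *muDst, StMuTrack *track )
{
  int id = track->idTruth();                   // primary key of the MC track
  if ( id <= 0 ) return 0;                     // zero means no MC contributor
  TClonesArray *mcTracks = muDst->mcArray(1);  // assumed: index 1 = MC tracks
  for ( int j = 0; j < mcTracks->GetEntriesFast(); j++ ) {
    StMuMcTrack *mc = (StMuMcTrack *)mcTracks->UncheckedAt(j);
    if ( mc->Id() == id ) return mc;           // compare pT, eta, phi upstream
  }
  return 0;
}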

D. Taking it further

In starsim.kinematics.C we use the StarKinematics event generator, which allows you to push particles onto the simulation stack on an event-by-event basis.  You can throw them flat in phase space, or sample them from a pT and eta distribution.  These methods are illustrated in the macro, which throws muons and pions in the simulation.  You can modify this to suit your needs, throwing whatever particles you want according to your own distributions.  The list of available particles can be obtained from StarParticleData.
 

$ root4star starsim.C\(0\)
root [0] StarParticleData &data = StarParticleData::instance();
root [1] data.GetParticles().Print()

Additionally, you can define your own particles.  See starsim.addparticle.C.


Primary Event Generation

Geometry Definition

Running the Simulation

Event Reconstruction

The Truth about Event Records

StMc package

The StMc package creates the StMcEvent structure and fills it with Monte Carlo information.  It then creates the StMiniMcEvent structure, which contains both Monte Carlo and reconstruction information, and provides the matching between the two.  This allows the user to estimate the quality of the reconstruction and the reconstruction efficiency for different physics processes.  Strictly speaking, StMcEvent is redundant and exists for historical reasons.
StMc consists of:

  • StMcEvent - structure with Monte Carlo (simulation) information;
  • StMiniMcEvent - structure with both simulation & reconstruction info;
  • StMcEventMaker - the maker which creates and fills StMcEvent with simulation info.

Attic

Archive of old Simulation pages.

Event Generators, event mixing

B0/B+ simulation and event mixing

Decays

  • weight 28% : B0-> D- + (e+) + (nu)
  • weight 72% : B0-> D*(2010) + e + nu
    • D*(2010) -> (D0) + (pi+) b.r.69%
    • D*(2010) -> (D+) + (pi0) b.r.31% neglect D*->gamma
  • weight 25% : B+ -> (D0bar) + (e+) + nu
  • weight 75% : B+ -> D*bar(2007) + (e+) + nu
    • D*(2007) -> D0+ (pi0) b.r.62%
    • D*(2007) -> D0+ (gamma) b.r.38%

Hijing

To use Hijing for simulation purposes, one must first run Hijing proper and generate event files, then feed these data to starsim to complete the GEANT part.

The Hijing event generator code and makefile can be found in the STAR code repository at the following location: $STAR/pams/gen/hijing_382. Once built, the executable is named hijjet.x. The input file is called hijev.inp and should be modified as per the user's needs. When the executable is run multiple times in the same directory, a series of files will be produced with names like evgen.XXX.nt, where XXX is an integer. The format of the file is a PAW Ntuple. The starsim application is equipped to read that format, as explained below. If a large number of events is needed, a request should be made to the STAR simulation leader or any member of S&C.

Listed below is the KUMAC macro that can be used to run your own GEANT simulation with pre-fabricated Hijing events. Unlike the Pythia simulation, events aren't generated on the fly but are read from an external file instead. Look at the comments embedded in the code. Additional notes:

  • don't forget to seed the random number generator if you'll be doing a series of runs
  • make sure you specify the correct geometry tag
  • specify a different output file for each run
  • the location of the input file (current directory) and the name (evgen.1.nt) are given as an example
  • you can browse the directory /star/simu/evgen to look at what input Hijing files are already available
  • the number of triggers at the bottom of the macro can be set to anything; just remember that the resulting files can be large and unwieldy if that number is too large. As a rule of thumb, we usually don't go over 500 events per file in production for min-bias AuAu, and 100 events per file for central gold
gfile o my_hijing_file.fz
detp geom y2006
make geometry
gclose all
* define a beam with 100um transverse sigma and 60cm sigma in Z
vsig  0.01  60.0
* introduce a cut on eta to avoid having to handle massive showers caused by spectators
gkine -1 0 0 100 -6.3 6.3 0 6.3 -30.0 30.0
gexec  $STAR_LIB/gstar.so
us/inp hijing evgen.1.nt
* seed the random generator
rndm 13 17
* trigger - change to trigger the desired number of times
trig 10
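
Assuming the macro above is saved as hijing.kumac, it can be executed with an invocation like starsim -w 1 -g 40 -b ./hijing.kumac, following the same pattern used by the filter script earlier on this page.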

Pythia

Introduction

There are two slightly different ways to run the Pythia event generator in the context of the starsim application. In the original (old) design, the dynamic library apythia.so served both as an adapter and as a container for the standard Pythia library that would typically come with a CERNLIB distribution. The problem with this approach, of course, is that Pythia itself is not a static piece of software and receives periodic updates. It is therefore difficult or impossible to modify the apythia.so component without affecting, in one way or another, various analyses, as the consistency of the code is broken at some point.

It is possible, however, to refactor the Pythia adaptor in such a way that the Pythia library proper can be loaded separately. This gives the user the ability to choose a specific version of the Pythia code to run in their simulation. Different users can therefore use different versions of Pythia concurrently in starsim, which is in everybody's interest. The wrapper thus modified was given the mnemonic name bpythia.so (it should be easy to memorize, since "b"-pythia follows "a"-pythia). We have also decided to freeze the Pythia version linked into apythia.so at 6.205, and to select subsequent versions via bpythia.so, as explained at the bottom of this page.

In the following, we present both the "old way" of running Pythia (i.e. tied to a specific version) and the new one (whereby the version can be requested dynamically at run time).

Using Pythia 6.205

Listed below is the KUMAC macro that can be used to run your own Pythia simulation, specifically utilizing version 6.205 of the Pythia code and without the ability to switch versions. This is fine for most STAR applications at the time of this writing (mid-2007). Please pay attention to the comments embedded in the code. Additional notes:
  • the script below explicitly refers to apythia.so, which contains Pythia 6.205
  • don't forget to seed the random number generator if you'll be doing a series of runs
  • make sure you specify the correct geometry tag
  • specify a different output file for each run
  • pay attention to the physics parameters used in the simulation; you will need to consult the Pythia manual for the meaning of those
  • the number of triggers at the bottom of the macro can be set to anything; just remember that the resulting files can be large and unwieldy if that number is too large. As a rule of thumb, we usually don't go over 5k events per file in production
gfile o my_pythia_file.fz
detp geom y2006
make geometry
gclose all
* define a beam with 100um transverse sigma and 60cm sigma in Z
vsig  0.01  60.0
* Cut on eta (+-6.3) to avoid having to handle massive showers caused by the spectators
* Cut on vertex Z (+-30 cm)
gkine -1 0 0 100 -6.3 6.3 0 6.29 -30.0 30.0
* load pythia
gexec $STAR_LIB/apythia.so
* specify parameters
ENER 200.0     ! Collision energy
MSEL 1         ! Collision type
MSUB (11)=1    ! Subprocess choice
MSUB (12)=1
MSUB (13)=1
MSUB (28)=1
MSUB (53)=1
MSUB (68)=1
*
* Make the following stable:
*
MDCY (102,1)=0  ! PI0 111
MDCY (106,1)=0  ! PI+ 211
*
MDCY (109,1)=0  ! ETA 221
*
MDCY (116,1)=0  ! K+ 321
*
MDCY (112,1)=0  ! K_SHORT 310
MDCY (105,1)=0  ! K_LONG 130
*
*
MDCY (164,1)=0  ! LAMBDA0 3122
*
MDCY (167,1)=0  ! SIGMA0 3212
MDCY (162,1)=0  ! SIGMA- 3112
MDCY (169,1)=0  ! SIGMA+ 3222
MDCY (172,1)=0  ! Xi- 3312
MDCY (174,1)=0  ! Xi0 3322
MDCY (176,1)=0  ! OMEGA- 3334
* seed the random generator
rndm 13 19
* trigger - change to trigger the desired number of times
trig 10

Specifying the Pythia version dynamically

In addition to the "frozen" version 6.205 which can be used as explained above, there is currently one more version that can be loaded, namely 6.410. Going forward, more versions will be added to the code base and to the collection of STAR libraries, as needed.

In order to use version 6.410, the user needs to simply replace the following line in the above script
gexec $STAR_LIB/apythia.so
With:
gexec $STAR_LIB/libpythia_6410.so
gexec $STAR_LIB/bpythia.so

The Magnetic Monopole in STAR

Introduction

It is possible to simulate the production and propagation of magnetic monopoles in the STAR experiment, using a few modifications in the code base of GEANT 3.21, and in particular in our GEANT-derived application, starsim. Our work is based on a few papers.

The flow of the GEANT code execution is illustrated by diagrams in those papers.


First Results

As a demonstration of principle, we present here a few starsim event display pictures. First, we propagate 12 magnetic monopoles of varying momenta in the STAR detector.


Now, let's take a look at a minimum bias gold-gold event that contains a pair of magnetic monopoles.


Salient features can already be seen in these graphics: large dE/dx losses, and a characteristic limit on the maximum radius of the recorded monopole track (due to the fact that the trajectory of the monopole is not helix-like, but rather parabola-like). Now, let's take a look at the phi distribution of the hits for central and peripheral gold-gold events containing monopoles.


Again, the rather intuitive feature (large peaks in phi due to a very large dE/dx produced by the monopoles) is obviously borne out in the simulation.


This is work in progress and this page is subject to updates.

Grid-friendly Starsim production scripts

Since the production activity of STAR is migrating to, and eventually will run mostly in, the Grid environment, the production scripts we use when running on a local or another "traditional" Unix farm facility need to be modified, which often means simplified. Here is an example of the script we have successfully used to run a Pythia simulation on the Grid (utilizing the Fermilab facility), as well as on SunGrid with cosmetic modifications.

A few things worth noting:

  • The bulk of the script has to do with establishing the Pythia settings which are often required in the simulations requested by the Spin PWG; the starsim part proper is located at the top and is uncomplicated: it involves dynamic loading of the necessary libraries, setting up the beam interaction diamond parameters, etc.
  • The script needs the contents of the tarball (listed at the bottom of the page) located in its working directory; this "payload" contains the Starsim executable as well as a few shared libraries and accessory scripts necessary for its function. To be able to run on the Grid, therefore, one needs to
    • Transfer the tarball and make provisions for extraction of the files
    • Transfer the script below and configure it for submission with a unique serial number (any integer, really)
  • The script takes only one argument, which is the serial number of the run. The rest of the run parameters are encoded in its body, which minimizes the chances of human error when submitting a large number of scripts, potentially for many different datasets
  • The random number generator is seeded with the serial run number and with the Unix process ID of the script on the target machine, which for all intents and purposes guarantees the uniqueness of a sequence in each run
#!/usr/bin/ksh
echo commencing the simulation
export STAR=.
echo STAR=$STAR
#
run=$1
geom=Y2006C
ntrig=2000
diamond=60
z=120
# >> run.$run.log 2>&1
node=`uname -n`
echo run:$run geom:$geom ntrig:$ntrig diamond:$diamond z:$z node:$node pid:$$
./starsim -w 0 -g 40 -c trace on .<<EOF
trace on
RUNG $run 1 $$
RNDM $$ $run
gfile o gstar.$run.fz
detp geom $geom
vsig 0.01 $diamond
gexec $STAR/geometry.so
gexec $STAR/libpythia_6410.so
gexec $STAR/bpythia.so
gclose all
gkine -1 0 0 100 -6.3 6.3 0 6.28318 -$z $z
ENER 200.0
MSEL 1
CKIN 3=4.0
CKIN 4=5.0
MSTP (51)=7
MSTP (81)=1
MSTP (82)=4
PARP (82)=2.0
PARP (83)=0.5
PARP (84)=0.4
PARP (85)=0.9
PARP (86)=0.95
PARP (89)=1800
PARP (90)=0.25
PARP (91)=1.0
PARP (67)=4.0
MDCY (102,1)=0 ! PI0 111
MDCY (106,1)=0 ! PI+ 211
MDCY (109,1)=0 ! ETA 221
MDCY (116,1)=0 ! K+ 321
MDCY (112,1)=0 ! K_SHORT 310
MDCY (105,1)=0 ! K_LONG 130
MDCY (164,1)=0 ! LAMBDA0 3122
MDCY (167,1)=0 ! SIGMA0 3212
MDCY (162,1)=0 ! SIGMA- 3112
MDCY (169,1)=0 ! SIGMA+ 3222
MDCY (172,1)=0 ! Xi- 3312
MDCY (174,1)=0 ! Xi0 3322
MDCY (176,1)=0 ! OMEGA- 3334
trig $ntrig
exit
EOF

The contents of the "payload" tarfile:

143575 2007-05-31 18:02:47 agetof
65743 2007-05-31 18:02:39 agetof.def
44591 2007-05-31 19:05:34 bpythia.so
5595692 2007-05-31 18:03:10 geometry.so
183148 2007-05-31 18:03:15 gstar.so
4170153 2007-05-31 19:05:27 libpythia_6410.so
0 2007-05-31 18:00:06 StarDb/
0 2007-05-31 18:00:59 StarDb/StMagF/
51229 2007-05-31 18:00:57 StarDb/StMagF/bfield_full_negative_2D.dat
2775652 2007-05-31 18:00:57 StarDb/StMagF/bfield_full_negative_3D.dat
51227 2007-05-31 18:00:57 StarDb/StMagF/bfield_full_positive_2D.dat
2775650 2007-05-31 18:00:58 StarDb/StMagF/bfield_full_positive_3D.dat
51227 2007-05-31 18:00:58 StarDb/StMagF/bfield_half_positive_2D.dat
2775650 2007-05-31 18:00:58 StarDb/StMagF/bfield_half_positive_3D.dat
1530566 2007-05-31 18:00:59 StarDb/StMagF/boundary_13_efield.dat
51231 2007-05-31 18:00:59 StarDb/StMagF/const_full_positive_2D.dat
1585050 2007-05-31 18:00:59 StarDb/StMagF/endcap_efield.dat
1530393 2007-05-31 18:00:59 StarDb/StMagF/membrane_efield.dat
15663993 2007-05-31 18:03:31 starsim
36600 2007-05-31 18:03:37 starsim.bank
1848 2007-05-31 18:03:42 starsim.logon.kumac
21551 2007-05-31 18:03:48 starsim.makefile

Production overview

As of spring of 2007, the Monte Carlo production is being run on three different platforms:

  • the rcas farm
  • Open Science Grid
  • SunGrid

 

Miscellaneous scripts


VMC

VMC C++ Classes

StarVMC/StarVMCApplication:

  • StMCHitDescriptor
  • StarMCHits
    • Step
  • StarMCSimplePrimaryGenerator

 

Example of setting the input file via StBFChain::ProcessLine: ((StVMCMaker *) 0xaeab6f0)->SetInputFile("/star/simu/simu/gstardata/evgenRoot/evgen.3.root");
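
In a BFC macro the same call can be made without a hard-coded pointer address, e.g. (a sketch; retrieving the maker by its assumed name):

  StVMCMaker *vmc = (StVMCMaker *) chain->GetMaker("StVMCMaker");  // maker name is an assumption
  if ( vmc ) vmc->SetInputFile("/star/simu/simu/gstardata/evgenRoot/evgen.3.root");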

In general, StBFChain sets various attributes of the makers.

 

New chain options must be added in BigFullChain.h