Run 13 Embedding

Obsolete as of 18 April 2020

Rewritten by Nick Lukow and later by Amilkar Quintero (3 October 2018)

 
NOTE: ensure you are using the correct library version with the command >starver SL16g_embed
working directory: /star/u/aquinter/Jets2013/Embedding

0. Retrieve the zero-bias .daq files from HPSS. Use the script /star/u/aquinter/Embedding/Embedding2/HPSS/GetZBfiles.sh
 

1. First, generate the vertices by running the script runGen.sh, which
uses the GenerateVertex.C macro (to submit this as a batch job, you will need
the QAgen.job file)
 
-to run the script use the command >sh runGen.sh <run>
    -where <run> is the index of the run in the runlist (if there
    are 10 runs, the argument should be a number between 1 and 10)
 
-Before running, edit the script runGen.sh:
    -in the line "for runnumber in 'cat ..." edit the path to be
    the path to the runlist you are using
    -edit nevents=*** to the desired number of event vertices
    -edit the outdir to your desired output directory
    -edit the line that begins root4star -q -b /star/u/* to point to the
    directory that contains the macro GenerateVertex.C
-Before running, also edit the GenerateVertex.C macro:
    -edit nevents to be the desired amount
-to submit a job use the command >condor_submit QAgen.job
 
-Before submitting the job, edit the QAgen.job file:
    -edit Initialdir to be the directory from which you are
    running the script
    -edit the Output, Error, and Log values to a path of
    your choice (make a condor folder in your directory and use that path)
    -edit the Notify_user value to your email (though this is seldom used)
    -edit the Queue value to be the number of runs in the runlist
    referenced in the runGen.sh script
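For reference, a condor submit file with the fields mentioned above might look like the sketch below. Every value here (executable line, paths, email, queue count) is a placeholder and must be replaced with your own; the real QAgen.job may pass arguments to the script differently, so copy and edit the actual file rather than this sketch.

```shell
# Write a toy QAgen.job illustrating the fields discussed above.
# All values are placeholders; edit the real QAgen.job instead.
cat > QAgen.job <<'EOF'
Universe    = vanilla
Executable  = runGen.sh
Arguments   = $(Process)
Initialdir  = /star/u/<username>/Jets2013/Embedding
Output      = condor/gen_$(Process).out
Error       = condor/gen_$(Process).err
Log         = condor/gen_$(Process).log
Notify_user = <you>@bnl.gov
Queue 10
EOF
grep -c '=' QAgen.job    # prints 8 (the eight key = value lines)
```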
 
 
2. To generate the MC simulation from the vertices generated in step one, you
will need the script runKumac.sh which uses the simJetRequest.kumac macro (for
submitting the job, you will need the QAKumac.job file)
 
-to run the script use the command >sh runKumac.sh <run>
    -where <run> is the index of the run in the runlist
 
-Before running, edit the script runKumac.sh:
    -in the line "for runnumber in..." edit the file to be the runlist you are using
    -edit the outdir to be the directory where you wish the output to be placed
    -in the line "starsim -w 0 -b..." edit the directory to be the path to the simJetRequest.kumac file
 
-to submit a job use the command >condor_submit QAKumac.job
 
-Before submitting the job, edit the QAKumac.job file:
    -same as editing the job file for QAgen
    -one difference: for the Queue number, multiply the number of runs
    by 13 (there are 13 bins for each run)
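Since Queue = (number of runs) x 13, each condor process number corresponds to one (run, pT-bin) pair. The decoding below is a sketch assuming 0-based condor process ids laid out as 13 consecutive bins per run, with 1-based run/bin indices; the actual scripts may use a different convention.

```shell
# Decode a 0-based condor $(Process) id into run and pT-bin indices,
# assuming jobs are laid out as 13 consecutive bins per run (an assumption).
process=27
run=$(( process / 13 + 1 ))   # 1-based run index   -> 3
bin=$(( process % 13 + 1 ))   # 1-based bin index   -> 2
echo "run $run bin $bin"      # prints: run 3 bin 2
```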
 
2.1 After running step 2, move the condor files to another folder called
fzdlog with the command >mv condor fzdlog
These files will be used by the python script CalcXsec.py to calculate weights.
 
----------------------------------------
NOTE ON JOBS:
-to check on the status of your jobs use the command >condor_q -submitter <username>
    -where <username> is your username
-to remove jobs from the queue use the command >condor_rm -name <nodename> <username>
    -where <nodename> is the name of the node you are on
    (it can be found at the top of the window or at the beginning of the
    command line; typically of the form rcas####)
 
If you are submitting a large job, have the condor files written to the scratch
directory (the resulting files will take up a lot of space, so you don't want
them in your working directory)
---------------------------------------
 
3. To produce the MuDst files you will need the script
runEmb.sh which uses the macro bfcMixer_Jet.C
 
-to run the script use the command >sh runEmb.sh <run> 
 
-Before running the script:
    -edit the path in the line that begins "for runnumber in ..."
    to point to the run list you are using
    -edit the outdir to the desired output path
    -edit the paths on the lines that begin "ln -s" to point
    to the directories StRoot and .sl73_gcc485 (ensure these are in your directory)
    -if you don't have StRoot or .sl73_gcc485 in your directory,
    use the following commands:
        -first load the correct library: >starver dev (for the latest)
        -then check out the StRoot packages you'll need:
        >cvs co StRoot/StTriggerUtilities
            -For Period 2, comment out line 526 of
            StTriggerUtilities/Eemc/StEemcTriggerSimu.cxx
            (see http://www.star.bnl.gov/HyperNews-star/protected/get/starspin/6169.html)
        -then compile with the command >cons
    -edit the path on the line that begins "root4star -q -b"
    to point to the macro bfcMixer_Jet.C
    -edit nevents to reflect the desired number of events
 
-to submit a job use the command >condor_submit QA.job
 
-Before submitting the job, edit the QA.job file:
    -same as editing for Part 2 (including the Queue number scaled by 13)
    -make any necessary output folders (condor/)
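The "ln -s" setup from step 3 can be sketched as below. The /tmp/... paths are stand-ins for your real home directory and the job's working/scratch directory; runEmb.sh will use its own paths.

```shell
# Toy version of the symlink setup in runEmb.sh: link a compiled StRoot
# tree and .sl73_gcc485 build area into the job's working directory.
# All paths here are placeholders for your own directories.
home=/tmp/demo_home
work=/tmp/demo_work
mkdir -p "$home/StRoot" "$home/.sl73_gcc485" "$work"
ln -sf "$home/StRoot"        "$work/StRoot"
ln -sf "$home/.sl73_gcc485"  "$work/.sl73_gcc485"
ls -A "$work"   # shows the two symlinks
```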
 
4.  To get the jet candidates from reconstruction you will need the runJet.sh
script which uses the RunJetFinder2013emb.C macro
 
-to run the script use the command >sh runJet.sh <run>
-Before running the script:
    -edit the indir/outdir/list paths to the appropriate values
    -edit nevents to the appropriate value
    -use the command >cons (might not be necessary)
 
-to submit a job use the command >condor_submit QA.job
 
-Before submitting a job, edit the QA.job file:
    -same as editing for Part 2
    -make any necessary output folders (condor/)
 
5. To get the actual jets you will need the runInc.sh script, the
IncJetTreeMakerRun13emb_AQv1.cxx macro, and the compileAQ script
 
-You will need the following packages (from the working directory IncJets/):
    -StRoot/StSpinPool/StJetEvent
    -StRoot/StSpinPool/StJetSkimEvent
    -StRoot/StForwardDijetEvent
    -compile them using >cons
 
-Edit the compileAQ script by changing the pathnames to paths in your
directory
-Edit the runInc.sh script:
    -change nevents
    -change the pathnames to paths in your directory
-Edit the macro IncJetTreeMakerRun13emb_AQv1.cxx:
    -edit the pathnames in the include statements to paths in your
    directory
-compile the macro by running the compileAQ script
 
-Run the script by using the command >sh runInc.sh <run>
 
-edit the QA.job file (same as before)
-submit jobs same as before
 
6. To generate the histograms you will need the following (from the working directory Histograms/)
-runHisto.sh
-runComb.sh
-MakeHistoUEEmb.C
-CombinePtBinsV7.C
These will be used to generate histograms from the files created earlier
 
First you will make histograms for every run/ptbin file. This is done using
the runHisto.sh script, which uses the MakeHistoUEEmb.C macro
-to run, use the command >sh runHisto.sh <run>
-before running, edit the runHisto.sh script:
    -edit nevents
    -edit the indir and home pathnames to appropriate paths in
    your directory
    -edit the path to the list file
-before running you will also need to create a folder called ProductionHist
in the same directory as the script (the home path from above); this is where
the output will be stored.
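The folder setup above can be done in one line; the ProductionHist name comes from the text, and condor is the suggested log folder for the batch submission that follows.

```shell
# Create the output folder required by MakeHistoUEEmb.C and a condor log
# folder for batch submission (run from the script's home directory).
mkdir -p ProductionHist condor
ls -d ProductionHist condor
```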
 
-to submit it as a job use the command >condor_submit QAHisto.job
-Before submitting as a job, edit the QAHisto.job file:
    -edit the pathnames
    -edit the Queue count
    -make a condor folder for the files to be written to
 
To combine the histograms (merging histograms from same run but different pt
bins) you will need the runComb.sh script which uses the
CombinePtBinsV5.C macro
 
-to run use the command >sh runComb.sh
-Before running:
    -edit the runComb.sh script so that the path to the run list
    is correct
    -edit the CombinePtBinsV5.C macro:
        -the paths on lines 99 and 171
        -the bin weights on line 6 need to match the results of
        CalcXsec.py, which should be run after step 2 [not always necessary]
    -create a folder called ProductionComb for the output (matching the
    pathname on line 171)
After running this you should have a root file for each run inside the
ProductionComb folder. These can be combined into a single root file using the
command >hadd <filename>.root ProductionComb/*  (this merges all the root
files in the ProductionComb folder)
 
Once you have this combined root file, you can use the Plot_* macros to
analyze it.