StMuRpsUtil - Roman Pot data analysis utilities (afterburner)

StMuRpsUtil (under development, for testing purposes only!!!)

Should you have any questions/comments/remarks regarding this module, please contact
rafal.sikora@fis.agh.edu.pl.

1. What is StMuRpsUtil
2. Structure
3. Utilities
4. How to use
5. Useful links


What is StMuRpsUtil
StMuRpsUtil is a user-friendly utility class that provides a set of post-processing corrections (afterburner) for Roman Pot data stored in the StMuRpsCollection class. It also has built-in functionalities which extend the standard Roman Pot data collection.


Structure
StMuRpsUtil is a ROOT-based class intended to work in the STAR computing environment as well as in local environments, e.g. standalone machines. The typical STAR "Maker" format (inheritance from the StMaker class) was abandoned in order to make it possible to run the same code on MuDST files and on other storage formats, e.g. private picoDST files. The only requirement is that the Roman Pot data must be stored in the StMuRpsCollection class.

Usage of StMuRpsUtil involves creating a single instance of the class at the beginning of the analysis, and invoking the StMuRpsUtil::process() and StMuRpsUtil::clear() methods at the beginning and end of each event's analysis, respectively. StMuRpsUtil::process() returns a pointer to a StMuRpsCollection2 object, a mirror class of the standard StMuRpsCollection, which contains RP data post-processed using the final calibrations. All elements of StMuRpsCollection2 can be accessed in the very same way as those of the StMuRpsCollection class.
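A minimal sketch of this lifecycle (the mAfterburner member and the muDstMaker pointer are illustrative; a complete step-by-step example is given in the "How to use" section below):

    mAfterburner = new StMuRpsUtil(muDstMaker); // once, at the start of the analysis
    // then, for every event:
    StMuRpsCollection2* muRpsColl = mAfterburner->process(); // post-processed RP data
    /* ... analysis of the event using muRpsColl ... */
    mAfterburner->clear(); // mandatory at the end of each event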


Utilities
StMuRpsUtil provides the following corrections to data:
  • run-based alignment calibration
  • (to be implemented) run-based time slewing corrections
  • (to be implemented) hot strips removal

The following functionalities are available to the user:
  • the user can set the position of the vertex used in the reconstruction of proton kinematics:
    StMuRpsUtil::updateVertex(double x, double y, double z)
    This method should be invoked before StMuRpsUtil::process(). The arguments are given in meters. A short usage sketch is given after this list.
  • (to be implemented) the user can select the level of the selection criteria (loose, medium, tight): only proton tracks passing cuts at the selected level are present in the track collection
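A minimal sketch of the vertex update, assuming the vertex coordinates are already available in variables vx, vy, vz (expressed in meters, as the method requires):
    mAfterburner->updateVertex(vx, vy, vz); // must be called before process()
    StMuRpsCollection2* muRpsColl = mAfterburner->process();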


How to use
 MuDST analysis (working example: /star/u/rafal_s/StMuRpsUtil_tutorial/)
  1. Set up the environment to SL16c or newer.
    starver SL16c
    Make sure you have the latest definitions of Roman Pot data classes in your StRoot.

  2. Download the StMuRpsUtil package from the repository.
    cvs co offline/users/rafal_s/StMuRpsUtil
  3. Put the downloaded StMuRpsUtil directory under the StRoot path in your analysis directory.
    mv offline/users/rafal_s/StMuRpsUtil myAnalysisDir/StRoot/.
  4. Edit the setup.h file (myAnalysisDir/StRoot/StMuRpsUtil/setup.h) so that only the following line is uncommented:
    #define RUN_ON_MUDST // set if afterburner is used on MuDST
  5. Edit the header file of your analysis maker class.
    Add a declaration of the StMuRpsUtil class before the definition of your analysis maker class, and add a pointer to a StMuRpsUtil object as a member of your analysis maker class.
    /*...*/
    class StMuRpsUtil;
    /*...*/

    class MyAnalysisMakerClass: public StMaker{
    /*...*/
    StMuRpsUtil* mAfterburner;
    /*...*/
    };
  6. Edit the implementation file of your analysis maker class.
    Include StMuRpsUtil and StMuRpsCollection2 headers at the beginning.
    /*...*/
    #include "StMuRpsUtil/StMuRpsUtil.h"
    #include "StMuRpsUtil/StMuRpsCollection2.h"
    /*...*/
    In the analysis maker constructor, create the StMuRpsUtil object, passing a pointer to the StMuDstMaker as an argument:
    MyAnalysisMakerClass::MyAnalysisMakerClass(StMuDstMaker* maker): StMaker("MyAnalysisMakerClass"){
      /*...*/
      mAfterburner = new StMuRpsUtil(maker);
    }
    At the beginning of the MyAnalysisMaker::Make() method in your analysis maker class, invoke StMuRpsUtil::process(), which will provide you with the post-processed RP data collection. Don't forget to call StMuRpsUtil::clear() at the end of MyAnalysisMaker::Make():
    Int_t MyAnalysisMaker::Make(){
       /*...*/
       StMuRpsCollection2* muRpsColl = mAfterburner->process(); // <-- muRpsColl can be used to get corrected proton tracks etc.
       /* here analysis of an event */
       mAfterburner->clear(); // <-- critical!!!
       return kStOK;
    }
  7. Download the RP data calibration files from http://www.star.bnl.gov/~rafal_s/protected/rpsCalibrations2015.tar.gz, unpack the archive, and put the extracted directories under myAnalysisDir (you should then have myAnalysisDir/Alignment etc.).
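    One possible way to fetch and unpack the calibrations from the command line (assuming the protected URL is reachable this way; otherwise download the archive in a browser):
    cd myAnalysisDir
    wget http://www.star.bnl.gov/~rafal_s/protected/rpsCalibrations2015.tar.gz
    tar -xzf rpsCalibrations2015.tar.gz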
     
  8. Add the following line:
    gSystem->Load("StMuRpsUtil.so");
    to the macro which starts the analysis chain. This ensures that the afterburner library is accessible.
     
  9. Edit your job submission XML file so that the directories with calibration files extracted from rpsCalibrations2015.tar.gz are included in the sandbox:
    <SandBox installer="ZIP">
            <Package>
                    <!-- Any other files.... -->
                    <File>file:./Alignment</File>
                    <File>file:./HotStrips</File>
                    <!--        etc.         -->
            </Package>
    </SandBox>
After the above steps your code should compile and make use of the StMuRpsUtil afterburner in your MuDST analysis.

If you find any problems using StMuRpsUtil (the code does not compile or crashes at execution), please report them to the developers. We kindly ask you not to edit the StMuRpsUtil code on your own.

 picoDST analysis (Krakow format) (description to be added)



Useful links
StMuRpsUtil repository in STAR CVS: http://www.star.bnl.gov/cgi-bin/protected/cvsweb.cgi/offline/users/rafal_s/StMuRpsUtil/
Working example of analysis using StMuRpsUtil: /star/u/rafal_s/StMuRpsUtil_tutorial/
Roman Pot data calibration files (run 2015): http://www.star.bnl.gov/~rafal_s/protected/rpsCalibrations2015.tar.gz
StMuRpsCollection documentation (write-up): https://drupal.star.bnl.gov/STAR/system/files/RomanPotsInStEvent_0.pdf
StMuRpsCollection documentation (doxygen): http://www.star.bnl.gov/webdata/dox/html/classStMuRpsCollection.html
Roman Pot alignment description: to be added

Analysis code for UPC picoDST (Krakow format)

Should you have any questions/comments/remarks, please contact
rafal.sikora@fis.agh.edu.pl or leszek.adamczyk@agh.edu.pl.

1. Introduction
2. Code structure
3. How to run
4. Configuration file (options)
5. Useful links


Introduction
On this page you can find a set of instructions that will enable you to develop, run, and share the ROOT-based C++ code for the picoDST analysis created and maintained by the Krakow group of the UPC PWG. The code is shared between all data analyzers via the CVS repository http://www.star.bnl.gov/cgi-bin/protected/cvsweb.cgi/offline/UPC/.

Code structure
 Shared files - can be edited by all users:
          rpAnalysis.cpp - analysis class (definitions)
          rpAnalysis.hh - analysis class (header)
          config.txt - configuration file
 Core files - do not edit these files and directories:
          runRpAnalysis - launching script (recommended)
          rpAnalysisLauncher.C - launching script
          clearSchedulerFiles.sh - utility script which removes files created by the STAR Scheduler
          picoDstDescriptionFiles - folder with the core code describing the picoDST content etc.

The skeleton of the analysis class rpAnalysis (which inherits from TSelector) was created with the ROOT built-in method MakeSelector() (some more information about MakeSelector() can be found here).
When the analysis starts, the methods rpAnalysis::Begin() and rpAnalysis::SlaveBegin() are invoked (the right place to create histograms etc.). Next, rpAnalysis::Process() is called for each event in the picoDST tree - this is where you can put your selection algorithms, histogram filling, and so on. After all events in the picoDST are processed, the methods rpAnalysis::SlaveTerminate() and rpAnalysis::Terminate() are invoked, where e.g. the output file can be written.
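A minimal sketch of this call sequence (the histogram pointer hPt is illustrative and not an actual member of rpAnalysis; fChain is the tree pointer generated by MakeSelector()):

    void rpAnalysis::SlaveBegin(TTree* /*tree*/){
       hPt = new TH1D("hPt", "track p_{T};p_{T} [GeV]", 100, 0., 5.); // book histograms here
    }
    Bool_t rpAnalysis::Process(Long64_t entry){
       fChain->GetEntry(entry); // loads the particle_event data of this event
       /* selection cuts and histogram filling go here */
       return kTRUE;
    }
    void rpAnalysis::Terminate(){
       /* write histograms to the output file here */
    }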

The data of a single event, accessible in rpAnalysis::Process(), are stored in the particle_event object. Click here to see all elements of this class.

The analysis should be launched with the runRpAnalysis script (an executable). The script can be run with one argument: the name of a configuration file containing the definition of the trigger you would like to analyze and some analysis options (these can be used to control which parts of the code are executed). If the script is run without arguments, the configuration file config.txt is used to launch the analysis.
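For example (myConfig.txt is an illustrative file name):
    runRpAnalysis              # uses config.txt
    runRpAnalysis myConfig.txt # uses the options defined in myConfig.txt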


How to run
 Preparation of the analysis environment (first run)
  1. Set up the environment to stardev.
    stardev
  2. Create and enter a directory where you want to work with the UPC analysis code. Let's denote it MY_PATH.
    mkdir MY_PATH
    cd MY_PATH
  3. Download the analysis code from the repository. Enter the code directory.
    cvs co offline/UPC
    cd offline/UPC
  4. Edit the configuration file config.txt. It is especially important to provide valid (absolute) paths in the CODE_DIRECTORY and OUTPUT_DIRECTORY options. You are encouraged to set the path for the analysis output outside offline/UPC.
    CODE_DIRECTORY=/absolute/path/to/MY_PATH/offline/UPC
    OUTPUT_DIRECTORY=/absolute/path/to/MY_PATH/output
    
    OUTPUT_DIRECTORY does not have to exist; in that case it will be created automatically by the analysis launching script.

  5. Now you are ready to run the analysis. For the first execution do not edit the SINGLE_RUN option in the configuration (leave it set to "yes"). To start the analysis simply type
    runRpAnalysis
    If there are any problems with the configuration file, e.g. a wrong data directory, you will receive appropriate message(s) in the terminal.
    If no problems are found by the launching script, you should see ROOT start and display messages about the compilation progress (please do not worry about the warnings related to the picoDST description files). When the compilation is done, the analysis code is executed. You can verify successful execution by checking the content of OUTPUT_DIRECTORY - you should find there a ROOT file with the analysis output.
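    With the default SINGLE_RUN=yes, the output file can then be inspected with ROOT, e.g.:
    root -l MY_PATH/output/analysisOutput.RUN_NUMBER.TRIGGER.root
    where RUN_NUMBER and TRIGGER stand for the values set in the configuration file (see the option descriptions below).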
 Regular code development/running
  1. Set up the environment to stardev.
    stardev
  2. Enter the directory with the UPC analysis code (MY_PATH is where you have the offline/UPC directory).
    cd MY_PATH/offline/UPC
  3. Update the content of the shared repository - this ensures you are working with the latest version of all files in offline/UPC.
    cvs update
    
  4. Now you are free to work on the analysis. You can change the analysis code (rpAnalysis.cpp, rpAnalysis.hh), edit the configuration file to run the analysis over various triggers with different options etc., and launch the analysis using the runRpAnalysis script.
  5. NOTE: Use comments (// or /* */) to describe the parts of the code you add, so that everybody can understand them.

  6. When you finish working with the code, you should commit the changes you have made, so that all users are always working with the same version of the software. It is important to always commit when a change in the code has been made. Simply type
    cvs commit rpAnalysis.cpp rpAnalysis.hh
    NOTE: Before committing, always make sure that the code compiles and executes without errors! If the code doesn't work but you would like to save your work, you can simply comment out the lines you have added, commit, and work out the problem later.
    NOTE 2: Do not commit files other than rpAnalysis.cpp or rpAnalysis.hh. It is especially important to avoid committing the configuration file, which is analyzer-specific.
    NOTE 3: CVS is "smart", so if somebody commits before you do, it can merge (typically with success) the changes between the latest committed version and your version of the file. If after doing 'cvs commit' you receive a message similar to
    cvs commit: Up-to-date check failed for `rpAnalysis.cpp'
    cvs [commit aborted]: correct above errors first!
    it means that the described conflict has occurred. In such a case simply do
    cvs update
    If you don't get any warnings, you can re-commit (the cvs commit command in bullet #6). However, if you find a warning like
    rcsmerge: warning: conflicts during merge
    cvs update: conflicts found in rpAnalysis.cpp
    you need to edit the file you want to commit manually. Click here to learn about the details.
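    In the conflicting file, CVS marks the overlapping region with standard conflict markers, e.g. (the revision number 1.23 is illustrative):
    <<<<<<< rpAnalysis.cpp
    /* your local version of the conflicting lines */
    =======
    /* the version committed by the other user */
    >>>>>>> 1.23
    Keep the intended version, remove the marker lines, then commit again.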

If you find any problems (the code does not compile or crashes at execution) and you suspect it is an issue with the core code, please report it to the developers.



Configuration file (options)
Below is a list of the options available in the configuration file. The obligatory options are TRIGGER, SINGLE_RUN, RUN_NUMBER (only if SINGLE_RUN=yes), DATA_DIRECTORY, CODE_DIRECTORY and OUTPUT_DIRECTORY.
If you think more options/utilities are needed in the configuration file, contact the developers. A complete example configuration file is shown after this list.
  • TRIGGER
    This option is the name of the trigger that you want to analyze. It should have the same form as at http://online.star.bnl.gov/rp/pp200/.
  • SINGLE_RUN
    • If set to "yes" forces analysis of a single run (run number is defined by RUN_NUMBER option). In this case analysis is launched without STAR scheduler, using node you are currently logged on. Name of the output ROOT file in the OUTPUT_DIRECTORY has the following form: analysisOutput.RUN_NUMBER.TRIGGER.root.
    • If set to "no", full available dataset for a TRIGGER is analyzed using STAR Scheduler with job-splitting to multiple RACF nodes.
    NOTE: It is recommended to check the validity of the code (compilation and execution with no errors) using SINGLE_RUN=yes before you run the analysis over the full dataset with SINGLE_RUN set to "no".
    The submission XML file is automatically created by the launching script. The number of files per job is set to 20 (it can be made configurable if needed), so typically a few dozen jobs are submitted. This results in plenty of scheduler files showing up in CODE_DIRECTORY, as well as log/error files in OUTPUT_DIRECTORY. If you want to clean CODE_DIRECTORY of the scheduler files, use the clearSchedulerFiles.sh script. You can check the progress of job execution with the command
    condor_q -submitter $USER
    or, if you do not have any other jobs submitted, use
    condor_q -submitter $USER | tail -n1
    If the output is:
    0 jobs; 0 completed, 0 removed, 0 idle, 0 running, 0 held, 0 suspended
    it means that all jobs are finished. If all jobs were successful, in your OUTPUT_DIRECTORY you should see a number of ROOT files called analysisOutput.SOME_LONG_NAME_WITH_VARIOUS_CHARACTERS.TRIGGER.root. These are the output files of the individual jobs (SOME_LONG_NAME_WITH_VARIOUS_CHARACTERS is the ID of the submission and the ID of the job, separated by an underscore "_"). To merge them into a single file, type
    hadd allRunsMerged.root analysisOutput.*
    This will create a single file called allRunsMerged.root. Remember to merge files only from one submission! If you suspect something went wrong during job execution, you can check the log and error files of each individual job; they are placed in OUTPUT_DIRECTORY and have the extensions .log and .err, respectively.
  • RUN_NUMBER
    This is the ID number of the analyzed run (this option may be omitted if SINGLE_RUN=no).
  • DATA_DIRECTORY
    Should contain the full path to the directory where the lists of available picoDST files are stored (the same place as the picoDSTs themselves). Currently it is /gpfs01/star/pwg/UPCdst.
  • CODE_DIRECTORY
    Should contain the full path to the directory where your private copy of the offline/UPC/ directory is placed.
  • OUTPUT_DIRECTORY
    Should contain the full path to the directory where you want the analysis output (ROOT files, log files) to be saved. If OUTPUT_DIRECTORY does not exist, it is created.
  • ANALYSIS_OPTIONS
    This option is intended to contain a set of options separated by the "|" character, which are sent to the analysis program and can be used, for example, to control which parts of the code are executed.
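For reference, a complete example of a configuration file using the options described above (the trigger name, run number, paths and analysis options are purely illustrative):
    TRIGGER=RP_SDT
    SINGLE_RUN=yes
    RUN_NUMBER=16085056
    DATA_DIRECTORY=/gpfs01/star/pwg/UPCdst
    CODE_DIRECTORY=/absolute/path/to/MY_PATH/offline/UPC
    OUTPUT_DIRECTORY=/absolute/path/to/MY_PATH/output
    ANALYSIS_OPTIONS=OPTION_A|OPTION_B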


Useful links
UPC analysis code repository in STAR CVS: http://www.star.bnl.gov/cgi-bin/protected/cvsweb.cgi/offline/UPC/
CVS tutorial @ drupal: https://drupal.star.bnl.gov/STAR/comp/sofi/tutorials/cvs
Presentation on the Krakow picoDST: https://drupal.star.bnl.gov/STAR/system/files/talk_42.pdf
StMuRpsCollection documentation (write-up): https://drupal.star.bnl.gov/STAR/system/files/RomanPotsInStEvent_0.pdf
StMuRpsCollection documentation (doxygen): http://www.star.bnl.gov/webdata/dox/html/classStMuRpsCollection.html
Roman Pot alignment description: to be added

Other databases created for the 2015 reconstructions


1.  Calibrations/pp2pp/pp2ppPMTSkewConstants


There are 64 skew constants in total: 4 constants for each PMT, with 2 PMTs for each of the 8 RPs (4 x 2 x 8 = 64).


Rafal's prescription:


Constants in set1.* contain parameters for runs 16085056, 16085057 and >=16090042.
Constants in set0.* contain parameters for all other runs.


Implementations:

1st set:
  set0.*  entered at  "2015-01-01 00:00:01 GMT"

2nd set:
  "RTS Stop Time" for 16085055 was "2015-03-26 23:05:04 GMT"
  "RTS Start Time" for 16085056 was "2015-03-26 23:06:39 GMT"
  set1.*  entered at  "2015-03-26 23:05:05 GMT"

3rd set:
  "RTS Stop Time" for 16085057 was "2015-03-27 00:07:59 GMT"
  "RTS Start Time" for 16085058 was "2015-03-27 00:14:32 GMT"
  set0.*  entered at  "2015-03-27 00:08:00 GMT"

4th set:
  "RTS Stop Time" for 16090041 was "2015-03-31 22:38:57 GMT"
  "RTS Start Time" for 16090042 was "2015-03-31 22:39:59 GMT"
  set1.*  entered at  "2015-03-31 22:38:58 GMT"



2.  Geometry/pp2pp/pp2ppAcceleratorParameters



The order of entries in each set is:

x_IP  y_IP  z_IP  theta_x_tilt  theta_y_tilt  distancefromDX_east  distancefromDX_west LDX_east  LDX_west  bendingAngle_east  bendingAngle_west conversion_TAC_time

Entries entered:

"2015-01-01 00:00:01 GMT" :  0 0 0  0 0  9.8 9.8  3.7 3.7  0.018832292 0.018826657  1.8e-11

"2015-04-28 00:00:01 GMT" :  0 0 0  -0.003640421 0  9.8 9.8  3.7 3.7  0.011238936 0.026444185  1.8e-11
(The pp run stopped on Apr. 27 and there were a few days before C-AD could switch from pp to pAu operationally. I picked the beginning of Apr. 28 for this pAu entry.)

"2015-06-08 15:00:00 GMT" :  0 0 0  -0.002945545 0  9.8 9.8  3.7 3.7  0.012693812 0.025021968  1.8e-11
(The last *pAu* run was 16159025 on June 8 and the first *pAl* run was 16159030, which was a bad run and started at "2015-06-08 15:44:13 GMT". The previous run, a pedestal run, 16159029, ended at "2015-06-08 14:24:54 GMT". So I arbitrarily selected the above time, roughly in the middle.)

pp2ppRPpositions (STAR offline database) corrections

Originally, this was mainly to correct for the malfunctioning LVDT readings of E1D between Mar. 18 and Apr. 6, 2015. I have come across 7 blocks/sub-periods where the E1D LVDT readings needed to be corrected. Since the steps are the same, 505004, in all cases (with one exception below), I have used an average of the LVDT readings closest to the period of malfunctioning (usually the good ones before; but if the last good readings were too long before, I have taken the averages of the good ones shortly after the period in question). These are in the files ToCorrect1.txt, ToCorrect2.txt, ..., ToCorrect7.txt. The average position used for each of these periods is listed at the end of the corresponding file. [ToCorrect1.txt has one entry which I have "bracketed", as it has to be corrected with positions of ~-32 cm instead of ~-25 cm; this is explained in item 1 below.]

In all cases, in the table "Calibrations/pp2ppRPpositions", I have simply inserted a set of 8 entries with a "beginTime" 1 second later than the original "beginTime" (the latter appears in all the .txt files listed here), as Dmitry Arkhipkin has instructed, even though there might be only one entry (E1D) that was changed.


However, I have come across the following cases and have corrected them as well:
  1. For run 16077055, the position was ~-32 cm, or 449748 steps, so I have used the average of the last good LVDT readings corresponding to the same number of steps (449748). The file is "ToCorrect1.exception_32cm.txt".
     
  2. On Apr. 7, there was a period when the LVDTs of E1D and E2D were swapped by mistake. I've corrected this as well; the file for this is "E2D.txt". Both E1D and E2D needed to be corrected, and at the end of the file the 1st position is for E1D and the 2nd one is for E2D.

  3. Accidentally, I've also come across a period (~6 pm on Apr. 3 to 10 am on Apr. 4) which had wrong entries because the online database was not updated (due to inaccessibility of CDEV/STAR portal server).   Dmitry Arkhipkin has scanned the entire online database and found 5 periods which has such gap of a period > 5 minutes, including the above-mentioned one.  I've checked that we only need to correct for 2, 4 runs in the above period (Apr. 4) and 6 runs on May 8 --- which was in the heavy-ion period where only the West side roman pots were actually inserted.  For the other 3, they are either not in the pp2pp (roman-pot) data-taking period or the positions (the steps) remained the same before and after the gap (so that the offline database just used the previous LVDT positions available in the online database).  The file for this is  "NotUpdated.Apr4_.txt" and "NotUpdated.May8_.txt" respectively for Apr. 4 and May 8.