FMS database overview

The overall STAR database infrastructure is described in detail in the pages linked from the database homepage, so I won't describe it further here. I recommend reading through that documentation to understand how the database operates. In particular, you should understand how timestamps work, and the difference between the "ofl" (offline) and "simu" (simulation) flavours.

FMS database table structures

We have ten different database tables for the FMS, divided into "geometry" and "calibrations" tables:
  1. fmsChannelGeometry: number of channels in each detector subsystem
  2. fmsDetectorPosition: x/y placement of each subsystem
  3. mapping/fmsPatchPanelMap: patch panel mapping of each detector channel
  4. mapping/fmsQTMap: QT-to-patch-panel mapping
  5. mapping/fmsMap: detector channel to QT (crate/slot/channel) mapping
  6. fmsGain: channel-by-channel nominal gains
  7. fmsGainCorr: channel-by-channel gain correction factors
  8. fmsLed: channel-by-channel LED values
  9. fmsLedRed: channel-by-channel LED values for reference period
  10. fmsPi0Mass: channel-by-channel pi0-mass-based correction factor
The structures storing the data in each table can be viewed from the database browser. For the calibrations tables, click
CALIBRATIONS -> FMS -> fms : reconV0 (on left-hand menu) -> <table name>
and for the geometry tables do the same, substituting "Geometry" for "Calibrations". Note that tables 3-5 are in a sub-menu that drops down when clicked. The page shows what data is stored in the table - the types, names and meanings of each field. It gives the IDL descriptor for the table, and the equivalent C++, along with the locations of the corresponding source files. It also shows an example macro for reading from the database, and for writing to it. These are useful as an introduction to how to interact with the database in general, as well as for the specifics of the table in question. If you have questions, Dmitry is always very responsive.
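For orientation, a minimal read macro built on StDbLib usually follows the pattern sketched below. This is only a sketch, not one of the browser's own examples: the database path, timestamp and the fmsGain field names (detectorId, ch, gain) are written from memory, so check them against the browser page for the table you actually want.

// readFmsTable.C - minimal sketch of reading an FMS table with StDbLib.
// Check table, structure and field names against the DB browser page.
void readFmsTable() {
  gSystem->Load("St_base");
  gSystem->Load("libStDb_Tables");
  gSystem->Load("StDbLib");

  StDbManager* mgr = StDbManager::Instance();
  StDbConfigNode* node = mgr->initConfig("Calibrations_fms");
  StDbTable* table = node->addDbTable("fmsGain");

  mgr->setRequestTime("2015-01-01 00:00:00");  // timestamp to read at
  mgr->fetchDbTable(table);

  // The fetched data come back as an array of C structures
  // (fmsGain_st here), one per "row" of the table.
  fmsGain_st* rows = (fmsGain_st*) table->GetTable();
  for (int i = 0; i < table->GetNRows(); ++i) {
    printf("det %d ch %d gain %f\n",
           rows[i].detectorId, rows[i].ch, rows[i].gain);
  }
}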

Note that some tables are indexed, meaning that there are multiple structures (essentially "rows" in the table) for a given timestamp. Examples are the fmsGain table, where there is a separate "row" for each channel at a given timestamp, and fmsDetectorPosition, where there is one "row" for each sub-detector. (In fact, most of our tables are indexed.) An example of a non-indexed table is fmsLed, where there is only a single "row" for a given timestamp - the values for each channel are stored in arrays within the (single) database structure.
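As a concrete illustration (continuing from the setup in the read sketch above, with the same caveats), the difference shows up directly in the number of rows a fetch returns:

// Indexed table: one structure ("row") per channel at the request time.
StDbTable* gainTable = node->addDbTable("fmsGain");
mgr->fetchDbTable(gainTable);
printf("fmsGain rows: %d\n", gainTable->GetNRows());  // roughly one per channel

// Non-indexed table: a single structure whose fields are per-channel arrays.
StDbTable* ledTable = node->addDbTable("fmsLed");
mgr->fetchDbTable(ledTable);
printf("fmsLed rows: %d\n", ledTable->GetNRows());    // 1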

Reading from and writing to the database

Fortunately or unfortunately, depending on your perspective, there are a number of ways of reading from and writing to the database. Note that the StFmsDbMaker code is designed for end-users to read values from the database in their makers, but isn't used for actual uploading of values to the database.
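To give an idea of the end-user side, reading gains inside your own maker looks roughly like the fragment below. The maker name "fmsDb" and the accessor names (maxChannel, getGain, getGainCorrection) are how I remember the StFmsDbMaker interface, and the analysis maker class itself is hypothetical, so check StFmsDbMaker.h for the exact methods.

// Fragment of a hypothetical analysis maker's Make(), with StFmsDbMaker
// added to the chain before it under its default name "fmsDb".
Int_t StMyFmsAnalysisMaker::Make() {
  StFmsDbMaker* fmsDb = static_cast<StFmsDbMaker*>(GetMaker("fmsDb"));
  if (!fmsDb) {
    LOG_ERROR << "No StFmsDbMaker in chain" << endm;
    return kStErr;
  }
  const Int_t detectorId = 8;  // one of the FMS sub-detector IDs
  for (Int_t ch = 1; ch <= fmsDb->maxChannel(detectorId); ++ch) {
    Float_t gain = fmsDb->getGain(detectorId, ch);
    Float_t corr = fmsDb->getGainCorrection(detectorId, ch);
    // ... use gain * corr when converting ADC to energy ...
  }
  return kStOK;
}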

Method 1

There are macros for uploading and reading the tables as part of the StFmsDbMaker distribution, though they don't use StFmsDbMaker itself. They can be found in the macros/ subdirectory. These read in a text file, parse the information appropriately and upload it to the database, or do the reverse (read the database and write out the text file). The easiest way to see the text format for each is simply to run the macro in "read DB" mode. I believe these were written by Jingguo, and while they seem to work as expected, they are not always very clearly written or commented, and the timestamps at which to read/write are hard-coded, so you have to edit them. Note that there are currently no macros there for uploading the LED and pi0 mass tables.

Method 2

I wrote my own class to handle FMS database interaction. It is not currently part of an official distribution, but you can find the code here:
/star/u/tpb/StRoot/StFmsDbUtilities
This code is based on Jingguo's macros in StFmsDbMaker, but is compiled (so it should be safer) and supports the LED and pi0 mass tables. There is an example script for using it here:
/star/u/tpb/fmsSoftware/fmsDbUtilities.cpp
You can use the "run()" function in this script to read and/or write between the database and a text file for whatever timestamp and flavour you like. Note that the format of the text files is the same as for the StFmsDbMaker macros. This would be my preferred method going forward, rather than using the StFmsDbMaker macros. The one caveat is that it hasn't been reviewed by anyone other than me, so I won't guarantee it to be bug-free! However, if we at some point manage to create an "StFmsPool" software pool like the one the spin group has, this would be a good addition.

Method 3

As described above, the database structure viewer gives an example read and write macro for each table. Therefore, if you decide you don't like methods 1 and 2, you can use the examples as a basis for your own. However, note that the DB viewer examples are just skeleton macros that don't actually read or write the data for the particular structure.

However, one thing you may want to use even if you go with method 1 or 2 is a "reupload" script that Dmitry sent me privately. This simply copies an entire table at some timestamp to another timestamp, with no modification. This is useful if you want to propagate information that doesn't change year-to-year (e.g. geometry), or to put in dummy values at the start of the year when initialising the database for a run (see below). The reupload macro can be found here:
/star/u/tpb/fmsSoftware/table_reupload.C
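If you can't lay hands on that script, the underlying operation is just a fetch followed by a store at a new timestamp. Below is a minimal StDbLib sketch of that pattern (not Dmitry's actual macro); the table name, flavour and timestamps are illustrative, and writing of course requires database write privileges.

// reuploadSketch.C - copy a table's contents from one timestamp to another.
// Table name, flavour and timestamps are illustrative only.
void reuploadSketch() {
  gSystem->Load("St_base");
  gSystem->Load("libStDb_Tables");
  gSystem->Load("StDbLib");

  StDbManager* mgr = StDbManager::Instance();
  StDbConfigNode* node = mgr->initConfig("Calibrations_fms");
  StDbTable* table = node->addDbTable("fmsGain");
  table->setFlavor("ofl");

  // Fetch the existing entry...
  mgr->setRequestTime("2014-12-20 00:00:01");
  mgr->fetchDbTable(table);

  // ...and store the identical contents at the new timestamp.
  mgr->setStoreTime("2015-12-20 00:00:01");
  mgr->storeDbTable(table);
}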

Database initialisation

Towards the end of the year you will likely get an email from Jerome about "timeline initialisation". This refers to inputting initial values for all tables in the database at the official start of a STAR run, typically in December. Even if a table doesn't change from one year to another, they like it to be uploaded afresh at the start-of-run timestamp. Note also that the start-of-run date is typically different for the "offline" and "simulation" database tables - normally around 10th December for simulation and 20th December for offline. Therefore you will normally end up making two uploads for each table - one offline, one simulation. The offline and simulation values may be the same - e.g. for detector geometry - or they may differ - e.g. for gains, the offline values may be a "best guess", such as the previous year's average values, whereas the simulation entry would be "ideal" gains.
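In terms of the store calls, the two uploads for a given table differ only in the flavour and the store time, following the same StDbLib pattern as the reupload sketch above. The fragment below is illustrative only - confirm the exact flavour string ("sim" or "simu") and the official start-of-run dates before uploading anything.

table->setFlavor("sim");                    // simulation entry ("sim" or "simu" - check)
mgr->setStoreTime("2015-12-10 00:00:00");   // simulation start-of-run timestamp
mgr->storeDbTable(table);                   // "ideal" values

table->setFlavor("ofl");                    // offline entry
mgr->setStoreTime("2015-12-20 00:00:00");   // offline start-of-run timestamp
mgr->storeDbTable(table);                   // best-guess values (e.g. last year's averages)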

I placed some notes I made to myself about a previous year's initialisation here. Hopefully that page and the links from there will be of help.