QuickStart Instructions for the Auto QA Browser - Run 11

  • Go to the STAR Computing Offline QA home page on Drupal (i.e. from the STAR Home Page select "Computing" in the left panel, then select "Offline QA" in the table row labelled "Production"), or go directly to the Offline QA home page. Open the Auto QA browser by clicking the "Automated Offline QA Browser" button in the upper portion of the page. You may have to enter the STAR protected-area username and password; contact your local STAR Council representative or me if you do not know it.
  • If the browser fails to open, contact the QA Experts ASAP. If you cannot get to the Auto QA browser, then you are S.O.L.
  • Enter your RCF login name.
  • Generally you should follow the menu on page 2. Buttons 1.1 and 2.1 direct you to pages where Real Data Production jobs and Fast Offline Production jobs can be selected. Note that for Run 11 the Offline QA shift crew is responsible only for the fast-offline production (button 2.1) and for monitoring database migration (button 2.3).
  • At the beginning of your shift please check the status of the online-to-offline database migration and notify the QA experts and QA hypernews if the migration appears to be stalled.
  • For Real Data Production (Button 1.1) you will typically want to filter out previously reviewed jobs and select all other run numbers for production within the past 24 hours. Queries can be made based on QA status, run number or date, and software production version.
  • For the Fast Offline Production QA (Button 2.1), select the most recent run numbers which have not yet been QA reviewed. Other queries are available. You may examine the 24 TPC sectors or view the entire set of QA histograms by selecting the appropriate options in the upper portion of page 2.1. For Run 11 Au-Au, one file sequence should suffice, but if you need greater statistics (e.g. for the p-p data) you may use the "Combine several jobs together" feature so that the histograms contain enough statistics for proper evaluation. At a minimum, please examine QA histograms for each trigger set for at least one file sequence per run number. Do more if you have time.
  • Beginning with Run 11 there are two additional options for selecting and reviewing QA jobs, noted by the button labelled "New: Select jobs that have been automatically combined" and the 3 job selection options listed as "TESTING" buttons. The former, combined-jobs option may give a more complete list of available jobs. The latter "TESTING" options, as the name implies, are still under development: they enable automated comparisons between the data and a reference (defined by the QA experts), provide an easy way to visually compare data and reference, and offer a convenient way to attach example histograms to QA issues. You are encouraged to try this option and give feedback to the QA team. Hopefully before the end of Run 11 we can rely on the auto-comparison algorithms to check the data; for now, however, we must continue to rely on the shift crew's visual evaluation.
  • TESTING option - If you use this job selection and QA evaluation option, you will be directed to a new web page which enables the histograms to be compared to a reference (defined by the QA experts) both visually and algorithmically. In the upper panel, check the run year, trigger, and version options for the data and click either "arrow" button; then click the "Plots only" button to obtain individual GIFs of each histogram (useful for documenting QA issues), or click the "Analyze" button and wait a few minutes for the results. Failed histogram auto-comparisons are listed by default, but all histogram results may be selected in the left-hand panel; the results are color coded. Clicking the "Examine" buttons on the right displays the results and a visual comparison of the histograms with the reference. Alternatively, selecting a histogram set in the left panel (e.g. general, minbias, high tower, etc.) displays a list of buttons for all the histograms in that set; clicking those buttons displays the data and reference. Thus there are two ways to display the selected data and reference. For now the list of failed histograms (those too dissimilar to the reference) can be examined, but please do not rely on the algorithmic comparisons just yet. Note that there are several HELP buttons which link to useful descriptions of the new features briefly described here. To return to the QA run selection page you must use the "Go back to QA selection" button in the upper panel.
  • Once you select a job to examine, either the QA shift set or the full set of QA histograms may be selected. Generally the smaller "QA shift" set of histograms will suffice; however, you may need to examine the full set of QA plots in order to better diagnose a suspected problem. The postscript and/or pdf files are generated on the fly and displayed on your terminal. Generally it is best to have the QA shift report web form open in a different window so that you can fill it out as you check each set of histograms, job-by-job. Please follow the instructions on the QA shift web forms and supply all requested information about yourself and the jobs you have examined.
  • You may refer to the sample QA shift histograms from Run 10 (Au-Au) in the Description of Offline QA Shift Histograms for Run 10 until Run 11 Au-Au reference histograms are ready (now available via the TESTING option), usually about 2 weeks into the run. You should become familiar with the QA Shift plots and have some idea of what to expect before taking your shift.
  • Be sure to click the "MARK" button on the page for each job examined and reported. 
  • Please mark the job as "Good" or "Bad" on this same page. Normally jobs will be "good" but under extraordinary conditions the QA shift crew may mark jobs as bad. Please consult with the QA experts before marking jobs or runs as bad.
  • After completing all the listed jobs, add whatever comments you think are useful and appropriate to the QA Shift Report. Be sure to include a useful summary for Fast Offline Data that will be helpful to the shift crew, i.e. report any changes from the previous day, including new problems or resolution of old problems. Note that the QA Issues mechanism of the web-based QA shift report form automatically monitors day-to-day changes in these issues and lists them in the QA shift report summary that is mailed to starqa-hn.
  • When new problems appear in the data, please review the list of existing QA issues and reuse an existing issue if appropriate before creating a new one. Note that there is a keyword search tool to help you find previous, relevant issues. Please follow the naming convention established for the existing Run 11 issues. You are encouraged to document the issues with histograms using the browse/upload tool in the QA issues editor web page. The "TESTING" option of the browser and the "Plots only" option on the new QA page provide an easy way to grab and upload individual histogram plots (GIFs). Refer to the Help buttons on the new page and click "full topic list", then select "Grabbing a histogram image and attaching to an issue" for instructions - i.e. right-click on the image, save it to your computer, then in the QA issues page select "Image attachments" and upload your saved GIF file.
  • MOST IMPORTANT!!! If you suspect any problem with the detector(s), calibrations, reconstruction, or production, you must contact the appropriate expert(s). This is the basic reason for having the Auto QA system and these dedicated shifts. The experts may be contacted via either the QA Experts or Other Experts web pages.
  • Complete your QA Shift Report and submit it. The ASCII text version will be emailed to 'starqa-hn'.
  • Links to QA documentation, contacts, the Rcas/LSF monitor, Online Run Log, and the QA shift report web form are available from Page 2.
  • Finally, you are done for the day; go get some rest!