
Run 7 BTOW Calibration


Guide to AFS and ACLs


Run VII preparation, meeting #15

  • 14:00 Db migration from online to offline (00:10, 1 file) - Mike DePhillips (BNL)
  • 14:10 Elog changes and improvements (00:05, 0 files) - Levente Hajdu (BNL)
  • 14:15 FastOffline status (00:05, 0 files) - J. Lauret (BNL)
  • 14:20 online/RTS CVS tree in AFS - feedback? (00:10, 0 files) - All
  • 14:30 Catch up items (00:10, 0 files) - All
  • 14:40 From run preparation to run support (00:10, 0 files) - All

Embedding HF meeting #3


The Magnetic Monopole in STAR


Introduction

Run VII preparation, meeting #14

  • 14:00 Status reports (see details) (00:20, 0 files) - All
  • 14:20 nameserver issue online (00:10, 0 files) - Jeff / Wayne (BNL / BNL)
  • 14:30 Replacement for rts01 / rts02 (00:10, 0 files) - Jeff (BNL)
  • 14:40 Online QA (00:10, 0 files) - Frank Laue (BNL)
  • 14:50 FastOffline & TriggerCount issues and needs (00:10, 0 files) - J. Lauret (BNL)

Embedding HF meeting #2

Requested: Olga, Manuel, Alex, Jamie, Maxim
Purpose: Review of all HF requests, currently open questions, priorities, policies, and understanding of the communication channels.

Attendees: Jamie, Manuel, Maxim, Alex (Olga could not attend due to personal matters)

The meeting started with a general reminder of the procedure governing communication between the embedding team and the PWGC/PWG. The current general (implicit) understanding is that the process follows these steps:

  • A PA makes a request within the PWG; the PA is approved by the PWGC to move along with making a formal request.
  • The Web interface is used to make the request. The PA is the contact person.
    • Note: requests may be prioritized by the PAC or the S&C leader depending on (respectively) physics priorities or technical feasibility and optimization.
  • Upon submission, the request is inspected by the embedding team.
    • Clarification may be requested from the PA.
    • The request is modified until it becomes comprehensive.
  • A sample is produced and QA is requested.
  • Feedback is provided, and the full request is scheduled.

Several areas of this procedure may need reshaping:
  1. The request page is ambiguous: an effort was started to reshape the interface (see [node:3206] for more information, tasks 49 to 51).
  2. The communication is ambiguous:
    1. Often a single email is sent, with no reminders (on either side) and no pro-active checks for follow-ups.
    2. The S&C meeting status overview is generally not attended by PWG representatives.

As follow-ups to this discussion:
  • The requirements for a new embedding request form will be sent to all PWGs. We expect the PWGs to take a pro-active look at them and give feedback on what was or was not clear in the past and on what the requirements do or do not address.
  • We assume that the new interface will later allow automated reminders to be sent to the PA via cron; a db back-end will be needed (implementation detail).
  • We propose that the embedding coordinator pro-actively send reminders (or delegate tasks to the embedding team, consistent with moving the tasks along faster).
  • We will assume that the PWGCs are primarily responsible for ensuring that requests are answered on the PWG side.

We discussed the requests from the PWGC as listed on [node:3987] (comment 127). We inspected the first 9 requests, keeping in mind the suggestions made in comment 126. The following was agreed upon:
  • Request 7, 1154003633, is problematic mostly on the trigger side and may need to be addressed at a later time embedding-wise; it will be put on hold.
  • Request 8, 1154004721, should have a related paper coming out soon. We will keep it in.
  • Requests 1 and 2 should be the highest priorities.


Procedure and action items:
  • An embedding request is best submitted in relation to an ongoing/incomplete analysis. A PWG should first proceed with a simulation request (much simpler, as it does not require the extra dimension of data mixed with simulation and hence avoids confusion with chain, timestamp, etc.) and then move to embedding for semi-final results.
  • The vertex options VFMCE and VFFV aim to consolidate the embedding framework.
    • VFFV: uses a default vertex at (0,0,0) but is envisioned to be used to set the vertex to whatever appropriate value is taken from external knowledge, in particular via the method StFixedVertexFinder::SetVertexPosition(). A possible use relates to the (hopefully) old way of doing embedding where an external file contains the vertex information; this option could be used to set "a" vertex event by event (see the sketch after this list).
      The method was implemented by Lee Barnby at the request of the S&C project in May 2006.
    • VFMCE: this option was added to replace the StEvent vertex by the Monte-Carlo vertex. The implementation simply overwrites the vertex.
      The method was implemented by Jerome Lauret & Lee Barnby in May 2006.
      The Sti vertex constraint was fixed by Victor Perevoztchikov. Beware that the default vertex errors are 1 micron in all directions if set to null.
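
As an illustration of the VFFV use case above, here is a minimal sketch of setting "a" vertex event by event from an external file. Only StFixedVertexFinder::SetVertexPosition() is taken from these notes (assumed here to take the three coordinates); the include path, file format, and surrounding code are illustrative assumptions, not the actual embedding framework code.

// Hypothetical sketch: feed VFFV one vertex per event from an external
// text file with one "x y z" triplet per line. Only SetVertexPosition()
// is from the notes; the include path and file format are assumed.
#include <fstream>
#include "StGenericVertexMaker/StFixedVertexFinder.h"  // assumed location

void setNextVertex(StFixedVertexFinder* finder, std::ifstream& vtxFile) {
  double x, y, z;
  if (vtxFile >> x >> y >> z) {
    // Override the VFFV default of (0,0,0) with the externally known vertex
    finder->SetVertexPosition(x, y, z);
  }
}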

While we believe the vertex issue is very secondary to Xiaoyan's analysis, it was agreed that the vertex finder to be used is VFMCE, dropping any vertex from the data event and using the Monte-Carlo vertex. This will be the case for all Cu+Cu and p+p embedding requests, including 2006 data.
The code for VFMCE needs backward propagation, pending a bug-tracking report to be submitted by Andrew Rose (chain and input file needed to debug).

Spin PWG Meeting (2/22/07)

Statistics for Late Longitudinal Running
  • ~ 400 runs (7132001 - 7156028)
  • ~ 6.2 million events
  • ~ 170K Neutral Pions for HTTP L2 Gamma trigger
  • ~ 80K Neutral Pions for HT2 Trigger
Trigger Ratios

BBC ADC Signatures of SpaceCharge azimuthal anisotropy


In my azimuthal sDCA studies, I observed static and dynamic azimuthal asymmetries in the CuCu200 data. One suggested way to avoid this in upcoming heavy ion runs is to see whether any online scalers could be used to give feedback to C-AD as to whether backgrounds at STAR are acceptable in terms of TPC ionization distortions. Toward this end, I have begun a study of whether the BBC scalers might be useful in this regard.

It is important to understand some things about the BBC geometry (see this postscript file). Ideally, the large outer tiles are closer to the TPC and more likely represent information relevant to what is happening in the TPC. We would most likely want to implement scalers for these in an upcoming heavy ion run. Unfortunately, in the past we have only taken ADC data with these large tiles. I have already done a study using scaler information from the small BBC tiles which demonstrates some fluctuations at similar azimuth to where the TPC shows problems.

For this study, I use only the outer ring of large BBC tiles on the east and west ends separately. These twelve tiles (numbered 25-36) are covered by 4 PMTs (numbered 21-24), which integrate over 3 tiles each, spanning nearly equivalent ranges of azimuth. Also, to avoid biasing the ADC contributions towards those from triggered nuclear collisions, only zerobias events are useful for such analysis.

I have arranged the following plots similarly to my azimuthal sDCA studies. I define NADC (normalized ADC values) to be the rate of a given PMT's ADC divided by the sum of all 8 large tile PMT ADCs. This is done simply to normalize against overall changes in gains.
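
To restate this normalization concretely, here is a minimal sketch in the style of the macros elsewhere on these pages; the array names and the filling of the input ADC values are assumed, only the formula follows the text above.

// Sketch of the NADC normalization: each large-tile PMT's ADC divided by
// the sum over all 8 large-tile PMT ADCs (4 east + 4 west), so that
// overall gain changes cancel. Inputs are assumed filled elsewhere.
void nadc() {
  const int nPMT = 8;            // PMTs 21-24 on each of the east and west ends
  double adc[nPMT] = {0};        // per-PMT ADC values (assumed filled from zerobias events)
  double nadcVal[nPMT];          // the normalized values, NADC

  double sum = 0;
  for (int i = 0; i < nPMT; ++i) sum += adc[i];
  for (int i = 0; i < nPMT; ++i) nadcVal[i] = (sum > 0) ? adc[i] / sum : 0;
}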

In the plots below, the first row shows the <NADC> (mean normalized rate) value for each phi bin. In the subsequent rows, I plot NADC-<NADC> versus day (with each phi bin offset) to see what's happening on sub-day time scales, and versus day in a colored 2D graph to see what's happening on larger time scales. There are three sets of plots: the first is all the CuCu200 BBC FF ADC data I have for zerobias triggers; the second is the CuCu200 BBC RFF ADC data I have between days 20 and 46 for all triggers (zerobias trigger data only seems to be available after day 32); the third is RFF zerobias data only for days 37-44 (runs 6037039-6044001) to provide more direct comparison with the CuCu200 data in the sDCA studies.
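
For concreteness, a minimal ROOT sketch of the colored 2D presentation, NADC - <NADC> versus day with one row per phi bin; the binning, names, and filling step are illustrative assumptions, not taken from the study.

// Sketch of the 2D color view: NADC - <NADC> vs. day, one row per PMT
// (phi) bin, drawn with ROOT's "colz" option. Binning is illustrative.
#include "TH2.h"
#include "TCanvas.h"

void drawNadcVsDay() {
  const int nPhi = 4;                         // 4 large-tile PMTs (phi bins) per side
  TH2D h2("h2", "NADC - <NADC> vs. day;day;phi bin",
          27, 20, 47, nPhi, 0, nPhi);
  // h2.SetBinContent(dayBin, phiBin, dev);   // per-(day, phi) deviations, filled elsewhere
  TCanvas c("c", "NADC deviations");
  h2.Draw("colz");                            // the colored 2D graph
  c.SaveAs("nadc_vs_day.png");
}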

Additional comments below the plots.

CuCu200 FF

Zerobias triggers.

[Plot grid: 2005 CuCu 200 FF, large outer tiles east and west; rows: <NADC>, NADC - <NADC> vs. day (phi bins offset), NADC - <NADC> vs. day (2D color)]

CuCu200 RFF

Zerobias triggers only started in this data on day 32, so this is ALL TRIGGERS starting on day 20.

[Plot grid: 2005 CuCu 200 RFF, large outer tiles east and west; rows: <NADC>, NADC - <NADC> vs. day (phi bins offset), NADC - <NADC> vs. day (2D color)]

Just a small note as to why I started with day 20 (even though there is earlier data): there seemed to be some notable changes in running conditions and/or gains during the period prior to day 20, as can be seen in these variance charts:

[Variance charts: 2005 CuCu 200 RFF, large outer tiles east: NADC - <NADC> vs. day]

CuCu200 (days 37-44)

Zerobias data, RFF.

[Plot grid: days 37-44, large outer tiles east and west; rows: <NADC>, NADC - <NADC> vs. day (phi bins offset), NADC - <NADC> vs. day (2D color)]

While there are some faint signs of fluctuations in the azimuthal BBC large tile ADC data, these fluctuations seem to be rather small. My conclusion at this point is that the BBC ADC data is not as sensitive to the problematic backgrounds as the BBC scaler data, despite being at larger radius.


Gene Van Buren

Embedding HF meeting #1

Requested: Olga & Andrew
Purpose: Technical and historical background & status update


Olga provided a list of all embedding requests and their status. We especially discussed requests 1154004033, 1154004074, 1166698516 and 1166698601 (highlighted). The first two were discussed in August 2006 via email as well as on diverse lists and fora. For historical completeness, it was mentioned that none of those requests was completely defined until much later: the first two not until October 18th, when the meaning of "good B-EMC data" was explained and sorted out. The same applies to the latter two requests, which were sorted out (same issue) on the 7th of January 2007.


Action item:
  • Andrew will send Jerome the related emails regarding feedback gathering and the run selection provided by the PAs.

On the technical/physics details: there are no apparent technical issues at this stage; the code seems to be running. The default vertex finder (improved Minuit), relying on the B-EMC, would likely be problematic, and in fact no vertices are currently reconstructed. This may be due to many issues (one would need to trace and debug). The following options were suggested:

  1. VFFV: fixed vertex finder (0,0 TBC)
  2. VFMCE: vertex finder using the Monte-Carlo vertex (as within McEvent)

The choice of one or the other depends on the physics objectives. When queried in mid-2006, it was unclear what the PA's perception or needs were regarding the vertex.


Action items:
  • Andrew will check the VF options and report (one job each, or 10 events)
  • The choice of VF will be sorted out with the PWGC and the PAC on Thursday.

Some issues and concerns were raised about the meaningfulness of the Upsilon embedding. The low number of Upsilons, and an analysis dominated by PID cuts and other analysis-related error bars, may indicate that a full embedding scheme would be overkill. Also, it is understood that the current steps taken by the PA (going for a pure simulation run for the electron request) are an 'alternate choice' rather than a 'preferred choice', and it is agreed that we need to actively understand the request and its intent once again (and certainly "fully" understand it this time).


Action item:
  • These issues will be discussed with the PWGC on Thursday

The issue of priority relevance was raised: should the priorities remain as they are, or be revisited for the "post-QM" reality?


Action item:
  • Jerome will get back to Jamie and/or Alex & Manuel on this issue and get a new priority ordering out
  • Jerome will set up the meeting with the PWGC, PAC, and Olga + Maxim if needed. To first order, it appears that the requests would revert to embedding requests, apart from the Upsilon one, which remains to be discussed.

Here are the details of the Heavy Flavor requests:

First, the easy ones:
1121704015 J/Psi Au+Au
1112996520 pi0 62 GeV Au+Au
1112996461 e 62 GeV Au+Au
all closed by convenor's request

1154003633 J/Psi embedding for pp2006
1154003721 Upsilon embedding for pp2006
not tried yet; the pp2005 troubles have not been overcome

1154003879 electron embedding for Cu+Cu 2005
test sample done, QA done, production started

1154003931 pi0 embedding for Cu+Cu 2005
not yet done, but should not be a problem

1154003958 gamma embedding for Cu+Cu 2005
not yet done, but should not be a problem

1154004033 electron embedding for p+p 2005
1154004074 pi0 embedding for p+p 2005
these two have been attempted many times and failed for various reasons; the latest failure: no primary vertex found in reconstruction


1166698660 J/Psi embedding in Au+Au with shifted vertex
Sample produced, never heard from PAs (done in a rush before QM), no QA

1166698180 Ds in d+Au
This request was turned into embedding just last week or so (it was pure simulation before).


1166698516 J/Psi embedding in Cu+Cu
1166698601 Upsilon embedding in Cu+Cu events
these two could be done, but it would be good to get some feedback on the QM J/psi sample first.

1093385624 electron embedding for Au+Au 62 GeV
not done yet. ... could be very well approximated by other data available.


Modified Birmingham Files

Upload of modified embedding infrastructure files used on the Birmingham NP cluster for the Cu+Cu (anti-)Λ and K0S embedding request.

P04if

Hit level check-up:
  • Missing/Dead Areas (PiMinus): The next graphs show dead sectors for embedded data as well as real data.
  • [Figure: Hits-P04if-PiMinus_hitsXYeast_p2.gif]

QA Documentation

New embedding Base QA instructions

Run VII preparation, meeting #13

  • 14:00 Update on the Electronic ShiftLog (00:10, 0 files) - Levente Hajdu (BNL)
  • 14:10 Update on the new online WebServer readiness (00:10, 0 files) - Mike DePhillips (BNL)
  • 14:20 Path for Ganglia online (00:10, 0 files) - All
  • 14:30 DAQ CVS in AFS (00:10, 0 files) - Jeffery Landgraf / Jerome Lauret (BNL / BNL)
  • 14:40 RealTime EventDisplay (00:10, 0 files) - Valeri Fine (BNL)
  • 14:50 AOB (00:10, 0 files) - All

Production Management

1) Usually embedding jobs are run in "HPSS" mode so the files end up in HPSS (via FTP). To transfer them from HPSS to disk, copy the perl script ~starofl/hjort/getEmbed.pl and modify it as needed. This script does at least two things that are not possible with, e.g., a command-line hsi command: it only gets the files needed (usually the .geant and .event files) and it changes the permissions after the transfers. Note that if you do the transfers shortly after running the jobs, the files will probably still be in the HPSS disk cache and the transfers will be much faster than getting the files from tape.

Running Embedding

This page describes how to run embedding jobs once the daq files and tags files are in place (see the other page about embedding production setup).

Basics:

Embedding code is located in production-specific directories: ~starofl/embedding/P0xxx. The basic job submission template is typically called submit.starofl.pl in that directory.

Embedding Production Setup

This page describes how to set up embedding production. This procedure needs to be followed for any set of daq files/production version that requires embedding. Since this typically involves hacking the reconstruction chain, it is not advised that the typical STAR PA attempt this step. Please coordinate with a local embedding coordinator and the overall Embedding coordinator (Olga).

Plot_Nfit.C

#ifndef __CINT__
#include "TROOT.h"
#include "TSystem.h"
#include
#include "TH1.h"
#include "TH2.h"
#include "TH3.h"
#include "TFile.h"
#include "TTree.h"
#include "TChain.h"

Plot_Dca.C

#ifndef __CINT__
#include "TROOT.h"
#include "TSystem.h"
#include
#include "TH1.h"
#include "TH2.h"
#include "TH3.h"
#include "TFile.h"
#include "TTree.h"
#include "TChain.h"

scan_embed.C

//v 1.7 2007/05/21 23:16:14
// owner: Cristina
#ifndef __CINT__
#include "TROOT.h"
#include "TSystem.h"
#include
#include "TH1.h"
#include "TH2.h"