
Part II

09:00  New QCD Frontier for RHIC -- an exotic particle factory (00:30), Huan Huang (UCLA)
09:30  Gamma-Gamma HBT and DAQ1000 (00:30), Jack Sandweiss (Yale)
10:00  Physics with tagged forward protons in the DAQ 1000 era (00:20), Wlodek Guryn (BNL)
10:20  Coffee (00:20)
10:40  Using the HFT as a trigger device (00:30), Howard Matis (LBL)
11:10  Some facts and thoughts on fast online tracking (00:30), Aihong Tang (BNL)
11:40  Lunch (01:50)
13:30  Summary of Physics Data Sets (00:30), Dick Majka and Zhangbu Xu (BNL)
14:00  Brainstorming: discussion to identify key points and especially action items, and to generate bullet points for the close-out report (02:00), All
16:00  Adjourn (00:20)

DAQ1000 Capabilities and PWGC Prospects

09:00  Welcome (00:15), Tim Hallman (BNL)
09:15  DAQ1000 capabilities (00:30), Tonko Ljubicic (BNL)
09:45  Trigger in DAQ1000 (00:30), Hank Crawford (UCB/SSL)
10:15  Coffee (00:20)
10:35  Software and Computing Capabilities and Issues (00:45), Jerome Lauret (BNL)
11:20  High Pt Physics in the DAQ 1000 Era (00:30), M. van Leeuwen (LBL)
11:50  Spectra Physics in the DAQ 1000 Era (00:30), Nu Xu (LBL)
12:20  Lunch (01:40)
14:00  UPC Physics in the DAQ 1000 Era (00:30), Boris Grube (Pusan)
14:30  Heavy Flavor Physics in the DAQ 1000 Era (00:30), Manuel Calderon (UCD)
15:00  E by E Physics in the DAQ 1000 Era (00:30), Aihong Tang (BNL)
15:30  Coffee (00:20)
15:50  Spin Physics in the DAQ 1000 Era (00:40), Jim Sowinski (IUCF)
16:30  Estruct Physics in the DAQ 1000 Era (00:20), Lanny Ray (UT, by phone)
16:50  Open (00:30)
17:20  Strangeness Physics in the DAQ 1000 Era (00:30), Matt Lamont (BNL)
17:50  Adjourn (00:40)

Physics in the DAQ 1000 Era

Thursday, 16 August 2007, 09:00 to Friday, 17 August 2007, 15:30 (Etc/GMT-5)
Location: Physics seminar room
Conference duration: 2 days

The goal of the workshop is to re-examine and assemble the key physics STAR will be addressing over the next eight years in light of the greatly increased DAQ rate capability and increas

Beampipe support geometry and other news

Filed under: Documentation for the beampipe support geometry description development

L2 status tables

Instead of producing lots of 2007EmcMbStatus runs to generate pre-production status tables, I thought we should look into using the compressed tower spectra Jan saved in the l2ped monitoring program. I wrote a script to regenerate histograms from these ASCII lines and asked Matt Cervantes to run the CSMStatusUtils code on them. I also asked him to run the status code on standard histograms produced by analyzing MuDsts for a single test run (8141062). I compiled some statistics on the differences between the offline status and the L2 status:

  1. offline == 1, L2 != 1: 76 towers
  2. offline != 1, L2 == 1: 8 towers
  3. both bad, but for different reasons: 40 towers
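
The tally above amounts to comparing two tower-status maps. Here is a minimal sketch of that comparison in Python; the status-code convention (1 == good) follows the text, but the function name and the example map contents are made up for illustration:

```python
# Hedged sketch: tally differences between two tower-status maps.
# Convention assumed from the text: status 1 == good, anything else == bad.

def compare_status(offline, l2):
    """offline, l2: dicts mapping tower id -> integer status code."""
    good_offline_only = []     # offline == 1, L2 != 1  (Case 1)
    good_l2_only = []          # offline != 1, L2 == 1  (Case 2)
    both_bad_differently = []  # both bad, but for different reasons (Case 3)
    for tower in sorted(set(offline) & set(l2)):
        off, lvl2 = offline[tower], l2[tower]
        if off == 1 and lvl2 != 1:
            good_offline_only.append(tower)
        elif off != 1 and lvl2 == 1:
            good_l2_only.append(tower)
        elif off != lvl2:  # both != 1 here, so differing codes mean Case 3
            both_bad_differently.append(tower)
    return good_offline_only, good_l2_only, both_bad_differently

# Toy example with made-up tower ids and status codes:
offline = {1: 1, 2: 1, 3: 2, 4: 4}
l2 = {1: 1, 2: 2, 3: 2, 4: 8}
case1, case2, case3 = compare_status(offline, l2)
print(len(case1), len(case2), len(case3))  # -> 1 0 1
```

In the real comparison the counts of those three lists would be the 76 / 8 / 40 quoted above.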

Some comments:

  • L2 status had trouble catching stuck bits (towers 220, 1143, 1612, 2188) and recognizing cold towers (187, 4595). These two scenarios account for nearly all of the cases where L2 marked a tower good and offline did not.
  • L2 marked a number of towers with high pedestals as "cold", since they have zero counts above 60. Most of the differences in Case 1 are due to this problem.
  • Overall the agreement is very good -- fewer than 2% of towers differ if all you care about is whether status == 1.
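
For the stuck-bit failures mentioned above, one simple offline-style check (a hedged sketch, not the actual CSMStatusUtils or L2 algorithm) is to AND and OR all ADC readings from a tower: a bit that survives the AND is set in every reading (stuck at 1), and a bit absent from the OR never fires (possibly stuck at 0):

```python
def find_stuck_bits(adc_values, nbits=12):
    """Return (stuck_at_1, stuck_at_0) bit masks for one tower's ADC readings.

    Caveat: with low statistics the high bits may legitimately never fire,
    so stuck-at-0 flags need a minimum-statistics requirement in practice.
    """
    full = (1 << nbits) - 1
    all_and = full
    all_or = 0
    for v in adc_values:
        all_and &= v  # keeps only bits set in every reading
        all_or |= v   # accumulates every bit seen at least once
    stuck_at_1 = all_and
    stuck_at_0 = ~all_or & full
    return stuck_at_1, stuck_at_0

# Toy readings where bit 3 (value 8) is set in every sample:
readings = [8, 9, 12, 10, 15]
s1, s0 = find_stuck_bits(readings)
print(bin(s1))  # -> 0b1000
```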

We'd like to tweak things a little to see if we can resolve the few remaining differences. In particular, marking all those high-pedestal towers as cold could hurt the vertex-finding efficiency a bit, and that's all we really care about in this pass.

Detailed status codes and histograms are available at the bottom of the post.

Xgrid jobmanager status report

  • xgrid.pm can submit and cancel jobs successfully; I haven't tested "poll" since the server is running WS-GRAM.
  • The Xgrid SEG module monitors jobs successfully. The current version of Xgrid logs directly to /var/log/system.log (readable only by the admin group), so there is a permissions issue to resolve there. My understanding is that the SEG module can run with elevated permissions if needed, but for the moment I'm using ACLs to explicitly allow the "globus" user to read system.log. Unfortunately, the ACLs get reset when the logs are rotated nightly.
  • CVS is up to date, but I can't promise that all of the Globus packaging actually works; I ended up installing both the Perl module and the C library into my Globus installation by hand.
  • The current test environment uses SimpleCA, but I've applied for a server certificate from pki1.doegrids.org as part of the STAR VO.

Important Outstanding Issues

  • Streaming stdout/stderr and staging out files is a little tricky. Xgrid requires an explicit call to "xgrid -job results"; otherwise it just keeps all job info in the controller DB. I haven't yet figured out where to inject this system call in the WS-GRAM job life cycle, so I'm asking for help on gram-dev@globus.org.
  • We need to decide how to do authentication. Xgrid offers two options at opposite ends of the spectrum: a common password shared by all users, or Kerberos (K5) tickets. Submitting a job through WS-GRAM involves a round trip from the user account to the container account and back via sudo, and I don't know how to forward a TGT for the user account through all of that. I looked around and found a "pkinit" effort that promises passwordless generation of TGTs from grid certificates, but it doesn't seem quite ready for prime time.

Hit density in FGT region

I am using FTPC hits to study the hit density in the forward region. For this I use files from run 7145009, where the BBC coincidence rates were around 500 kHz.
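
The density estimate boils down to hits per unit area per event in radial bins. A minimal sketch of that bookkeeping follows; the function name, bin edges, counts, and event total are placeholders for illustration, not numbers from run 7145009:

```python
import math

def hit_density_per_event(radial_counts, r_edges_cm, n_events):
    """Convert hit counts in radial annuli into hits / cm^2 / event.

    radial_counts: hits summed over all events in each annulus
    r_edges_cm:    annulus edges in cm, len(radial_counts) + 1 entries
    n_events:      number of events analyzed
    """
    densities = []
    for i, n in enumerate(radial_counts):
        # area of the annulus between consecutive radial edges
        area = math.pi * (r_edges_cm[i + 1] ** 2 - r_edges_cm[i] ** 2)
        densities.append(n / (area * n_events))
    return densities

# Placeholder numbers, NOT measured values:
print(hit_density_per_event([12000, 9000], [10.0, 15.0, 20.0], 1000))
```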

Cluster Finding

Speaker : Vi Nham

Talk time : 00:20, Duration : 00:10

Software Issues

Talk time : 00:59, Duration : 00:15

RDO Testing

Speaker : Michael

Talk time : 00:54, Duration : 00:05

Future meeting schedule in August and September

Speaker : Howard

Talk time : 00:49, Duration : 00:05

Meeting with Tim on the Future of the SSD

Speaker : Howard

Talk time : 00:39, Duration : 00:10

Web Documentation of SSD Removal Status

Speaker : Artemious

Talk time : 00:30, Duration : 00:09

See http://www.star.bnl.gov/STAR/ssd/pictures/removal2007/