Running the Jet Code

The objective of this blog is to document how to run the inclusive and dijet cross-section code in the STAR Framework. The dataset described here is the Run 9 pp 500 GeV data, but the procedure may be applied to any other data set. I will try to cover all the relevant pieces of code, their locations, how to execute them, and the expected output. 

OUTLINE: 

    Creating Jet Trees:
         - Data
         - Embedding
         - Underlying Event

    Jet Quality Analysis 

    Creating Dijet Trees

    Reading Dijet Trees

    RooUnfold and Calculating Cross Sections

    Embedding 
 
    Inclusive Jet Tree

    Underlying Event

Creating Jet Trees:
 
First, a couple of points: the current code in the STAR Framework has been designed to produce *.jet.root, *.skim.root, and *.ue.root trees/files. It can run multiple jet-finding algorithms, stores all pertinent information about the jets produced in every event, stores the trigger and spin-bit information, implements underlying event calculations, etc. It is important to note that this code is stored in the CVS repository. While you can pull down the current version of the code and make changes to it for your own analysis, if you significantly change the code you will need to commit it to CVS. The changes you implement should follow the STAR guidelines and be properly documented. 

    The Code:  

        The majority of the code for running the jet-finding algorithm and creating the trees is located at: StRoot/StJetMaker, StRoot/StJetFinder, and StRoot/StSpinPool. Any significant changes to this code should be reviewed and committed. It is strongly encouraged that you examine the details of the code to understand the STAR framework and to better grasp the concepts of a jet analysis. 

        The main macros to execute and create jet, skim, and ue trees can be found at StRoot/StJetMaker/macros/. These macros use the classes and code listed above. I find it very helpful and beneficial to understand the code in these macros line by line. You will find many macros in this directory, which were used to run the JetMaker over the years. The most recent and most useful are RunJetFinder2009_ue.C and RunJetFinder2012pro.C. These macros take the MuDst files (very large in disk space) as input and produce the jet, skim, and ue trees.

      You should be able to pull down these macros from CVS into an empty directory, and in the command line execute the following:

      root4star -b -q RunJetFinder2012pro.C > logfilename & 
 

and after completion it should produce the jet, skim, and ue root files of interest. Check your input and output file directories and names, as these will change for your analysis. A couple of items to note: depending on the data of interest, the trigger IDs will be different and you will need to edit lines like "filterMaker->addTrigger(380401)"; this number would need to be changed. Also, look over the different analysis parameters, which you can see in the declaration of the StAnaPars class and the subsequent pointers. This will allow you to tweak the jet, track, and tower cuts to suit your analysis needs. This is the code you will edit the most when you first start your jet analysis. So play around, do some tests, and figure things out. 
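Conceptually, the trigger filter just keeps events that fired at least one of the trigger IDs you registered. Here is a toy sketch in plain C++ (a stand-in, not the actual STAR maker API; the class and method names are illustrative):

```cpp
#include <set>
#include <vector>

// Toy stand-in for the trigger filter: an event is kept only if it fired
// at least one of the registered trigger IDs (e.g. the 380401 used in the
// macro). The real work is done by filterMaker->addTrigger(...) in STAR.
struct ToyTriggerFilter {
    std::set<int> wanted;

    void addTrigger(int id) { wanted.insert(id); }

    // firedIds: the trigger IDs that actually fired for this event.
    bool accept(const std::vector<int>& firedIds) const {
        for (int id : firedIds)
            if (wanted.count(id)) return true;
        return false;
    }
};
```

When you change datasets, the only thing that changes in this picture is the set of registered IDs, which is why those `addTrigger` lines are the first thing to edit.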

       Word of advice: The jet trees are still very large in size!! When testing anything on the jet trees, make one change to the code (not more than that), run over a small sample, and observe the effects. 

       Running over all the data: In order to make many jet trees you will want to use the STAR scheduler, which has its own quirks and documentation. The scheduler will divide your submission and spread the jobs across multiple nodes on the RHIC Computing Facility. Ultimately, proper use of the scheduler will reduce the computing time, so it is worth knowing how it works. Note: Make sure you have all the parameters and different branches within the jet trees established before processing the entirety of your data. Jet trees are not small!

       I am pointing you to an example xml file that runs the scheduler and submits jobs. There are many different options, and feel free to use mine as an example. You can find it at the following: /star/u/gdwebb/UE_CVS_Check/Run500GeVJetProduction.xml. This xml file takes the parameter "&runnumber;" as input. To submit the jobs I used the following bash script: /star/u/gdwebb/UE_CVS_Check/SubmitJetMaker2009pp500.sh. This script loops over the run numbers in your dataset, so you will need a list of the runs that you want to produce. Look at this code and make sure you understand how it works. At the end, once you have a list of good runs from your data, you will simply execute the following command to start the process:

./SubmitJetMaker2009pp500.sh 

      Make sure you have 1) the disk space to store the jets, 2) the correct input and output directories, and 3) actual information in the trees (not an empty file). 

A similar process will be needed when you produce jet and ue trees from embedded Monte Carlo simulations ("Embedding" for short). The following code is useful for running over embedding to make jet trees: 

/star/u/gdwebb/UE_CVS_Check/RunJetFinder2009emb_ue.C <- macro to make jet trees from embedding
/star/u/gdwebb/UE_CVS_Check/RunJetFinder2009fzd_ue.C 
/star/u/gdwebb/UE_CVS_Check/RunPythia500GeVJetProduction.xml <- xml file to submit jobs to condor
/star/u/gdwebb/UE_CVS_Check/RunMC500GeVJetProduction.xml 
/star/u/gdwebb/UE_CVS_Check/SubmitEmbedJetMaker2009pp500.sh <- script to run over all run numbers
/star/u/gdwebb/UE_CVS_Check/SubmitPythiaJetMaker2009pp500.sh

Jet Quality Analysis:

When you have all the parameters sorted out for your analysis, you will want to perform a Jet Quality Analysis (QA) on both the data and embedding jet trees. I am now going to veer away from code stored on CVS. The QA code I use is located at

/star/u/gdwebb/JetQA500/

Within this directory you will see directories called macros and scripts, a common theme/format I use in most of the code I write. Here are the pieces of code you will need to run a Jet Quality Analysis. 

The scripts to execute over data and embedding: 

/star/u/gdwebb/scripts/SubmitJetQA2009pp500.sh 
/star/u/gdwebb/scripts/SubmitEmbedJetQA2009pp500AllpT.sh 

which in turn submits jobs to the scheduler using the xml files below: 

/star/u/gdwebb/scripts/Run500GeVJetQA.xml
/star/u/gdwebb/scripts/Run500GeVEmbedJetQA.xml

and these will call the following macros: 

/star/u/gdwebb/macros/Run500GeVQA.C
/star/u/gdwebb/macros/RunEmbed500GeV.C

Look inside these macros and scripts. Make sure all the directory and path names are properly changed for your own analysis. This code will create histograms which can then be easily used to make plots and explore the data. Also, within the macros directory you will find many files whose names start with "Make...C". These are macros that will produce plots to visualize the data and verify that the jets are behaving properly. Typically, these plotting macros take as input data and/or embedding files that have been hadd'ed together (one for the data file and one for each partonic pT bin in the embedding files). Look at 
/star/u/gdwebb/macros/MakeEmbedDataJetQA500GeVPlots.C as an example and understand how it works. 

Note: This is personal code and it is highly recommended that you create/write your own version of a Quality Analysis. However, using the code for Run 9 pp 500 will help as a baseline for your analysis.

Here you can find the MC and Data Jet QA for Run 9 pp 500. 

Creating Dijet Trees

As mentioned earlier, jet trees are still very large in size. Therefore, I created the dijet trees to make things more manageable. You can find the code to create dijet trees at the following: 

/star/u/gdwebb/DijetCode/

and you will see similar scripts and macros directories there. However, there is also compiled code, which implements the StDiJetMaker class, and you can find it at the following:

/star/u/gdwebb/DijetCode/StRoot/StDiJetMaker/

This code takes the jet and skim trees as input and outputs dijet trees. Walk through this code and try to understand how it works. Here are the scripts of interest to execute: 

/star/u/gdwebb/DijetCode/scripts/SubmitDijets2009pp500.sh
/star/u/gdwebb/DijetCode/scripts/SubmitAllpTbinsEmbedDijets200pp500.sh
/star/u/gdwebb/DijetCode/scripts/SubmitPartonDijets2009pp500.sh
/star/u/gdwebb/DijetCode/scripts/SubmitPythiaDijet2009pp500.sh

and the respective xml files:

/star/u/gdwebb/DijetCode/scripts/RunDiJetAnalysis.xml
/star/u/gdwebb/DijetCode/scripts/RunEmbedDijetAnalysis.xml
/star/u/gdwebb/DijetCode/scripts/RunPartonDiJetAnalysis.xml
/star/u/gdwebb/DijetCode/scripts/RunPythiaDiJetAnalysis.xml

and the macros: 

/star/u/gdwebb/DijetCode/macros/RunDijetCode.C
/star/u/gdwebb/DijetCode/macros/RunEmbedDijetCode.C
/star/u/gdwebb/DijetCode/macros/RunPythiaDijetCode.C
/star/u/gdwebb/DijetCode/macros/RunPartonDijetCode.C

With the scripts, xml files, macros, and StRoot/StDiJetMaker code listed above you should be able to produce dijet trees from the data and embedding. I am also providing code that will produce jets from embedding at the detector level, particle (pythia) level, and parton level. At the end of the day, you will need all levels. The dijet code does not implement any cuts on the jets; it only selects the two highest-pT jets and stores the useful information about them. 

     Word of Advice: Do not expect code to always work right out of the box. In fact, expect the opposite. New code integrated into the system may conflict with old code, there may be a bug, or you may be executing something incorrectly. Read the error message you observe when running or compiling code, try to understand it, and see if you can resolve it. Then ask for help. 

Reading Dijet Trees

Now that we have the much smaller dijet trees, we need to read them, access useful information from them, store it in nice histograms, and make pretty plots. Making and interpreting plots is my favorite part of an analysis, so you should be very happy and proud to have reached this stage. Also, the code is relatively easy and simple to run, as follows:

scripts:
/star/u/gdwebb/DijetCode/SubmitReadDataDijetsAllpT.sh
/star/u/gdwebb/DijetCode/SubmitReadDijetsAllpT.sh
/star/u/gdwebb/DijetCode/SubmitReadPythiaDijetsAllpT.sh
/star/u/gdwebb/DijetCode/SubmitReadPartonDataDijetsAllpT.sh

macros:
/star/u/gdwebb/DijetCode/macros/ReadDijetTrees.C
/star/u/gdwebb/DijetCode/macros/ReadEmbedDijetTrees.C
/star/u/gdwebb/DijetCode/macros/ReadPythiaDijetTrees.C
/star/u/gdwebb/DijetCode/macros/ReadPartonDijetTrees.C

The scripts above will directly run the macros and produce histogram root files. The macros also implement any cuts you may want to apply for a cross section analysis. Therefore, look over the macros and see what is pertinent for your analysis. The code runs relatively quickly and will make one root file per dijet tree.

Once you have all the histogram files, you will then want to hadd them together to have one root/histogram file for data and one root/histogram file per partonic pT bin in the embedding. This will give you all the needed information to calculate a cross section using the RooUnfold package. You can find an example of some of the final files I used at the following:

/star/u/gdwebb/DijetCode/radius06/

Before we get into unfolding and cross section calculations, it is important to make sure everything with the dijet trees is behaving as expected. Therefore, you will want to examine the dijet trees, and I have listed some macros that may be helpful for exploring the dijet data files, e.g. making data and embedding comparisons: 

/star/u/gdwebb/DijetCode/macros/CompareDetectorParticleLevel.C
/star/u/gdwebb/DijetCode/macros/CompareDetectorPartonLevel.C
/star/u/gdwebb/DijetCode/macros/ComparePythiaPartonLevel.C
/star/u/gdwebb/DijetCode/macros/CompareSystematics.C
/star/u/gdwebb/DijetCode/macros/MakeCompareEmbedDataPlots.C
/star/u/gdwebb/DijetCode/macros/MakeDataOnlyPlots.C
/star/u/gdwebb/DijetCode/macros/etc. 

Here you can find a nice blog page to describe the output of many of these plotting macros.

RooUnfold:

You can find the documentation of the RooUnfold package here. It is beneficial to understand the code in this package and how it works. It can perform multiple unfolding techniques, from simple bin-by-bin corrections to Singular Value Decomposition (SVD) unfolding. As an example, I'm pointing you to my implementation of the RooUnfold package for the pp 500 GeV dataset. 

cd ~gdwebb/DijetCode/RooUnfold/

to compile the code, which you should get from the link above (see the README).

make  

cd examples 

In the examples directory you will see many files, but the ones named unfoldDetectorToParticle...C or unfoldDetectorToParton...C are the most relevant to the cross-section. If you want to explore different unfolding techniques and other examples, you can look at an older version of the RooUnfold code at the following: /star/u/gdwebb/DijetCode/RooUnfold-1.1.1/examples/
 
We are going to focus in detail on the following file, since it produced the preliminary result and will be used for a final result:

/star/u/gdwebb/DijetCode/RooUnfold/examples/unfoldDetectorToParticleLevel.cxx

We start by declaring some global variables (not always best practice) and functions:

nbins --> The number of invariant mass bins
nbins0 --> An array that defines the bin range 
comboALL --> A function to combine the 1D histograms from each partonic pT bin into one 1D histogram (very useful)
comboALL2 --> A function to combine the 2D histograms from each partonic pT bin into one 2D histogram (very useful)

The inputs to this macro are: 
TFile *g --> Access the Data file 
TFile *f[14] --> Access the Embedding File for each partonic pT bin. 
TFile *p[14] --> Access the Pythia Level Files for each partonic pT bin. (Needed for efficiency calculation) 

The next few lines indicate how the files are named and where they are located. 

Following that are some for loops that determine the luminosity (Lumi) and Normalization (Norm) values. 

Double_t ue_had --> A hard-coded array of values that accounts for the underlying event and hadronization. These values were calculated by taking the difference between the cross sections at the particle level and at the parton level. While probably still relevant for determining hadronization, the new UE methods provide a way to determine UE effects directly. Therefore, this could lead to a way to approximately determine the contribution from hadronization (interesting physics!!).

Following this is the Theory section. This calculates the systematic on the theory using different renormalization scales (2 times and 0.5 times the nominal). A final result will need a new theory calculation, and this code will be helpful as a guideline, so look it over and improve it.

Then we access the histograms created from the dijet trees and finally put them to use to make cross section calculations.  

char datahistname[100] = "hImassJP2"; --> The name of the histogram in data. Look back at the read dijet macro to see how this histogram was made 

char detectorname[100] = "hmatchedImassJP2" --> detector level histogram

char pythianame[100] = "Imass_afCut" --> Pythia Particle Level histogram

char detpartname[100] = "hdpImass_JP2" --> A 2D histogram that maps detector level to particle level. The unfolding matrix

Then we declare a few histogram pointers with the new command and use the very useful comboALL functions to combine the partonic pT bins using the relative normalization values. 

Next are the RooUnfold commands that train the response and unfold the data to the particle level. See the comments "TRAIN" and "UNFOLD". This macro uses the SVD method (RooUnfoldSvd), but you can use any other method if you want to explore (e.g. RooUnfoldBayes). 

Next we need to make some normalization corrections, since the luminosity was determined for the W analysis, which had different trigger rates and different dead times. 

The rest of the code is mostly for plotting purposes. There are two arrays of hard-coded values that are relevant for you. Check out sys_Plus and sys_minus, which are the quadrature sums of all the systematic errors for each invariant mass bin (not including the luminosity systematic). Also, you can see we normalize by the phase space; see the phase_space array.
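Both operations are simple; here is a sketch in plain C++ (the function names and values are illustrative, not taken from the macro):

```cpp
#include <cmath>
#include <vector>

// Total systematic for one invariant mass bin: add the individual
// sources (track pT, tower energy, ...) in quadrature.
double quadratureSum(const std::vector<double>& sources) {
    double sumSq = 0.0;
    for (double s : sources) sumSq += s * s;
    return std::sqrt(sumSq);
}

// Phase-space normalization: divide the yield in a bin by the bin
// width (and the measured rapidity/eta range) to obtain a
// differential cross section.
double differentialYield(double yield, double massBinWidth, double etaRange) {
    return yield / (massBinWidth * etaRange);
}
```

Since sys_Plus and sys_minus are stored separately, asymmetric sources keep their sign; only sources pushing in the same direction are summed together.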

Useful links for RooUnfold can be found here. In fact, there are many presentations located there which describe the process. The preliminary result for the dijet cross-section can be found here (a good read!).

SYSTEMATICS:

Systematic errors: one of the most important and time-consuming parts of an analysis. A measurement is only as good as the error associated with it, so we would like to have small but honest errors. Statistical errors can be reduced by repeating a measurement many times, and they are obviously not the dominant error in this analysis; the systematics are. This error represents the uncertainty we have in our detectors and calibrations. Therefore, to calculate the systematics you have to repeat the entire process with a particular systematic applied (e.g. +1% track pT, meaning increasing the transverse momenta of charged tracks by 1%). You heard right: repeat everything. When I got to the point of addressing systematics as a graduate student and learned I had to pull down my embedding, remake jet trees, apply the systematics, and do EVERYTHING over, I wanted to say #ByeFelicia. However, it's not that bad. Repeating something is always easier than doing it for the first time. And if you've read this blog page, you've braced yourself for the cold water plunge that is systematics. 

And in fact you don't have to start from scratch. I have checked code into CVS (so you know it's good) that will do the systematics for you. Bing Bang BOOM. Here is the location of a macro that will calculate systematics, enjoy: 

/star/u/gdwebb/RunJetFinder2009emb.C 

Note, you can only do systematic studies on the embedding, which is one of the main reasons to have an embedding sample. In this macro, you will see code starting with: 

StAnaPars* anapars12_93 = new StAnaPars;
/* ---- a bunch of lines with all the TPC cuts ---- */
anapars12_93->addTpcCut(new StjTrackCutRandomAccept(randomAccept)); --> This systematic randomly throws away 7% of tracks. 

NOTE: This was a very inefficient way to write the code (lots of unneeded lines). Look at how long it is with all the systematics. Instead, you can just declare a clone/copy of the original anapars12. For example, instead of building a new StAnaPars from scratch, just copy the original: 

StAnaPars* anapars12_093 = new StAnaPars(*anapars12);

This will streamline your code (make it easier to read and understand), but MAKE SURE the systematic is behaving as expected before running over everything. If you are increasing the tower energy by 5%, you should see the jet tower energy increase and the overall jet pT increase. TEST over a single file first. This concept will save you time and disk space. 
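In plain C++, the clone-and-modify pattern looks like the sketch below (a toy AnaPars standing in for StAnaPars; verify that the real class copy-constructs safely before relying on this):

```cpp
#include <vector>

// Toy stand-in for StAnaPars: holds the nominal analysis parameters.
struct AnaPars {
    double trackPtScale = 1.00;   // nominal track pT scale
    double towerEScale  = 1.00;   // nominal tower energy scale
    std::vector<int> triggers;    // registered trigger IDs
};

// Copy the full nominal parameter set, then change only the one knob
// the systematic study varies -- everything else stays identical.
AnaPars makeSystematic(const AnaPars& nominal, double trackPtScale) {
    AnaPars sys = nominal;           // copy everything from the nominal set
    sys.trackPtScale = trackPtScale; // vary only the quantity under study
    return sys;
}
```

The point is that the copy guarantees every other cut matches the nominal pass exactly, so any difference in the output is attributable to the one varied parameter.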

And I'm not going to leave you hanging; I'm pointing you to a JetFinder macro that I have streamlined and that also implements the Regional UE analysis: 

/star/u/gdwebb/UE_CVS_Check/RunJetFinder2009emb_pro.C

I wanted to point you to the original macro that I used for the preliminary result, but feel free to implement this macro as well. TEST EVERYTHING first. 

       Word of Advice: Once you have your embedding created, settle on the systematics before running the jetfinder over the entire sample, because there will be a separate branch for each systematic in the jet trees. You can save yourself some time if you know in advance which systematics will be implemented. Go to your advisor and say, "Hey all-knowing advisor, I want to process the jet trees for my entire embedding sample. But before I do, what are the systematics, so I can go ahead and make all the needed jet branches?" You may get some blank stares, but that's a'okay. They'll get back to you. 

Here you can find a talk which looks at each individual systematic before they are added in quadrature.  

 
EMBEDDING:

The embedding/simulation is needed for unfolding and calculating systematics. You don't have to do the embedding alone; you should work with the STAR embedding team to create it. The ultimate goal of the embedding is to create a simulation sample that properly mimics what is observed in the data, so you will need to do Data/MC comparisons to verify that it behaves as expected. The Run 9 pp 500 embedding is unique because it applies two separate filters: 1) a dijet filter at the pythia level (probably not needed) and 2) a trigger filter. The description of how the Run 9 pp 500 embedding was run can be found at the following: 

/star/u/gdwebb/Dijet_pp500_2009_embedding

and in this directory you will find a file named HowToRunEmbedFilter, which describes the process. There are also some very useful macros for testing the embedding sample, found at the following: 

/star/u/gdwebb/Dijet_pp500_2009_embedding/macros/

and their names usually start with "Make...C".

Typically, an embedding sample will take a lot of time and disk space. In fact, you will probably have to create the embedding sample in parts and store the files not needed for the analysis on HPSS. Very useful documentation on how to store and retrieve information can be found here. You will need to be persistent and thorough when working on an embedding sample. The Run 9 pp 500 GeV embedding is located at the following place on HPSS: 

/home/gdwebb/pTBins/ 

and it is important to note that the lower partonic pT bins in the directory above have the dijet filter. The lower partonic pT bins without the dijet filter can be found at the following: 

/home/gdwebb/EmbeddingInclusiveJet/

There is a nice description of the embedding produced at this drupal page.

INCLUSIVE JET:

By now you are somewhat familiar with the coding structure I implement to process data. The inclusive jet analysis is found at the following: 

/star/u/gdwebb/InclusiveJet/
 
and the code is similar to the Dijet Analysis. You will find the compiled code and the StInclusiveJetMaker at the following: 

/star/u/gdwebb/InclusiveJet/StRoot/StInclusiveJetMaker/

and the script and xml files are located at: 

/star/u/gdwebb/InclusiveJet/scripts/SubmitInclusivejets2009pp500.sh
/star/u/gdwebb/InclusiveJet/scripts/SubmitEmbedInclujets200pp500.sh
/star/u/gdwebb/InclusiveJet/scripts/SubmitAllpTbinsEmbedInclujets200pp500.sh
/star/u/gdwebb/InclusiveJet/scripts/SubmitPythiaInclusivejet2009pp500.sh

/star/u/gdwebb/InclusiveJet/scripts/RunEmbedInclusiveJetAnalysis.xml
/star/u/gdwebb/InclusiveJet/scripts/RunInclusiveJetAnalysis.xml
/star/u/gdwebb/InclusiveJet/scripts/RunPythiaInclusiveJetAnalysis.xml

and the macros to run the inclusive jet maker are located at: 

/star/u/gdwebb/InclusiveJet/macros/RunInclusiveJetCode.C
/star/u/gdwebb/InclusiveJet/macros/RunEmbedInclusiveJetCode.C
/star/u/gdwebb/InclusiveJet/macros/RunPythiaInclusiveJetCode.C 

The above code will produce inclusive jet trees similar to the dijet trees. Then you will need to read in the inclusive jet trees. Here is the inclusive jet reader code:

/star/u/gdwebb/InclusiveJet/SubmitReadInclujetsAllpT.sh
/star/u/gdwebb/InclusiveJet/SubmitReadDataInclujetsAllpT.sh
/star/u/gdwebb/InclusiveJet/SubmitReadPythiaInclujetsAllpT.sh

which are scripts that run the following macros:

/star/u/gdwebb/InclusiveJet/macros/ReadInclusiveJetTrees.C 
/star/u/gdwebb/InclusiveJet/macros/ReadEmbedInclusiveJetTrees.C

/star/u/gdwebb/InclusiveJet/macros/ReadPythiaInclusiveJetTrees.C

At this point, you will have made the required histograms; hadd them together, then calculate the cross-section. There are some nice macros to do cross-checks and MC/Data comparisons, found at the following:

/star/u/gdwebb/InclusiveJet/macros/PlottingMacros/

Check out the following code for the inclusive jet cross-section: 

/star/u/gdwebb/DijetCode/RooUnfold/examples/unfoldDetectorToParticleSVD_inclu.cxx
/star/u/gdwebb/DijetCode/RooUnfold/examples/unfoldDetectorToParticleSVD_inclu_uecorr.cxx

Note, the second macro listed above implements the underlying event correction, and you should look into it. Again, this is all very similar to the dijet section described in more detail above. There is an inclusive counterpart for every dijet part. 

Finally, this drupal page should be used for guidance; it does not mean you should simply take the code for your own. I truly hope this guide helps anyone at STAR wanting to perform a cross-section analysis. The current status of the Inclusive Jet Cross Section can be found here.

UNDERLYING EVENT (UE): 

Currently, there are two methods used at STAR to extract the underlying event from data. There is a detailed blog post which describes the code that has been committed to CVS. My focus was on the regional method to extract the UE. It is designed to be a study of the UE in its own right, not just a correction to a different measurement. The location of the analysis can be found at the following: 

/star/u/gdwebb/UE_CVS_Check/

Again, I stick with a similar structure with a macros and scripts directory. The main scripts and macros are found at the following: 

/star/u/gdwebb/UE_CVS_Check/scripts/SubmitUEAnalysis2012pp500.sh
/star/u/gdwebb/UE_CVS_Check/scripts/SubmitEmbedDetectorParticleUEAnalysis2009pp500.sh
/star/u/gdwebb/UE_CVS_Check/scripts/SubmitPythiaUEAnalysis2009pp500.sh

/star/u/gdwebb/UE_CVS_Check/macros/RunUEAnalysis.C

/star/u/gdwebb/UE_CVS_Check/macros/RunSpinUEAnalysis.C
/star/u/gdwebb/UE_CVS_Check/macros/RunEmbedUEAnalysis.C
/star/u/gdwebb/UE_CVS_Check/macros/RunPythiaUEAnalysis.C
/star/u/gdwebb/UE_CVS_Check/macros/RunEmbedDetectorParticleUEAnalysis.C

and you can find some helpful plotting macros at the following: 

/star/u/gdwebb/UE_CVS_Check/macros/UEmacros/DataEmbdmacros/ 

Now I have pointed you to the official directory that contains all the needed information to perform a UE analysis, but you can also find some useful information in the testing directory at the following: 

/star/u/gdwebb/UEpp500/

which by now you can easily explore, because I'm consistent in my coding structure. You can find the status and preliminary results of the UE analysis here. 

THINGS TO DO:

Reduce systematics: For the preliminary result, a conservative estimate of +-5% was used for the tower energy systematic. This can probably be lowered, but that needs to be determined. In addition, there has been discussion of improving the 13% systematic on the luminosity.  

Implement systematics on the inclusive jet cross section. 

Implement the UE corrections on the dijet cross section. (The cone UE correction has been applied to the inclusive cross section.)

Compare with more up-to-date theoretical calculations.

May the force be with you. And if you want the nitty gritty details you can read my thesis here.

