Producing an Ntuple from an AOD

From Atlas Wiki

Producing an Ntuple from an AOD

We present here the code used to process the AOD and dump (nearly) the complete content in an Ntuple. The package that produces an Ntuple from the AOD contains:

  • a clone of UserAnalysis/AnalysisSkeleton,
  • a clone of AnalysisExamples/ttbar, and
  • the simpleTTbar class, which dumps a (nearly) complete ntuple with all the AOD information needed for a ttbar analysis on fully simulated Monte Carlo.

Step 1: Set up ATLAS release 10.0.2 software environment

Make sure you have prepared the ATLAS environment. At NIKHEF we follow this procedure: Set up ATLAS release 10.0.2 environment

Step 2: Get the code

Go to the directory where you want to install the package and get the code: TTBarAnalysis.tgz

Step 3: Setting up the package

Note: CMT expects the top directory in which the package is located to be in your CMTPATH. To arrange this, go to that top directory and issue the command (in csh): setenv CMTPATH `pwd`:${CMTPATH}
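If your login shell is Bourne-style (bash/sh) rather than csh, a sketch of the equivalent of the setenv line is the following; it also handles the case where CMTPATH is not yet set, so no stray colon is appended:

```shell
#!/bin/bash
# Prepend the current directory (the top directory holding TTBarAnalysis)
# to CMTPATH so CMT can locate the package.
# csh equivalent from the text: setenv CMTPATH `pwd`:${CMTPATH}
export CMTPATH="$(pwd)${CMTPATH:+:${CMTPATH}}"
echo "CMTPATH is now: ${CMTPATH}"
```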

  1. Go to the directory where you want to install the package and unpack the code: tar -xzvf TTBarAnalysis.tgz The package will appear in a subdirectory named TTBarAnalysis of the chosen directory.
  2. Go to the cmt directory: cd TTBarAnalysis/cmt
  3. Execute cmt config
  4. Execute source setup.csh
  5. Build the library: gmake

Step 4: Running the package (an Ntuple from an AOD)

The athena job requires as input the AOD files (which must be registered in the POOL file catalogue), the names of the ParticleContainers in the AOD, and the name of the output Ntuple. As an example we'll process a single file called rome.004100.recov10.T1_McAtNLO_top._00001.myaod.pool.root that is located in the directory /data/atlas/public/Rome/T1/AOD/AOD_Eric/0000/.

To process it just follow the next steps:

1) If not already done before, go to the cmt directory and execute source setup.csh

2) Go to the run directory: cd TTBarAnalysis/run

3) Create a directory called AOD that is a link to the top directory where your AODs are located. When handling large sets of AODs this is much more convenient than copying every AOD file to the run directory.

At NIKHEF we would do: ln -s /data/atlas/public/Rome/T1/AOD/AOD_Eric/ AOD
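To illustrate the idea behind the link, here is a minimal sketch using mock paths (created with mktemp; the real NIKHEF paths differ):

```shell
#!/bin/sh
# Sketch with mock paths: link an AOD storage area into the run
# directory instead of copying every file there.
store=$(mktemp -d)   # stands in for /data/atlas/public/Rome/T1/AOD/AOD_Eric
run=$(mktemp -d)     # stands in for TTBarAnalysis/run
mkdir -p "$store/0000"
touch "$store/0000/example.myaod.pool.root"

cd "$run"
ln -s "$store" AOD   # same idea as: ln -s /data/atlas/public/Rome/T1/AOD/AOD_Eric/ AOD

# The file is now reachable via a short relative path:
ls AOD/0000/example.myaod.pool.root
```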

4) Insert the files into the pool file catalog: pool_insertFileToCatalog AOD/0000/rome.004100.recov10.T1_McAtNLO_top._00001.myaod.pool.root

5) Before launching the athena job, edit the job options file (simpleTTbar_jobOptions.py):

  • Set the Type of simulation (SimType, see example in joboptions file for details)
  • List your AOD file as the input AOD-file (EventSelector.InputCollections)
  • Enter the filename of the Ntuple that will be produced (NtupleSvc.Output)
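For batch work, the same three edits can be scripted. The sketch below assumes the options appear as plain 'name = value' lines in the file; the replacement values (SimType = 1, the output specification) are placeholders, so check them against the comments in the real file:

```shell
#!/bin/sh
# Sketch: set the three job options non-interactively with sed.
# Assumes simple 'name = value' lines; values below are placeholders.
f=simpleTTbar_jobOptions.py
aod='AOD/0000/rome.004100.recov10.T1_McAtNLO_top._00001.myaod.pool.root'
sed -i \
  -e "s|^SimType.*|SimType = 1  # placeholder: see comments in the file|" \
  -e "s|^EventSelector.InputCollections.*|EventSelector.InputCollections = [ \"$aod\" ]|" \
  -e "s|^NtupleSvc.Output.*|NtupleSvc.Output = [ \"FILE1 DATAFILE='TopNtuple_V6_dummy.root' OPT='NEW'\" ]|" \
  "$f"
```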

6) Execute: athena.py simpleTTbar_jobOptions.py

An Ntuple called TopNtuple_V6_dummy.root is produced that can be analysed using the Analysis Skeleton (check the link at the bottom of the page).

Step 5: Extras

Note 1: When processing (large) sets of files: all POOL files processed by Athena must be registered in the POOL file catalogue. Wildcards can be used, so before processing you can register ALL your AODs in the catalogue at once with: pool_insertFileToCatalog AOD/*/*AOD.pool.root
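Since a wildcard registration is hard to undo, it can help to preview what the pattern matches first. The sketch below does this with a mock directory tree (mktemp paths and file names are placeholders):

```shell
#!/bin/sh
# Sketch: preview what the wildcard AOD/*/*AOD.pool.root would match,
# using a mock tree, before handing the list to pool_insertFileToCatalog.
store=$(mktemp -d)
mkdir -p "$store/0000" "$store/0001"
touch "$store/0000/file1.myAOD.pool.root" "$store/0001/file2.myAOD.pool.root"

run=$(mktemp -d)
cd "$run"
ln -s "$store" AOD
ls AOD/*/*AOD.pool.root        # shows the two mock files
# Real step, once the list looks right:
#   pool_insertFileToCatalog AOD/*/*AOD.pool.root
```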

Note 2: For more information on what is in the AOD, and for the full code and description of the Analysis Skeleton, please check the other NIKHEF wiki pages on how to analyse the Ntuple using the Analysis Skeleton, and on the Ntuple content.