Latest revision as of 09:22, 26 September 2005
Producing an Ntuple from an AOD
We present here the code used to process the AOD and dump (nearly) the complete content in an Ntuple. The package that produces an Ntuple from the AOD contains:
- a clone of UserAnalysis/AnalysisSkeleton and
- a clone of AnalysisExamples/ttbar
- the simpleTTbar class that dumps a (nearly) complete ntuple with all AOD information needed for a ttbar analysis on full simulation Monte Carlo
Step 1: Set up ATLAS release 10.0.2 software environment
Make sure you have prepared the ATLAS environment. At NIKHEF we follow this procedure: Set up ATLAS release 10.0.2 environment
Step 2: Get the code
Go to the directory where you want to install the package and get the code: TTBarAnalysis.tgz
Step 3: Setting up the package
Note: CMT expects the top directory in which the package is located to be in your CMTPATH. To add it, go to that top directory and issue the command (in csh): setenv CMTPATH `pwd`:${CMTPATH}
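The note above gives the csh form only. A Bourne-shell (sh/bash) equivalent, written so that an initially unset CMTPATH does not leave a stray trailing colon, might look like this sketch:

```shell
# sh/bash equivalent of the csh command above; the ${CMTPATH:+...}
# expansion appends ':${CMTPATH}' only when CMTPATH is already set,
# so an empty starting value does not produce a trailing ':'.
CMTPATH="$(pwd)${CMTPATH:+:${CMTPATH}}"
export CMTPATH
echo "CMTPATH=${CMTPATH}"
```

csh users should stick with the setenv form from the note; the two are equivalent apart from the empty-variable handling.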
- Go to the directory where you want to install the package and unpack the code: tar -xzvf TTBarAnalysis.tgz. The package will appear in a subdirectory named TTBarAnalysis of the chosen directory.
- Go to the cmt directory: cd TTBarAnalysis/cmt
- Execute cmt config
- Execute source setup.csh
- Build the library: gmake
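Before running gmake it can save time to sanity-check what Steps 1-3 should have put in place. A minimal preflight sketch, run from the top directory (the path names come from this page; the availability of a cmt command is an assumption based on the text):

```shell
#!/bin/sh
# Preflight check for the prerequisites of Steps 1-3: the current
# directory on CMTPATH, a 'cmt' command in PATH, and the unpacked
# TTBarAnalysis package. Prints a warning for each missing piece.
ok=1
case ":${CMTPATH}:" in
    *:"$(pwd)":*) echo "CMTPATH contains the current directory" ;;
    *) echo "warning: current directory not in CMTPATH"; ok=0 ;;
esac
if command -v cmt >/dev/null 2>&1; then
    echo "cmt found in PATH"
else
    echo "warning: cmt not in PATH (set up the ATLAS release first)"; ok=0
fi
if [ -d TTBarAnalysis/cmt ]; then
    echo "TTBarAnalysis/cmt exists"
else
    echo "warning: TTBarAnalysis not unpacked in this directory"; ok=0
fi
[ "$ok" -eq 1 ] && echo "ready to build" || echo "fix warnings before building"
```

If everything checks out, proceed with cmt config, source setup.csh and gmake as listed above.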
Step 4: Running the package (an Ntuple from an AOD)
The athena job requires as input the AOD files (also registered in the POOL file catalogue), the names of the ParticleContainers in the AOD, and the name of the output Ntuple. As an example we'll process a single file called rome.004100.recov10.T1_McAtNLO_top._00001.myaod.pool.root (available from http://www.nikhef.nl/pub/experiments/atlas/public/Rome/T1/AOD/AOD_Eric/0000/rome.004100.recov10.T1_McAtNLO_top._00001.myaod.pool.root) that is located in the directory /data/atlas/public/Rome/T1/AOD/AOD_Eric/0000/.
To process it just follow the next steps:
1) If not already done, go to the cmt directory and execute source setup.csh
2) Go to the run directory: cd TTBarAnalysis/run
3) Create a symbolic link called AOD pointing to the top directory where your AODs are located. When handling large sets of AODs this is much more convenient than copying every AOD file to the run directory.
At NIKHEF we would do: ln -s /data/atlas/public/Rome/T1/AOD/AOD_Eric/ AOD
4) Insert the files into the pool file catalog: pool_insertFileToCatalog AOD/0000/rome.004100.recov10.T1_McAtNLO_top._00001.myaod.pool.root
5) Before launching the athena job, edit the job options file (simpleTTbar_jobOptions.py):
- Set the Type of simulation (SimType, see example in joboptions file for details)
- List your AOD file as the input AOD-file (EventSelector.InputCollections)
- Enter the filename of the Ntuple that will be produced (NtupleSvc.Output)
6) Execute: athena.py simpleTTbar_jobOptions.py
An Ntuple called TopNtuple_V6_dummy.root is produced that can be analysed using the Analysis Skeleton (check the link at the bottom of the page).
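The run sequence above can be sketched as a single script. The paths below are the NIKHEF example paths from this page, and the ATLAS tools (pool_insertFileToCatalog, athena.py) are only invoked if they are actually in PATH, so the directory and link layout can be tried anywhere:

```shell
#!/bin/sh
# Sketch of the Step 4 run sequence with the example paths from this
# page. The ATLAS-specific commands are guarded so the link layout can
# be exercised without the ATLAS environment.
AOD_TOP=/data/atlas/public/Rome/T1/AOD/AOD_Eric    # NIKHEF example path
mkdir -p demo_run                                  # stand-in for TTBarAnalysis/run
cd demo_run
ln -sfn "$AOD_TOP" AOD         # link the AOD tree instead of copying files
if command -v pool_insertFileToCatalog >/dev/null 2>&1; then
    # register the input file, then launch the job (edit the job options first)
    pool_insertFileToCatalog AOD/0000/rome.004100.recov10.T1_McAtNLO_top._00001.myaod.pool.root
    athena.py simpleTTbar_jobOptions.py
else
    echo "ATLAS tools not in PATH; pool/athena commands shown for reference"
fi
cd ..
```

Note that the symlink is created even if the target does not exist on your machine; replace AOD_TOP with the top directory of your own AOD store.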
Step 5: Extras
Note 1: All POOL files processed by Athena must be registered in the POOL file catalogue. When processing (large) sets of files, you can use wildcards to register all your AODs at once before processing: pool_insertFileToCatalog AOD/*/*AOD.pool.root
Note 2: For more information on what is in the AOD, and for the full code and description of the Analysis Skeleton, see the other NIKHEF wiki pages: Analyse the Ntuple using the Analysis Skeleton and Ntuple content.