MCatNLO Step 3

From Atlas Wiki

On this page we will take the <>.event files that were produced by MCatNLO and run Herwig (within the Athena framework) to decay the tops and produce the parton shower. We will produce a combined ntuple (CBNT) and a POOL file.

Main idea:

The main idea is to call athena with a joboptions file in which you specify the location of the event file. In addition, an extra input file is required. We will again use a script that takes the BASIC forms of these input files, produces unique input files for each job, and takes care of the bookkeeping.
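In outline, the bookkeeping amounts to simple placeholder substitution. A minimal sketch (not the actual ATLAS script; the placeholder names match the X...X convention used in the BASIC files below):

```python
# Sketch: turn a BASIC template into a unique per-job input file by
# replacing every X...X placeholder with a job-specific value.
def fill_template(template_text, job_number, n_events):
    replacements = {
        "XJOBNUMBERX": str(job_number),
        "XNEVENTSX": str(n_events),
    }
    for placeholder, value in replacements.items():
        template_text = template_text.replace(placeholder, value)
    return template_text

# Example: one line from a BASIC joboptions template.
print(fill_template("theApp.EvtMax = XNEVENTSX", 1, 20))  # theApp.EvtMax = 20
```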

1) Set-up Athena and output directories

We will use Athena to do the decay. Make sure that you have correctly set up the Athena environment. To see how to do this, please check xxx.

cd /MyMCatNLO
mkdir -p POOLCBNT/InputAndLogfiles

2) Get the basic input files

Since we have used MCatNLO to produce the events, we have to signal to Herwig that it must read these from your external <>.event file. In Athena this is done inside a joboptions file. Go back to your MCatNLO working directory. Create two new directories where we'll store the basic input files (the basic joboptions file and other Athena-related files) and the output files (the combined ntuple with the generator information and the POOL files).


(a) Get some Athena python-files and check set-up

Get these athena python files into your BASICS area using get_files.

get_files PDGTABLE.MeV

This is a good check to see if you have set up Athena correctly. If it fails, please make sure that the Athena environment is set up correctly.

(b) Get the basic joboptions file

Get the basic joboptions file:

The lines that will be changed by the script are related to the number of events, the random number seeds for the generators and the output files.

theApp.EvtMax = XNEVENTSX
NTupleSvc.Output     = [ "FILE1 DATAFILE='./XOutputDirCBNTX/MCatNLO.CBNT.JobXJOBNUMBERX.root' OPT='NEW'" ]
Stream1.OutputFile = "./XOutputDirPOOLX/MCatNLO.POOL.JobXJOBNUMBERX.root"
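The random-seed lines are not shown above. A hedged sketch of how a script might assign reproducible, non-overlapping generator seeds per job (the formula and names are illustrative assumptions, not taken from the actual script):

```python
def seeds_for_job(job_number, base_seed=12345):
    # Give each job two distinct, reproducible generator seeds.
    # The formula is an illustrative assumption; any scheme that
    # guarantees unique seeds per job will do.
    return (base_seed + 2 * job_number, base_seed + 2 * job_number + 1)

# Jobs 1 and 2 get non-overlapping seed pairs.
print(seeds_for_job(1))  # (12347, 12348)
print(seeds_for_job(2))  # (12349, 12350)
```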

(c) Get the inparmMcAtNlo.dat (MCatNLO parameter) file

When starting the athena job, a file called inparmMcAtNlo.dat is expected in your run directory. It contains the location of the <>.event file and some additional input variables related to the generated MCatNLO sample.

Get the additional MCatNLO parameter file: inparmMcAtNlo.dat.BASIC

The lines that will be changed by the script are related to the location of the event file and the number of events it contains.

 './XINPUTDIRX/XINPUTNAMEX'     ! event file
  XNEVENTSX                     ! number of events in the event file
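A minimal sketch of how a script could write this per-job file (the paths and file names are illustrative, following the two-line format above):

```python
def write_inparm(path, input_dir, input_name, n_events):
    # Write a per-job inparmMcAtNlo.dat in the two-line format shown
    # above; the event-file location and event count are job-specific.
    lines = [
        " '%s/%s'     ! event file" % (input_dir, input_name),
        "  %d                     ! number of events in the event file" % n_events,
    ]
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")

# Illustrative usage for job 1:
write_inparm("inparmMcAtNlo.dat.Job1",
             "/data/MyMCatNLO/MATRIXELEMENT", "MCatNLO.events", 500)
```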

3) Get the script and tailor it to your needs

The last step is to get the script and start the jobs. For each job it will:

o create a directory called Job<job#>
o create two unique input files
o copy all other necessary files here and start the job
o copy output files (CBNT,POOL and Logfiles) to an output directory
o remove directory Job<job#>
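The steps above can be sketched as a simple per-job cycle (directory names are from the page; the athena invocation and copy steps are indicated only as comments, since the actual commands are script-specific):

```python
import os
import shutil

def run_one_job(job_number):
    # Sketch of the per-job cycle listed above.
    job_dir = "Job%d" % job_number
    os.makedirs(job_dir)            # create a directory called Job<job#>
    # ... write the two unique input files (joboptions + inparmMcAtNlo.dat) ...
    # ... copy all other necessary files here and start the job, e.g.
    #     subprocess.call(["athena", "joboptions.py"], cwd=job_dir)
    # ... copy the CBNT, POOL and log files to the output directory ...
    shutil.rmtree(job_dir)          # remove directory Job<job#>

run_one_job(1)
```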

(a) Get the script

Go to your working area:

cd /MyMCatNLO/

and get the python script:

(b) Tailor the script to your environment

In the script you need to define the variables related to the input and output directories. These are defined in the first "routine" in the script.

input_dir           = "/data/atlas/users/ivov/MyMCatNLO/MATRIXELEMENT/"
output_dir          = "/data/atlas/users/ivov/MyMCatNLO/POOLCBNT/"
steering_files_dir  = "/data/atlas/users/ivov/MyMCatNLO/ATHENA_BASICS/"

At the same place in the script also parameters related to the number of events, the number of jobs etc have to be set.

Nevents_eventfile   =     500
Nevents_joboptions  =      20
Njobs               =       2
f_LogFile           =       0

Important: The variable Nevents_eventfile is linked to the number of events in the <>.event files. Therefore make sure that this number is the same as in the header of the <>.event file (this should be made automatic). The variable Nevents_joboptions controls the number of events that will be processed per job.
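A simple consistency guard (not part of the original script, just an illustration of the constraint described above): the jobs together must not request more events than the <>.event file contains.

```python
def check_event_budget(n_eventfile, n_per_job, n_jobs):
    # Guard: Njobs jobs, each processing Nevents_joboptions events,
    # must fit inside the Nevents_eventfile events available.
    needed = n_per_job * n_jobs
    if needed > n_eventfile:
        raise ValueError("jobs need %d events but the event file "
                         "only holds %d" % (needed, n_eventfile))

check_event_budget(500, 20, 2)  # the example settings above fit comfortably
```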

4) Run the script

Once the number of events and the number of jobs are defined start the run:


Once you get it running for a single job, set the parameters f_LogFile=1 and Njobs=2 to run two jobs, with the logfiles automatically written to the output directory.

You are finished

What did we produce?

[ivov@njord MyMCatNLO]$ ls -lrt ./POOLCBNT/*
-rw-r--r--    1 ivov     atlas     2511240 May 10 11:38 ./POOLCBNT/MCatNLO.POOL.Job1.root
-rw-r--r--    1 ivov     atlas     1044531 May 10 11:38 ./POOLCBNT/MCatNLO.CBNT.Job1.root
-rw-r--r--    1 ivov     atlas     2429914 May 10 11:38 ./POOLCBNT/MCatNLO.POOL.Job2.root
-rw-r--r--    1 ivov     atlas     1014065 May 10 11:38 ./POOLCBNT/MCatNLO.CBNT.Job2.root

total 540
-rw-r--r--    1 ivov     atlas      250943 May 10 11:38 Logfile.Job1.log
-rw-r--r--    1 ivov     atlas        5114 May 10 11:38
-rw-r--r--    1 ivov     atlas         492 May 10 11:38 inparmMcAtNlo.dat.Job1
-rw-r--r--    1 ivov     atlas      263851 May 10 11:38 Logfile.Job2.log
-rw-r--r--    1 ivov     atlas        5114 May 10 11:38
-rw-r--r--    1 ivov     atlas         492 May 10 11:38 inparmMcAtNlo.dat.Job2

The POOL files you have produced can then be transformed to AODs after full reconstruction or Atlfast.