Generating Higgs To 4 Muons at NIKHEF

An exercise to simulate Higgs production events at the LHC, where the Higgs boson decays into two Z bosons that each decay into two muons:

<center><math> H \rightarrow ZZ^* \rightarrow \mu^+ \mu^-\mu^+ \mu^-</math></center>

The exercise is meant as a starting point for the 'monkey-see monkey-do' technique: it should be easy to plug in your own favorite process. In this example we will use AtlFast for the detector simulation and reconstruction. We will produce an AOD that contains the MC truth and the reconstructed AtlFast objects. Since the AOD is in POOL format, we will also transform the AOD into an Ntuple, which allows a simple analysis program to be constructed in ROOT.

'''Note:''' We assume you have the CMT and Athena set-up at NIKHEF in order: [[CMT_and_Athena_at_NIKHEF | Starting with CMT and Athena at NIKHEF]]

== 1) Setting up the ATLAS environment at NIKHEF ==

Some packages are required to get the ATLAS software environment in shape. As a first-time user you should follow steps a) and b); on every subsequent login you only need to perform step c).

'''a) Setting up the general ATLAS environment at NIKHEF (first time only)'''

For a quick start, follow these steps:

* Log in to an SLC3 machine and source the setup script: <tt>source /project/atlas/nikhef/setup/nikhef_setup_10.0.2.csh</tt>

'''Note:''' If your directory on the project disk differs from your login name, you should tell the setup script. Somebody whose login name is 'Tommie', but who wants to do all his ATLAS work under /project/atlas/users/pino, should use: <tt>source /project/atlas/nikhef/setup/nikhef_setup_10.0.2.csh opt slc3 pino</tt>.

* Get the TestRelease package (with some modifications: check the detailed description):
# Go to your project directory: <tt>cd /project/atlas/users/<your_login_name></tt>
# Check out the <tt>TestRelease</tt> package from the NIKHEF/ATLAS CVS repository: <tt>cvs -d /project/atlas/cvs co TestRelease</tt>
# Go to the cmt directory: <tt>cd TestRelease/TestRelease-00-00-18/cmt</tt>
# Execute <tt>cmt config</tt>
# Execute <tt>source setup.csh</tt>

For a detailed description, please follow the instructions at [http://www.nikhef.nl/pub/experiments/atlaswiki/index.php/Atlas_1002_setup ATLAS setup at NIKHEF].


'''b) Setting up the package required to produce Ntuples from the AOD (first time only)'''

To produce Ntuples from an AOD you'll need to add an additional package created at NIKHEF.

# Go to your project directory: <tt>cd /project/atlas/users/<your_login_name></tt>
# Check out the <tt>TTBarAnalysis</tt> package from the NIKHEF/ATLAS CVS repository: <tt>cvs -d /project/atlas/cvs co TTBarAnalysis</tt>
# Go to the cmt directory: <tt>cd TTBarAnalysis/cmt</tt>
# Execute <tt>cmt config</tt>
# Execute <tt>source setup.csh</tt>
# Build the library: <tt>gmake</tt> (note: you might have to run gmake twice)

You can also find a more detailed set of instructions at [http://www.nikhef.nl/pub/experiments/atlaswiki/index.php/Running_ttbar_package Installing the AOD->Ntuple (TTBarAnalysis) package].

Once this is set up, you can produce TopNtuples from an AOD whenever you wish.

'''c) Setting up all required packages (every time, except right after doing a) and b))'''

On every login you should make sure the shell knows where to find the various programs, i.e. both the general ATLAS environment and the Ntuple-making package. You can do this by sourcing a script similar to [http://www.nikhef.nl/~ivov/init1002.csh init1002.csh]. Source it in every window where you want to do the generation: <tt>source init1002.csh</tt>

'''Note:''' Again, if your directory on the project disk differs from your login name you should tell the setup script: edit the init1002.csh file and add the 3 additional parameters to the line in which the general ATLAS setup script is sourced. See for example [http://www.nikhef.nl/~ivov/init1002_special.csh init1002_special.csh].

== 2) Generating Higgs events decaying into 4 muons ==

'''a) Download the scripts'''

Go again to your project area and check out the Higgs4MuonAnalysis package from the NIKHEF/ATLAS CVS repository:

: <tt>cd /project/atlas/users/<your_login_name></tt>
: <tt>cvs -d /project/atlas/cvs co Higgs4MuonAnalysis</tt>
: <tt>cd Higgs4MuonAnalysis</tt>

Let's have a look at what files are in the package.

Athena requires steering files telling it what to do. These files are called joboptions files, and since this exercise consists of 2 steps we have 2 (basic) joboptions files. Besides these, the package contains the production script and one extra steering file required by Athena:

# <tt>jobOptions_Pythia_To_Atlfast_To_AOD_BASIC.py</tt>: joboptions for Pythia -> AOD
# <tt>jobOptions_AOD_to_Ntuple_BASIC.py</tt>: joboptions for AOD -> TopNtuple
# <tt>ShipOff_Pythia.py</tt>: the script that generates the events
# <tt>PDGTABLE.MeV</tt>: a steering file required for MC production in Athena (not to be edited)


'''b) Options in the script'''

The script takes three arguments:

# <tt><Nevents></tt>: the number of events per job
# <tt><Njobs></tt>: the number of jobs
# <tt><f_interactive></tt>: a flag to signal that you want all output on screen (1) instead of in a logfile (0, default)

The script is called as: <tt>./ShipOff_Pythia.py <Nevents> <Njobs> <f_interactive></tt>

What does the script do? For each job a subdirectory called Job<JobNr> is created. In that directory the joboptions files specific to that job are written, and Athena is run for both steps. The output files (AOD and TopNtuple) are all stored in that directory.
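The following is a rough sketch of such a production script, assuming the BASIC joboptions contain placeholder tokens (<tt>XNEVENTSX</tt>, <tt>XJOBNUMBERX</tt>, <tt>XRNDPYTHIA0X</tt>..<tt>XRNDPYTHIA3X</tt>) that are substituted per job; the real <tt>ShipOff_Pythia.py</tt> will differ in its details, and only the first Athena step is shown:

<pre>
#!/usr/bin/env python
# Sketch of a ShipOff-style production script (illustration only).
# Assumes the BASIC joboptions file contains the placeholder tokens
# XNEVENTSX, XJOBNUMBERX and XRNDPYTHIA0X..XRNDPYTHIA3X.
import os, random, sys

def make_job(job_nr, n_events, template):
    job_dir = "Job%d" % job_nr
    os.mkdir(job_dir)
    # Substitute the job-specific values into the template
    text = template.replace("XNEVENTSX", str(n_events))
    text = text.replace("XJOBNUMBERX", str(job_nr))
    for i in range(4):  # unique Pythia random-number seeds per job
        text = text.replace("XRNDPYTHIA%dX" % i, str(random.randint(1, 900000000)))
    job_opts = "jobOptions_Pythia_To_Atlfast_To_AOD_Job%d.py" % job_nr
    open(os.path.join(job_dir, job_opts), "w").write(text)
    # Run Athena inside the job directory, keeping the logfile there
    os.system("cd %s && athena.py %s > LogFile.Job%d.txt" % (job_dir, job_opts, job_nr))

if __name__ == "__main__":
    n_events, n_jobs = int(sys.argv[1]), int(sys.argv[2])
    template = open("jobOptions_Pythia_To_Atlfast_To_AOD_BASIC.py").read()
    for job in range(1, n_jobs + 1):
        make_job(job, n_events, template)
</pre>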


'''c) Produce 9 events in 1 job in interactive mode'''

: <tt>./ShipOff_Pythia.py 9 1 1</tt>

Once the run is finished you can find all input and output files in the sub-directory Job1.

Input files:
: <tt>./Job1/jobOptions_Pythia_To_Atlfast_To_AOD_Job1.py</tt>
: <tt>./Job1/jobOptions_AOD_to_Ntuple_Job1.py</tt>

Output files:
: <tt>./Job1/AOD.Job1.pool.root</tt>
: <tt>./Job1/TopNtupleV6.Job1.root</tt>


'''d) Produce 1,000 events in 2 jobs of 500 events using logfiles'''

: <tt>./ShipOff_Pythia.py 500 2</tt>

'''Note:''' This run will again write to the subdirectory Job1, so if it still exists from the previous step you will have to rename or remove it first.

Once the run is finished you can find the output files in Job1 and Job2, where not only the AOD and the TopNtuple are located, but also the logfiles of the Athena run for both steps.

Finished! You have now produced 1,000 events with <math> H \rightarrow ZZ^* \rightarrow \mu^+ \mu^-\mu^+ \mu^-</math>.


'''e) Extra: Choosing a different physics process'''

The Pythia settings that define the generated process are given in the file <tt>jobOptions_Pythia_To_Atlfast_To_AOD_BASIC.py</tt>. If you want to study a different process, simply edit this file and insert your own set of Pythia parameters.
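For reference, the H -> ZZ* -> 4 muons selection of this exercise boils down to a handful of Pythia switches; the excerpt below shows the pattern (the full set also switches off all other Higgs and Z decay channels in the same way):

<pre>
Pythia = Algorithm( "Pythia" )
Pythia.PythiaCommand = [
        "pysubs msel 16",         # Higgs production
        "pydat2 pmas 25 1 150.",  # Higgs mass
        "pydat3 mdme 225 1 1",    # H -> ZZ      (ON)
        "pydat3 mdme 226 1 0",    # H -> WW      (off)
        "pydat3 mdme 184 1 1",    # Z -> mu+mu-  (ON)
        "pydat3 mdme 182 1 0"     # Z -> e+e-    (off)
        ]
</pre>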

== 3) Analysing the content of the Ntuple ==

To analyse the content of the Ntuple you can either do a <tt>MakeClass()</tt> yourself or use the skeleton that was developed at NIKHEF to easily get a handle on the main objects and to perform an analysis. It is used in the ATLAS top group and can be found at [http://www.nikhef.nl/pub/experiments/atlaswiki/index.php/Ttbar_analysis_skeleton TopNtuple Analysis Skeleton].
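If you just want a quick look by hand, a minimal PyROOT sketch along these lines will do. Note that the tree name ('TopTree') and the muon branch names used here are assumptions for illustration; take the real names from the file itself (e.g. via <tt>f.ls()</tt> or a <tt>MakeClass()</tt> skeleton):

<pre>
import ROOT

# Open the Ntuple produced in section 2 and histogram the 4-muon mass.
# The tree name 'TopTree' and the branches mu_n, mu_pt, mu_eta, mu_phi
# are hypothetical; inspect the file for the real names.
f = ROOT.TFile.Open("Job1/TopNtupleV6.Job1.root")
f.ls()                           # list what is actually in the file
tree = f.Get("TopTree")          # hypothetical tree name

h_m4mu = ROOT.TH1F("h_m4mu", "4-muon invariant mass;m_{4#mu} [GeV];events",
                   60, 0., 300.)
for event in tree:
    p4 = ROOT.TLorentzVector()
    for i in range(event.mu_n):  # sum the four muon four-vectors
        mu = ROOT.TLorentzVector()
        mu.SetPtEtaPhiM(event.mu_pt[i], event.mu_eta[i], event.mu_phi[i], 0.10566)
        p4 += mu
    h_m4mu.Fill(p4.M())

c = ROOT.TCanvas()
h_m4mu.Draw()
c.SaveAs("m4mu.png")
</pre>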