Generating Higgs To 4 Muons at NIKHEF
An exercise to simulate Higgs production events at the LHC, where the Higgs boson decays into 2 Z bosons that each decay into 2 muons:

H → ZZ* → μ⁺μ⁻μ⁺μ⁻
The exercise is meant as a starting point for the 'monkey-see, monkey-do' technique: it is easy to plug in your own favourite process. In this example we use AtlFast for the detector simulation and reconstruction, and we produce an AOD that contains the MC truth and the reconstructed AtlFast objects. Since the AOD is in POOL format, we also transform the AOD into an Ntuple, which allows a simple analysis program to be constructed in ROOT.
Note: We assume you have the CMT and Athena set-up at NIKHEF in working order: Starting with CMT and Athena at NIKHEF
1) Setting up the ATLAS environment at NIKHEF
A few packages are required to set up the ATLAS software environment. As a first-time user you should follow steps a) and b); on every subsequent login you only need to perform step c).
a) Setting up the general ATLAS environment at NIKHEF (first time only)
For a quick start, follow these steps:

- Log in to an SLC3 machine and run: source /project/atlas/nikhef/setup/nikhef_setup_10.0.2.csh
Note: If your directory on the project disk differs from your login name, you should tell the setup script. Somebody whose login name is 'Tommie', but who wants to do all his ATLAS work under /project/atlas/users/pino, should use: source /project/atlas/nikhef/setup/nikhef_setup_10.0.2.csh opt slc3 pino.
- Get the TestRelease (with some modifications: check the detailed description)
- Go to your project directory: cd /project/atlas/users/<your_login_name>
- Check out the TestRelease package from the NIKHEF/ATLAS CVS repository: cvs -d /project/atlas/cvs co TestRelease
- Go to the cmt directory: cd TestRelease/TestRelease-00-00-18/cmt
- Execute cmt config
- Execute source setup.csh
For a detailed description please follow the instructions on: ATLAS setup at NIKHEF.
b) Setting up the Package required to produce Ntuples from the AOD (first time only)
To produce Ntuples from an AOD you'll need to add an additional package created at NIKHEF.
- Go to your project directory: cd /project/atlas/users/<your_login_name>
- Check out the TTBarAnalysis package from the NIKHEF/ATLAS CVS repository: cvs -d /project/atlas/cvs co TTBarAnalysis
- Go to the cmt directory: cd TTBarAnalysis/cmt
- Execute cmt config
- Execute source setup.csh
- Build the library: gmake (Note: you might have to run gmake twice)
You can also find a more detailed set of instructions at Installing the AOD->Ntuple (TTBarAnalysis) package.
Once this is set up you can produce TopNtuples from an AOD whenever you wish to do so.
c) Setting up all required packages (every time, but not if you have just done a) and b))
On every login you should make sure the shell knows where to find the various programs, meaning both the general ATLAS environment and the Ntuple-making package. You can do this by sourcing a script similar to init1002.csh. Simply source it in every window where you want to do the generation: source init1002.csh
Note: Again, for those of you whose directory on the project disk differs from your login name: you should tell the setup script. Edit the init1002.csh file and add the 3 additional parameters to the line in which the general ATLAS setup script is sourced. See for example init1002_special.csh.
2) Generating Higgs events decaying into 4 muons
a) Download the scripts
Go again to your project area and check out the Higgs4MuonAnalysis package from the NIKHEF/ATLAS CVS repository:
- cd /project/atlas/users/<your_login_name>
- cvs -d /project/atlas/cvs co Higgs4MuonAnalysis
- cd Higgs4MuonAnalysis
Let's have a look at what files are in the package.
Athena requires steering files that tell it what to do. These files are called joboptions files, and since this exercise consists of 2 steps we have 2 (basic) joboptions files. Besides these, the package contains the steering script and one extra file required by Athena:
- jobOptions_Pythia_To_Atlfast_To_AOD_BASIC.py joboptions for: Pythia -> AOD
- jobOptions_AOD_to_Ntuple_BASIC.py joboptions for: AOD -> TopNtuple
- ShipOff_Pythia.py The script that generates events
- PDGTABLE.MeV A particle-properties table required for MC production in Athena (not to be edited)
b) Options in the script
The script takes three arguments:
- <Nevents> = the number of events per job
- <Njobs> = the number of jobs
- <f_interactive> = a flag to signal that you want all output on screen (1) instead of in a logfile (0, default)
The script is called using: ./ShipOff_Pythia.py <Nevents> <Njobs> <f_interactive>
What does the script do? For each job a subdirectory called Job<JobNr> is made. In that directory the joboptions files specific to that job are created and Athena is run for both steps. The output files (AOD and TopNtuple) are all stored in that directory. A simplified sketch of this loop is shown below.
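In outline, the per-job logic looks roughly like the following Python sketch. This is an illustration, not the actual contents of ShipOff_Pythia.py; in particular the template-placeholder names (NEVENTS, JOBNR) and the exact Athena invocation are assumptions.

import os

def run_jobs(n_events, n_jobs, interactive=False):
    """Sketch: for each job, make Job<N>, write its joboptions, run Athena twice."""
    steps = ["jobOptions_Pythia_To_Atlfast_To_AOD",
             "jobOptions_AOD_to_Ntuple"]
    for job_nr in range(1, n_jobs + 1):
        job_dir = "Job%d" % job_nr
        os.mkdir(job_dir)  # fails if Job<N> is left over from a previous run
        for step in steps:
            # Derive the job-specific joboptions from the BASIC template;
            # the placeholder names substituted here are hypothetical.
            text = open(step + "_BASIC.py").read()
            text = text.replace("NEVENTS", str(n_events))
            text = text.replace("JOBNR", str(job_nr))
            job_file = "%s_Job%d.py" % (step, job_nr)
            open(os.path.join(job_dir, job_file), "w").write(text)
            # Run Athena for this step, on screen or redirected to a logfile.
            cmd = "cd %s && athena.py %s" % (job_dir, job_file)
            if not interactive:
                cmd += " > %s.log 2>&1" % job_file
            os.system(cmd)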
c) Produce 9 events in 1 job in interactive mode
- ./ShipOff_Pythia.py 9 1 1
Once the run is finished you can find all input and output files in the sub-directory Job1.
Input files:
- ./Job1/jobOptions_Pythia_To_Atlfast_To_AOD_Job1.py
- ./Job1/jobOptions_AOD_to_Ntuple_Job1.py
Output files:
- ./Job1/AOD.Job1.pool.root
- ./Job1/TopNtupleV6.Job1.root
d) Produce 1,000 events in 2 jobs of 500 events using logfiles
- ./ShipOff_Pythia.py 500 2
Note: The script will again put everything in the subdirectory Job1 (and now also Job2), so if Job1 still exists from the previous run you will have to rename or remove it first.
Once the run is finished you can find the output files in Job1 and Job2; these contain not only the AOD and TopNtuple, but also the logfiles of the Athena runs for both steps.
Finished! You have now produced 1,000 events with H → ZZ* → μ⁺μ⁻μ⁺μ⁻.
e) Extra: choosing a different physics process
The Pythia settings that define the process to be generated are given in the file jobOptions_Pythia_To_Atlfast_To_AOD_BASIC.py. If you want to study a different process, simply edit this file and insert your own set of Pythia parameters. A sketch of such a fragment is shown below.
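For illustration only, a fragment forcing H → ZZ* → 4μ could look like the sketch below, using PYTHIA 6 channel numbers. Treat the switches, the example Higgs mass, and the assumption that the generator is exposed as Pythia in this joboptions file as things to verify against the BASIC file before use; for brevity only a few of the decay channels that must be switched off are listed.

# Hypothetical fragment for the joboptions file: gg and vector-boson-fusion
# Higgs production with H -> ZZ(*) -> 4 muons (PYTHIA 6 channel numbers).
Pythia.PythiaCommand += [
    "pysubs msel 0",          # turn off default process selection
    "pysubs msub 102 1",      # g g -> H
    "pysubs msub 123 1",      # Z Z -> H (vector-boson fusion)
    "pysubs msub 124 1",      # W W -> H (vector-boson fusion)
    "pydat2 pmas 25 1 150.",  # Higgs mass in GeV (example value)
    "pydat3 mdme 225 1 1",    # enable H -> Z Z ...
    "pydat3 mdme 214 1 0",    # ... and disable the others, e.g. H -> b bbar
    "pydat3 mdme 226 1 0",    #     and H -> W W (all other H channels too)
    "pydat3 mdme 184 1 1",    # enable Z -> mu+ mu- ...
    "pydat3 mdme 182 1 0",    # ... and disable the others, e.g. Z -> e+ e-
]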
3) Analysing the content of the Ntuple
To analyse the content of the Ntuple you can either do a MakeClass() yourself or use the Skeleton that was developed at NIKHEF to easily get a handle on the main objects and to perform an analysis. It is used in the ATLAS top group and can be found at TopNtuple Analysis Skeleton. A minimal hand-rolled alternative is sketched below.
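If you prefer a quick look without the Skeleton, a minimal PyROOT sketch along the following lines reconstructs the four-muon invariant mass. The tree name and branch names below are guesses, not the actual TopNtuple layout; inspect the file first (e.g. with a TBrowser) and adapt them, and note that ATLAS ntuples often store momenta in MeV.

# Minimal PyROOT sketch: four-muon invariant mass from the ntuple.
# Tree and branch names are assumptions -- check the real layout first.
import ROOT

f = ROOT.TFile.Open("Job1/TopNtupleV6.Job1.root")
tree = f.Get("t3333")  # assumed tree name
hist = ROOT.TH1F("m4mu", "Four-muon mass;m_{4#mu} [GeV];events", 60, 0., 300.)

for event in tree:
    if event.Nmu < 4:          # assumed muon-multiplicity branch
        continue
    total = ROOT.TLorentzVector()
    for i in range(4):         # combine the first four muons
        mu = ROOT.TLorentzVector()
        mu.SetPtEtaPhiM(event.mu_pt[i], event.mu_eta[i], event.mu_phi[i],
                        0.10566)  # muon mass in GeV; convert if pt is in MeV
        total += mu
    hist.Fill(total.M())

c = ROOT.TCanvas()
hist.Draw()
c.SaveAs("m4mu.png")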