Difference between revisions of "Generating Higgs To 4 Muons at NIKHEF"
Revision as of 13:19, 3 November 2005
An exercise to simulate Higgs production events at the LHC, where the Higgs boson decays into two Z bosons that each decay into two muons.
The exercise is meant as a starting point for the 'monkey-see monkey-do' technique. In this example we will use AtlFast for the detector simulation and reconstruction. We will produce an AOD that contains the MC truth and the reconstructed AtlFast objects. Since the AOD is in pool format, we will also transform the AOD into an Ntuple that allows a simple analysis program to be constructed in Root.
Note: We assume you have the CMT and Athena set-up at NIKHEF in working order: Starting with CMT and Athena at NIKHEF
1) Setting up the ATLAS environment at NIKHEF
a) Setting up the general ATLAS environment at NIKHEF (first time only)
First set up the ATLAS environment at NIKHEF. Follow the instructions on: ATLAS setup at NIKHEF.
b) Setting up the Package required to produce Ntuples from the AOD (first time only)
To produce Ntuples from an AOD you'll need to add an additional package created at NIKHEF.
- Go to the directory where you want to install the package: cd /project/atlas/users/<your_login_name>
- Check out the TTBarAnalysis package from the NIKHEF/ATLAS CVS repository: cvs -d /project/atlas/cvs co TTBarAnalysis
- Go to the cmt directory: cd TTBarAnalysis/cmt
- Execute cmt config
- Execute source setup.csh
- Build the library: gmake
You can also get a more detailed set of instructions from Installing the AOD->Ntuple (TTBarAnalysis) package.
Once this is set up you can produce TopNtuples from an AOD.
c) Setting up all required packages (every time)
On every login you should make sure the shell knows where to find the various programs, which means both the general ATLAS environment and the Ntuple-making program. You can do this by simply sourcing a script similar to init1002.csh. Source it in every window where you want to do the generation: source init1002.csh
2) Generating Higgs events
a) Download the scripts
Go to your favorite area and create a running directory and download the code. At NIKHEF a logical place would be again your project disk:
- cd /project/atlas/users/<your_login_name>
- cvs -d /project/atlas/cvs co Higgs4MuonAnalysis
- cd Higgs4MuonAnalysis
Let's have a look at what files are in the package.
Athena requires steering files telling it what to do. These files are called joboptions files, and since this exercise is made up of 2 steps we have 2 (basic) joboptions files. For the rest we have the script and an extra file required by Athena:
- jobOptions_Pythia_To_Atlfast_To_AOD_BASIC.py joboptions for: Pythia -> AOD:
- jobOptions_AOD_to_Ntuple_BASIC.py joboptions for: AOD -> TopNtuple
- ShipOff_Pythia.py The script that generates events
- PDGTABLE.MeV A steering file required for MC production in Athena (not to be edited)
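The two joboptions files above correspond to the two Athena steps. As a rough sketch (the joboption filenames come from the list above, but the plain `athena <joboptions>` invocation and the `athena_commands` helper are assumptions for illustration, not the script's actual code), the chain looks like:

```python
# Sketch of the two-step chain driven by ShipOff_Pythia.py.
# The joboptions filenames are from the package listing above; the
# bare "athena <joboptions>" command line is an assumption.

def athena_commands():
    """Return the two Athena invocations, in order: Pythia -> AOD -> TopNtuple."""
    steps = [
        "jobOptions_Pythia_To_Atlfast_To_AOD_BASIC.py",  # step 1: Pythia -> AOD
        "jobOptions_AOD_to_Ntuple_BASIC.py",             # step 2: AOD -> TopNtuple
    ]
    return [["athena", joboptions] for joboptions in steps]

for cmd in athena_commands():
    print(" ".join(cmd))
```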
b) Options in the script
The script takes three arguments: ./ShipOff_Pythia.py <Nevents> <Njobs> <f_interactive>
- <Nevents> = The number of events per job
- <Njobs> = the number of jobs
- <f_interactive> = a flag to signal that you want everything on screen (1) instead of logfile (0, default)
What does the script do: For each job a subdirectory is made called Job<JobNR>. In that directory the joboptions specific to this job are created and Athena is run for both steps. Finally an Ntuple is produced in that subdirectory.
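The per-job bookkeeping described above can be sketched as follows. This is a hypothetical reconstruction of the script's loop, not its actual source; the `plan_jobs` helper name is invented for illustration:

```python
def plan_jobs(n_events_per_job, n_jobs):
    """Mirror the script's loop: one Job<JobNR> subdirectory per job,
    each generating n_events_per_job events."""
    return [("Job%d" % job_nr, n_events_per_job)
            for job_nr in range(1, n_jobs + 1)]

# Two jobs of 500 events each work in Job1 and Job2:
for job_dir, n_events in plan_jobs(500, 2):
    print(job_dir, n_events)
    # ... the real script would write the job-specific joboptions here
    #     and run Athena for both steps inside job_dir ...
```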
c) Produce 9 events in 1 job in interactive mode
- ./ShipOff_Pythia.py 9 1 1
Once the run is finished you can find in the directory Job1:
- ./Job1/AOD.pool.root
- ./Job1/TopNtupleV6.root
d) Produce 1,000 events in 2 batches of 500 events using LogFiles
- ./ShipOff_Pythia.py 500 2
Once the run is finished you can find the output files in:
- ./Job1/AOD.pool.root
- ./Job1/TopNtupleV6.root
- ./Job2/AOD.pool.root
- ./Job2/TopNtupleV6.root
- Finished!
Finally, by changing the number of events to 1,000 and the number of jobs to 10, you will produce 10,000 events.
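As a quick sanity check on the bookkeeping: since <Nevents> is the number of events per job, the total event count scales with the number of jobs, and each job directory ends up with the two output files listed above. A minimal sketch (the helper names are invented for illustration):

```python
def total_events(n_events_per_job, n_jobs):
    # <Nevents> is per job, so the total is Nevents * Njobs.
    return n_events_per_job * n_jobs

def expected_outputs(n_jobs):
    # Each Job<JobNR> directory holds an AOD and a TopNtuple.
    files = []
    for j in range(1, n_jobs + 1):
        files.append("./Job%d/AOD.pool.root" % j)
        files.append("./Job%d/TopNtupleV6.root" % j)
    return files

print(total_events(1000, 10))  # 10,000 events in total
print(expected_outputs(2))
```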