Generating Higgs To 4 Muons at NIKHEF
An exercise to simulate Higgs production events at the LHC, where the Higgs boson decays into two Z bosons that each decay into two muons.
The exercise is meant as a starting point for the 'monkey-see monkey-do' technique. In this example we will use AtlFast for the detector simulation and reconstruction. We will produce an AOD that contains the MC truth and the reconstructed AtlFast objects. Since the AOD is in POOL format, we will also transform the AOD into an Ntuple, which allows a simple analysis program to be constructed in ROOT.
Note: We assume you have CMT and Athena set up at NIKHEF; see Starting with CMT and Athena at NIKHEF.
1) Setting up the ATLAS environment at NIKHEF (general)
First set up the ATLAS environment at NIKHEF. Follow the instructions on: ATLAS setup at NIKHEF.
2) Setting up the Package required to produce Ntuples from the AOD
To produce Ntuples from an AOD you'll need to add an additional package created at NIKHEF.
Full description: Simply follow the instructions for Installing the AOD->Ntuple (TTBarAnalysis) package.
Simple description to get started quickly: You can install the package in the same place as you put your TestRelease.
- Go to the directory where you want to install the package.
cd /project/atlas/users/<your_login_name>
- Check out the TTBarAnalysis package from the NIKHEF/ATLAS CVS repository: cvs -d/project/atlas/cvs co TTBarAnalysis
- Go to the cmt directory: cd TTBarAnalysis/cmt
- Execute cmt config
- Execute source setup.csh
- Build the library: gmake
Once this is set up, you should be able to produce Ntuples from an AOD.
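Put together, the installation is a single short session (a sketch of the steps above; adjust the install location if you use a different area):
cd /project/atlas/users/<your_login_name>
cvs -d/project/atlas/cvs co TTBarAnalysis
cd TTBarAnalysis/cmt
cmt config
source setup.csh
gmake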
3) Producing 10 events
- a) Go to your favorite area and create a running directory
At NIKHEF a logical place would be your project disk:
- cd /project/atlas/users/<your_login_name>
- mkdir MyGeneration
- b) Create your joboptions file
- Each Athena job requires a joboptions file as input. Here we will create a joboptions file that will:
- Define which algorithms to run (in our case Pythia and Atlfast)
- Define the Pythia settings
- Define output parameters/ntuples
- Create a file called joboptions_HiggsGeneration.py. You can download the file from: joboptions_HiggsGeneration.py.
- c) Get additional steering files
- There are some required input files that you have to get. The way to obtain them is using the command get_files.
- get_files PDGTABLE.MeV
- get_files PartPropSvc.py
- get_files AtlfastStandardOptions.py
- d) Run Athena
- Running Athena is now a single line: athena.py -bs joboptions_HiggsGeneration.py
- Note: The "-bs" flag is optional and tells athena to print out all commands it is processing:
- e) Check the output
- An output file called HiggsNtuple.root has now been produced in the directory. When you look in the file you'll find the Tree for the CBNT (MC truth) and the one for AtlFast (the event in ATLAS). Now you only have to write a ROOT macro to read the file and do your analysis.
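- To get a first look at the two Trees before writing a macro, you can open the file interactively in ROOT (a sketch; .ls lists the contents of the opened file):
root -l HiggsNtuple.root
root [1] .ls
root [2] new TBrowser()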
- Finished!
4) Producing 10,000 events in 10 sets of 1000
With the joboptions file in the example above you have produced 10 events, but for a larger production you will want to automate things a bit:
Create a new joboptions file for each job, each having:
- A unique random number seed for Pythia
- A user defined number of events
- An output ntuple that is different for each job
- Output and logfiles stored in a separate directory
To do just this a small script has been created.
- a) Create a BASICS-directory
- Create a directory with the basic information you require.
cd MyGeneration/
mkdir HiggsGen_BASICS
cp PDGTABLE.MeV ./HiggsGen_BASICS/
cp PartPropSvc.py ./HiggsGen_BASICS/
cp AtlfastStandardOptions.py ./HiggsGen_BASICS/
- We also copy the joboptions file there and rename it
cp joboptions_HiggsGeneration.py ./HiggsGen_BASICS/joboptions_HiggsGeneration.py.BASIC
- b) Edit the standard joboptions file
cd MyGeneration/HiggsGen_BASICS
To allow the script to change the job-dependent settings, we now have to change 3 lines in the joboptions file joboptions_HiggsGeneration.py.BASIC:
- 1) Change Number of events
theApp.EvtMax = 10
changes to:
theApp.EvtMax = XNEVENTSX
- 2) Change Random number seeds for Pythia
AtRndmGenSvc.Seeds = ["PYTHIA 5769791 690419913", "PYTHIA_INIT 690501 4106941"]
changes to:
AtRndmGenSvc.Seeds = ["PYTHIA XRNDPYTHIA0X XRNDPYTHIA1X", "PYTHIA_INIT XRNDPYTHIA2X XRNDPYTHIA3X"]
- 3) Change the name of the output file (and output directory)
NTupleSvc.Output = [ "FILE1 DATAFILE='./HiggsNtuple.root' OPT='NEW'" ]
changes to:
NTupleSvc.Output = [ "FILE1 DATAFILE='./XOutputDirCBNTX/HiggsNtuple.JobXJOBNUMBERX.root' OPT='NEW'" ]
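The script performs these substitutions for you, but for illustration, producing the joboptions file for a single job is equivalent to something like the following (the event count and seeds are example values, and the Job1 directory and output file names are assumptions):
sed -e 's/XNEVENTSX/1000/' \
    -e 's/XRNDPYTHIA0X/5769791/' -e 's/XRNDPYTHIA1X/690419913/' \
    -e 's/XRNDPYTHIA2X/690501/' -e 's/XRNDPYTHIA3X/4106941/' \
    -e 's/XOutputDirCBNTX/Job1/' -e 's/XJOBNUMBERX/1/' \
    joboptions_HiggsGeneration.py.BASIC > joboptions_HiggsGeneration.Job1.py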
- c) Create an output directory (Ntuples and Logfiles)
To store the ntuples and logfiles we create an output directory:
cd MyGeneration/
mkdir HiggsGen_OUTPUT
mkdir HiggsGen_OUTPUT/InputAndLogfiles
- d) Get the script and tailor it to your needs
- First copy the main script to your running directory
cd /project/atlas/users/<your_login_name>/MyGeneration/
cp /user/ivov/Higgs_Tutorial_Files/ShipOff_HiggsGen.py .
- The main user control flags that need to be edited are listed at the top of the script, in the routine Get_InputParameters:
Nevents_joboptions = 100   # number of events to be generated per job
Njobs = 2                  # number of jobs
f_LogFile = 0              # write to screen or logfile
steering_files_dir = "/data/atlas/users/<your_login_name>/scratch/MyGeneration/HiggsGen_BASICS/"   # basic input files
output_dir = "/data/atlas/users/<your_login_name>/scratch/MyGeneration/HiggsGen_OUTPUT/"   # output directory
- By default, for job 1 a directory called Job1 is produced that contains the files from HiggsGen_OUTPUT/ and a unique joboptions file (50 events with a unique random number sequence for Pythia). In the directory a link is put to the HiggsGen_BASICS/ directory. After the job is finished, the Ntuple called HiggsNtuple.Job1.root is put in that directory. For job number 2 a similar thing happens.
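- If all goes well, once the jobs have finished you might expect the output area to look something like this (illustrative; the exact layout depends on the cleanup settings in step e below):
ls HiggsGen_OUTPUT/
HiggsNtuple.Job1.root  HiggsNtuple.Job2.root  InputAndLogfiles/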
- e) The real thing (logfiles)
- Now, once this is running, you might want to change 2 more things.
- First, you should opt for the automatic logfile:
f_LogFile = 1 # Logfile yes/no
- Then you should remove the # from the line
#CleanUp_Job(i_file)
- This ensures that at the end of the job both the logfile and the joboptions file for that job are copied to the directory "HiggsGen_OUTPUT/InputAndLogfiles", and that the job directory itself is removed.
Finally, changing the number of events to 1000 and the number of jobs to 10, you will produce 10,000 events.
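For that full production, the control flags from step d) would then read (same routine Get_InputParameters; the comment wording is ours):
Nevents_joboptions = 1000   # number of events to be generated per job
Njobs = 10                  # 10 jobs of 1000 events each: 10,000 events in total
f_LogFile = 1               # logfile yes/no (1 = yes)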