Generating Higgs To 4 Muons at NIKHEF

An exercise to simulate Higgs production events at the LHC, where the Higgs boson decays into two Z bosons that each decay into two muons.

The exercise is meant as a starting point for the 'monkey-see monkey-do' technique. In this example we will use AtlFast for the detector simulation and reconstruction. We will produce an AOD that contains the MC truth and the reconstructed AtlFast objects. Since the AOD is in POOL format we will also transform the AOD into an Ntuple, which allows a simple analysis program to be constructed in ROOT.

Note: We assume you have the CMT and Athena set-up at NIKHEF in order: Starting with CMT and Athena at NIKHEF.

== 1) Setting up the ATLAS environment at NIKHEF ==

:'''a) Setting up the general ATLAS environment at NIKHEF (first time only)'''

First set up the ATLAS environment at NIKHEF. Follow the instructions on: ATLAS setup at NIKHEF.

:'''b) Setting up the package required to produce Ntuples from the AOD (first time only)'''

To produce Ntuples from an AOD you'll need to add an additional package created at NIKHEF.

  1. Go to the directory where you want to install the package: cd /project/atlas/users/<your_login_name>
  2. Check out the TTBarAnalysis package from the NIKHEF/ATLAS CVS repository: cvs -d /project/atlas/cvs co TTBarAnalysis
  3. Go to the cmt directory: cd TTBarAnalysis/cmt
  4. Execute cmt config
  5. Execute source setup.csh
  6. Build the library: gmake

You can also get a more detailed set of instructions from Installing the AOD->Ntuple (TTBarAnalysis) package.

Once this is set up you can produce TopNtuples from an AOD.

:'''c) Setting up all required packages (every time)'''

On every login you should make sure the shell knows where to find the various programs, which means both the general ATLAS setup and the Ntuple-making package. You can do this by sourcing a script similar to init1002.csh in every window where you want to do the generation: source init1002.csh
== 2) Generating Higgs events ==

:'''a) Download the scripts'''

Go to your favorite area, create a running directory and download the code. At NIKHEF a logical place would again be your project disk:
 
: <font color=red> cd /project/atlas/users/<your_login_name> </font>
: <font color=red> cvs -d /project/atlas/cvs co Higgs4MuonAnalysis </font>
: <font color=red> cd Higgs4MuonAnalysis </font>
  
:Let's have a look at what files are in the package.

Athena requires steering files telling it what to do. These files are called joboptions files, and since this exercise is made up of 2 steps we need 2 joboptions files:

1) From Pythia -> AOD:        <font color=blue>jobOptions_Pythia_To_Atlfast_To_AOD_BASIC.py</font>

2) From AOD    -> TopNtuple:  <font color=blue>jobOptions_AOD_to_Ntuple_BASIC.py</font>

Then there is the main running script and a steering file that Athena requires for various tasks:

3) <font color=blue>ShipOff_Pythia.py</font>  The script that generates the events

4) <font color=blue>PDGTABLE.MeV</font>  A steering file required for MC production in Athena (not to be edited)

:Finished!

== 3) Producing 10,000 events in 10 sets of 1000 ==

When using the joboptions file in the example above you have produced only 10 events; when you want a larger production you will want to automate things a bit. Create a new joboptions file for each job, each having

* A unique random number seed for Pythia
* A user-defined number of events
* An output ntuple that is different for each job
* Output and logfiles stored in a separate directory

To do just this a small script has been created.
  
:'''a) Create a BASICS-directory'''
 
 
: Create a directory with the basic information you require.
 
  <font color=red>
 
  cd MyGeneration/
 
  mkdir HiggsGen_BASICS
 
  cp PDGTABLE.MeV              ./HiggsGen_BASICS/
 
  cp PartPropSvc.py            ./HiggsGen_BASICS/
 
  cp AtlfastStandardOptions.py  ./HiggsGen_BASICS/
 
  </font>
 
:We also copy the joboptions file there and rename it:
 
  <font color=red>
 
  cp joboptions_HiggsGeneration.py ./HiggsGen_BASICS/joboptions_HiggsGeneration.py.BASIC
 
  </font>
 
  
:'''b) Edit the standard joboptions file'''
 
  
  
  <font color=red>
 
  cd MyGeneration/HiggsGen_BASICS
 
  </font>
 
 
To allow the script to change the job-dependent settings, we now have to change three lines in the joboptions file joboptions_HiggsGeneration.py.BASIC:
 
 
:1) Change the number of events

<font color=blue>theApp.EvtMax = 10</font>   changes to ->

<font color=blue>theApp.EvtMax = XNEVENTSX</font>


:2) Change the random number seeds for Pythia

<font color=blue>AtRndmGenSvc.Seeds = ["PYTHIA 5769791 690419913", "PYTHIA_INIT 690501 4106941"]</font>   changes to ->

<font color=blue>AtRndmGenSvc.Seeds = ["PYTHIA XRNDPYTHIA0X XRNDPYTHIA1X", "PYTHIA_INIT XRNDPYTHIA2X XRNDPYTHIA3X"]</font>


:3) Change the name of the output file (and the output directory)

<font color=blue>NTupleSvc.Output = [ "FILE1 DATAFILE='./HiggsNtuple.root' OPT='NEW'" ]</font>   changes to ->

<font color=blue>NTupleSvc.Output = [ "FILE1 DATAFILE='./XOutputDirCBNTX/HiggsNtuple.JobXJOBNUMBERX.root' OPT='NEW'" ]</font>
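
:For illustration only (the production script does this for you): a minimal Python sketch of how these placeholders can be replaced to build a job-specific joboptions file. The helper name <font color=blue>make_joboptions</font> is hypothetical, not a routine of the package.

<pre>
# Illustration only -- not the actual production script.
# A job-specific joboptions file can be made from the .BASIC template by
# replacing the placeholders above; 'make_joboptions' is a hypothetical name.
import random

def make_joboptions(template, outfile, n_events, job_number, output_dir):
    text = open(template).read()
    for placeholder, value in [
            ("XNEVENTSX",        str(n_events)),
            ("XRNDPYTHIA0X",     str(random.randint(1, 900000000))),
            ("XRNDPYTHIA1X",     str(random.randint(1, 900000000))),
            ("XRNDPYTHIA2X",     str(random.randint(1, 900000000))),
            ("XRNDPYTHIA3X",     str(random.randint(1, 900000000))),
            ("XJOBNUMBERX",      str(job_number)),
            ("XOutputDirCBNTX",  output_dir)]:
        text = text.replace(placeholder, value)
    open(outfile, "w").write(text)

# e.g. for job 1: 1000 events, ntuple written to ./Job1/HiggsNtuple.Job1.root
make_joboptions("joboptions_HiggsGeneration.py.BASIC",
                "joboptions_HiggsGeneration.py.Job1", 1000, 1, "Job1")
</pre>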
 
 
:'''c) Create an output directory (Ntuples and Logfiles)'''
 
 
  To store the ntuples and Logfiles we create an output directory:
 
  <font color=red>
 
  cd MyGeneration/
 
  mkdir HiggsGen_OUTPUT
 
  mkdir HiggsGen_OUTPUT/InputAndLogfiles
 
  </font>
 
 
:'''d) Get the script and tailor it to your needs'''
 
 
:First copy the main script to your running directory:
 
  <font color=red>
 
  cd /project/atlas/users/<your_login_name>/MyGeneration/
 
  cp /user/ivov/Higgs_Tutorial_Files/ShipOff_HiggsGen.py .
 
  </font>
 
 
: The main user control flags that need to be edited are listed at the top of the script, in the routine <font color=blue>Get_InputParameters</font>:
 
<font color=blue>
 
<pre>
 
  Nevents_joboptions  =    100  # number of events to be generated per job
 
  Njobs              =      2  # Number of jobs
 
  f_LogFile          =      0  # Write to screen or logfile
 
 
  steering_files_dir  = "/data/atlas/users/<your_login_name>/scratch/MyGeneration/HiggsGen_BASICS/"    # basic input files
 
  output_dir          = "/data/atlas/users/<your_login_name>/scratch/MyGeneration/HiggsGen_OUTPUT/"    # output directory
 
</pre>
 
</font>
 
 
:By default, for job 1 a directory called Job1 is produced that contains the files from HiggsGen_OUTPUT/ and a unique joboptions file (with the requested number of events and a unique random number sequence for Pythia). In that directory a link is put to the HiggsGen_BASICS/ directory. After the job is finished the ntuple, called HiggsNtuple.Job1.root, is put in that directory. For job number 2 the same thing happens.
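
:To make this concrete, here is a minimal sketch of that per-job bookkeeping. It only illustrates the behaviour described above and is not the code of <font color=blue>ShipOff_HiggsGen.py</font>; it reuses the hypothetical <font color=blue>make_joboptions</font> helper sketched in section b) and assumes the standard <font color=blue>athena</font> command from your Athena setup.

<pre>
# Illustration only -- not the actual ShipOff_HiggsGen.py.
# One directory per job, a link to HiggsGen_BASICS/, a job-specific
# joboptions file, then run Athena on it.
import os

def run_job(i_job, n_events, basics_dir):
    job_dir = "Job%d" % i_job
    os.mkdir(job_dir)
    # put a link to the HiggsGen_BASICS/ directory in the job directory
    os.symlink(basics_dir, os.path.join(job_dir, "HiggsGen_BASICS"))
    # joboptions with unique seeds, the requested number of events and output name
    joboptions = os.path.join(job_dir, "joboptions_HiggsGeneration.py")
    make_joboptions(os.path.join(basics_dir, "joboptions_HiggsGeneration.py.BASIC"),
                    joboptions, n_events, i_job, job_dir)
    os.system("athena %s" % joboptions)
    # the ntuple HiggsNtuple.Job<i>.root now sits in the job directory

for i_job in range(1, 3):   # Njobs = 2, Nevents_joboptions = 100 in the defaults above
    run_job(i_job, 100, os.path.abspath("HiggsGen_BASICS"))
</pre>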
 
 
:'''e) The real thing (logfiles)'''
 
 
:Now, once this is running, you might want to change 2 more things.
 
 
:First, you should opt for the automatic logfile:
 
<font color=blue>
 
<pre>
 
f_LogFile          =      1  # Logfile yes/no
 
</pre>
 
</font>
 
  
:Then you should remove the # from the line:
 
<font color=blue>
 
<pre>
 
#CleanUp_Job(i_file)
 
</pre>
 
</font>
 
  
:This makes sure that at the end of each job both the logfile and the joboptions file for that job are copied to the directory "HiggsGen_OUTPUT/InputAndLogfiles", and that the job directory itself is removed.
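
:A minimal sketch of what this cleanup amounts to (illustration only; the real <font color=blue>CleanUp_Job</font> routine may differ in its arguments and in the logfile name, and we assume here that the ntuple is moved to the output directory before the job directory is deleted):

<pre>
# Illustration only -- what the CleanUp_Job step amounts to: keep the joboptions
# file and the logfile in HiggsGen_OUTPUT/InputAndLogfiles, then remove the job
# directory. The logfile name and the ntuple move are assumptions.
import os, shutil

def CleanUp_Job(i_job, output_dir):
    job_dir  = "Job%d" % i_job
    keep_dir = os.path.join(output_dir, "InputAndLogfiles")
    shutil.copy(os.path.join(job_dir, "joboptions_HiggsGeneration.py"), keep_dir)
    shutil.copy(os.path.join(job_dir, "athena.Job%d.log" % i_job), keep_dir)
    shutil.move(os.path.join(job_dir, "HiggsNtuple.Job%d.root" % i_job), output_dir)
    shutil.rmtree(job_dir)
</pre>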
 
  
  
 
Finally, changing the number of events to 1000 and the number of jobs to 10, you will produce 10,000 events with <math> H \rightarrow ZZ^* \rightarrow \mu^+ \mu^-\mu^+ \mu^-</math>.
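
In terms of the control flags in <font color=blue>Get_InputParameters</font> this corresponds to settings like the following (keep the directories you chose in step d):

<pre>
  Nevents_joboptions  =   1000  # 1000 events per job
  Njobs               =     10  # 10 jobs of 1000 events -> 10,000 events in total
  f_LogFile           =      1  # write a logfile per job (see step e)
</pre>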
