CSC Ntuple production
Setting up Grid tools
This wiki page describes how to set things up for CSC ntuple production on the Grid. You need a Grid certificate [1] and some patience ;-)
- Two tips:
- use voms-proxy-init instead of grid-proxy-init
- use bash for shell scripts
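For example, a minimal proxy session looks like this (the VO name atlas is the standard ATLAS VO; adjust the validity to taste):
# create a VOMS proxy with ATLAS VO attributes, valid for 48 hours
voms-proxy-init -voms atlas -valid 48:00
# inspect the proxy (remaining lifetime, VO attributes)
voms-proxy-info -all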
GridTools
Wouter wrote some very nice tools to submit/retrieve jobs from the Grid. They can be obtained from the Nikhef CVS:
cvs -d /project/atlas/cvs co GridTools
The GridTools package contains a few shell scripts:
- dmgr
- to manage datasets on the Grid
- gpm
- to set up packages on the SE for Grid jobs (to prevent job sizes exceeding the input sandbox limit)
- gridmgr
- main tool to manage Grid jobs (submit/retrieve/cancel/monitor)
- gridinfo
- a funny tool (lcg-infosites seems more useful; see the example after this list)
- jobmgr
- to define Grid jobs (with the possibility to run them locally for testing)
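For example, standard lcg-infosites queries for the ATLAS VO look like this:
# list computing elements available to the ATLAS VO
lcg-infosites --vo atlas ce
# list storage elements and their free/used space
lcg-infosites --vo atlas se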
The dmgr and gpm scripts possibly need to be adjusted:
LFC_HOST = lfc03.nikhef.nl
GRID_SE  = tbn18.nikhef.nl
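As a quick check that the catalogue is reachable, you can list your user area with the standard LFC client tools (a sketch; the path follows the gpm layout described below):
export LFC_HOST=lfc03.nikhef.nl
# list the gpm area in the ATLAS file catalogue
lfc-ls -l /grid/atlas/users/$USER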
GridModules
GridModules are packages which will be installed in /grid/atlas/users/${USER}/gpm. These packages reside on the SE (tbn18.nikhef.nl) and can be used by jobs. This prevents submitting (too) large jobs; note that the input sandbox is limited to roughly 20-50 MB.
GridModules are available from CVS:
cvs -d /project/atlas/cvs co GridModules
A few examples of modules are present (interesting study material).
- make_package
- With this tool, a directory is tarred and stored on the SE (using gpm from the GridTools). Note that the file ~/.gpmrc keeps track of which modules have been installed. When running jobs locally, the job will use the locally available package instead of the one installed on the SE. Be careful not to include a trailing slash '/' when making the package!
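A minimal example, using the DataManager module described below (note the absence of a trailing slash):
# correct: directory name without a trailing slash
make_package DataManager
# wrong: shell tab completion tends to add the slash
# make_package DataManager/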
- DataManager
- This package is used to copy/move/delete and define datasets on the SE. You will definitely need this package. But before running make_package DataManager, edit DataManager/run to set LFC_HOST and GRID_SE to the desired addresses.
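A sketch of the relevant lines in DataManager/run, assuming the two variables appear verbatim in the script (the values are the Nikhef ones quoted above):
# in DataManager/run: point to the Nikhef catalogue and storage element
LFC_HOST=lfc03.nikhef.nl
GRID_SE=tbn18.nikhef.nl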
- AtlasRelease
- To use the Athena software on the Grid, the desired release has to be set up. This module actually contains only one file, setup.sh, which is similar to the setup script you run when setting up Athena locally. (Part of this script indeed sets up Athena locally, so you can also run/test jobs locally.) A few changes have taken place since release 12; my AtlasRelease12.0.6/setup.sh:
#!/bin/sh
#
# Setup the environment for ATLAS release 12.0.6
#
# gossie@nikhef.nl
#

LOCAL_AREA=/data/atlas/offline/12.0.6

# --- Clear command line to avoid CMT confusion ---
set -

if [ "$VO_ATLAS_SW_DIR" != "" ] ; then
    # --- Follow the GRID approach ---
    echo "Setting up ATLAS release 12.0.6 from VO_ATLAS_SW_DIR=$VO_ATLAS_SW_DIR"
    . $VO_ATLAS_SW_DIR/software/12.0.6/setup.sh
    . ${SITEROOT}/AtlasOffline/12.0.6/AtlasOfflineRunTime/cmt/setup.sh
    CMTPATH="${PWD}:${CMTPATH}"
elif [ -d $LOCAL_AREA ] ; then
    # --- Follow the local approach ---
    echo "Setting up ATLAS release 12.0.6 from LOCAL_AREA=$LOCAL_AREA"
    . $LOCAL_AREA/setup.sh
    . ${SITEROOT}/AtlasOffline/12.0.6/AtlasOfflineRunTime/cmt/setup.sh
    CMTPATH="${PWD}:${CMTPATH}"
else
    # --- ERROR: Don't know where release is! ---
    echo "ERROR setting up ATLAS release 12.0.6, cannot find release"
    echo "Release_12.0.6_Not_Found" > errorcode
fi
- Note that release 12.0.7 is the final release of the 12 series (but it is not installed on all Grid machines yet)
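To test the module locally before installing it with make_package, you can source the script and check for the failure marker it writes (a hypothetical quick test; the errorcode file is the one written in the else branch above):
# source the setup script and check for the failure marker
. AtlasRelease12.0.6/setup.sh
if [ -f errorcode ] ; then
  echo "setup failed: $(cat errorcode)"
fi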
- TopView module
- Then it's time for an 'analysis' package (OK, dumping AODs into ntuples is not really 'analysis', but eh... what's the difference?). For CSC ntuple production here at Nikhef, the TopViewAODtoNtuple-00-12-13-03 package can be used.
- The steps involved in creating the package (for reference; a consolidated shell sketch follows this list):
- check latest version of EventView group area: http://atlas-computing.web.cern.ch/atlas-computing/links/kitsDirectory/PAT/EventView/
- wget http://atlas-computing.web.cern.ch/atlas-computing/links/kitsDirectory/PAT/EventView/EventView-12.0.6.8.tar.gz
- strip unnecessary files/directories (only InstallArea and PhysicsAnalysis are needed). A little complication arises if an incorrect TopView version is in the package: compile the TopView libraries locally and copy them to InstallArea.
- tar -cvzf EventView-12.0.6.8_nikhef.tar.gz EVTags-12.0.6.8/ and copy the tarball to the TopViewAODtoNtuple-00-12-13-03 module directory.
- check run scripts in the TopViewAOD module (adjust version numbers!)
- check LocalOverride_Nikhef_BASIC.py
- make_package TopViewAODtoNtuple-00-12-13-03
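Putting the steps together as one shell sketch (for reference only; the kept directories and version numbers follow the list above, and the exact cleanup may differ per EventView release):
# fetch and unpack the EventView group area tarball
wget http://atlas-computing.web.cern.ch/atlas-computing/links/kitsDirectory/PAT/EventView/EventView-12.0.6.8.tar.gz
tar -xzf EventView-12.0.6.8.tar.gz
# strip everything except InstallArea and PhysicsAnalysis
cd EVTags-12.0.6.8
ls | grep -v -e '^InstallArea$' -e '^PhysicsAnalysis$' | xargs rm -rf
cd ..
# repack and move the tarball into the module directory
tar -cvzf EventView-12.0.6.8_nikhef.tar.gz EVTags-12.0.6.8/
mv EventView-12.0.6.8_nikhef.tar.gz TopViewAODtoNtuple-00-12-13-03/
# install the module on the SE
make_package TopViewAODtoNtuple-00-12-13-03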