General Code

Setup Athena

Kit Method

Make Area

- Go to your work area, e.g. for me I go to $HOME/ATLAS/testarea

> mkdir 16.0.2 (or whatever code version you are setting up)
> cd 16.0.2
> mkdir cmthome
> cd cmthome


- Now you need to set up your Requirements file.

> xemacs Requirements

- Then inside this file write:


#---------------------------------------------------------------
set CMTSITE CERN
set SITEROOT /afs/
macro ATLAS_DIST_AREA ${SITEROOT}/atlas/software/dist
macro ATLAS_TEST_AREA ${HOME}/scratch0/
use AtlasLogin AtlasLogin-* $(ATLAS_DIST_AREA)


#---------------------------------------------------------------------
set CMTSITE STANDALONE
set SITEROOT /scratch1/atlas-sw/16.0.2
macro ATLAS_TEST_AREA /scratch0/$USER/ATLAS/analysis/testarea
apply_tag projectArea
apply_tag opt
apply_tag setup
apply_tag simpleTest
use AtlasLogin AtlasLogin-* $(ATLAS_DIST_AREA)
set CMTCONFIG i686-slc4-gcc34-opt

Setting up CMT

- Now you want to load in this Requirements file and check out some packages with CMT, so still within cmthome do:

> source /afs/
> cmt config
> source cmthome/ -tag=16.0.2,setup,32
> cd ..
> cmt show versions PhysicsAnalysis/AnalysisCommon/AnalysisExamples
> cmt co -r AnalysisExamples-00-20-39 PhysicsAnalysis/AnalysisCommon/AnalysisExamples
> cd PhysicsAnalysis/AnalysisCommon/AnalysisExamples/cmt
> gmake

- This should then run through and install all the packages ready to run.

Every Time you Start

- Every time you open a terminal and want to set up your working area...

> cd $HOME/ATLAS/testarea/16.0.2/cmthome
> source -tag=16.0.2,setup,32

Running Code

- Once setup, you are ready to run some code!

> cd $HOME/ATLAS/testarea/16.0.2/PhysicsAnalysis/AnalysisCommon/AnalysisExamples/run

> >& Example.log &

- Note that some example jobOptions will be in the share folder.

Setting up Nightlies

So you can ls /afs/ (echo $CMTPATH) to find the latest nightlies. For fully flexible and ever-changing releases, set up an area as you normally would, but as 15.6.X.Y, or 15.6.X.Y-VAL, etc. as appropriate, then:

source -tag=AtlasProduction,15.6.X.Y-VAL,rel_1

For just a cached nightly you can, in your normal area i.e. 15.6.1, simply do:

source -tag=AtlasProduction,,setup,runtime,opt,32

Asetup Method

Asetup allows you to set up a release at your home institute without downloading the release kit each time! (N.B. these instructions currently apply only to linappserv0 on the RHUL server.)


First you have to source some files and make a testarea as you usually would, along with a directory with your release name.

> export ATLAS_LOCAL_ROOT_BASE=/cvmfs/
> source ${ATLAS_LOCAL_ROOT_BASE}/user/

> mkdir /user/ATLAS/testarea
> mkdir /user/ATLAS/testarea/AtlasOffline-16.6.3

Then you can source your setup like:

> asetup AtlasProduction,16.6.3 --testarea=/home/hayden/ATLAS/testarea/

If you don't know what version you want yet then you can use:

> showVersions --show=athena

to show you the available releases.

Every Time you Start

Every time you start a session, to get your area working properly you have to do:

> export ATLAS_LOCAL_ROOT_BASE=/opt/atlas/software/manageTier3SW/ATLASLocalRootBase
> source ${ATLAS_LOCAL_ROOT_BASE}/user/
> asetup AtlasProduction,16.6.3 --testarea=/home/hayden/ATLAS/testarea/

Now you are ready to go!


Packages work exactly the same as with the kit version so follow on from there.

CMT Not Working

If you are having problems with CMT, e.g. it asks you for a password (in my case the username it tried to use was my linappserv1 one, so no password worked), then do the following:


Where lxplus-username is.... your lxplus username!!!!

Get Athena Job Options

To download Athena jobOptions (from LXR you only get an HTML version!), set up your Athena area and then simply do:

get_files -jo

This should download the file to your current location. Note you can also use the -list option to list, but not copy, files, enabling you to use wildcarding if you are not sure which jobOptions file you want.

Analysis Tips


  • To apply the OTX Cut, in your area on lxplus you need to copy the files checkOQ.C and checkOQ.h from /afs/ along with ALL of the Object Quality Map root files. Copy these files to the Header folder of your analysis.
  • Change the internals of the .C and .h to match your area.
  • In your Analysis .cxx file add:

#include "AnalysisExamples/checkOQ.h"
#include "AnalysisExamples/checkOQ.C"

Within Execute Section:

int OTX = egammaOQ::checkOQCluster(Event_Run_Number,eta1,phi1,1);

This returns 3 if the check failed!

NB Look in the .h and .C files for description of this function and others to decide what you want to use.

Running On The Grid (pAthena)

Setup Grid

source ~/cmthome/ -tag=15.6.9,32,setup

source /afs/


Running on the Grid

Once Setup:

cd $TestArea

pathena --inDS InputDataSetName --outDS user10.myGridName.OutputDataSetName

Now suppose you want to run with multiple datasets on one job, all you have to do is separate your datasets with a comma, i.e.:

--inDS dataset1,dataset2,dataset3...datasetN

Note that even though you take out the "path" part in your job options file, you should still leave the glob with wildcards for /dat* + AOD* etc.

It is also useful to include the two options below...

To do the Build in a tmp space and thus not use up your disk quota: --tmpDir /tmp

To Ensure the Grid has the latest Database Release: --dbRelease ddo.000001.Atlas.Ideal.DBRelease.v100402:DBRelease-10.4.2.tar.gz

If the Grid is complaining the Jobs are too big you can simply add the lines:

--nFilesPerJob 10
--nGBPerJob 10

Lastly, you can choose or exclude a specific grid site, using:


All of these options are simply appended to the command line, in no particular order.
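Put together, a full submission might look like the sketch below (the dataset and user names are placeholders; the command is built in a variable and echoed so you can inspect it before actually running it):

```shell
# Assemble the pathena options discussed above into one command string
CMD="pathena --inDS data.set.one,data.set.two"
CMD="$CMD --outDS user10.myGridName.OutputDataSetName"
CMD="$CMD --tmpDir /tmp --nFilesPerJob 10"

echo "$CMD"   # inspect; run it by hand once you are happy
```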

GoodRunList (GRL)

If you want to run with a good runs list, you first need to download the specific GoodRunsList.xml, and secondly modify your jobOptions file to parse the list. This basically consists of adding the contents of THIS file into your jobOptions, paying close attention to mainJob and the like; this is the most important part of running with the good run list. You can also use the list to select the datasets you run over, as shown below:

pathena --goodRunListXML data10_7TeV.periodA.152166-153200_Zeechannel.xml --outDS user.gridname.LatestDataWork

Some useful Options to add are:

To Search the GoodRunList with a String to syphon Streams:
--goodRunListDS "physics_L1Calo.merge.AOD"

NB You could always make your own GoodRunList with ATLAS run query, searching for what you determine to be "Good" HERE and outputting as a .xml file.

NBB You will still have to follow a part of the Twiki that tells you a few lines to add in your jobOptions so that your code will also only pick up the good luminosity blocks!

Physics Containers (PhysCont)

These are the recommended way to run over data. These datasets collect all of the runs needed for a given period and, importantly, remove duplicate datasets with different tags. You can dq2-ls the datasets to see the different reprocessing versions (always take the latest, but check the code version, i.e. Rel15 or Rel16, on AMI). For example:


This is the physics container for Period A, in the L1Calo stream with ESDs. All of these components can be changed to what you want: periods, streams, datatype, the lot. The GRL settings in your jobOptions will make sure to filter out only the good runs!

Grid Job Monitoring

Then you can check your job's progress online (searching for your job ID or name) at:

Retrieving Grid Jobs

To Retrieve your Output:

dq2-get user10.myGridName.OutputDataSetName/

To only download the .root files without the logs you can simply use:

dq2-get -f \*.root\* user10.myGridName.OutputDataSetName/

NB the \ is necessary so that the command line passes the wildcard * through to dq2-get rather than expanding it itself. Also note that I put a \* after the root: any retried job will have .1, .2, etc. appended, which won't get downloaded if you simply use \*.root.

Cancelling/Retrying Grid Jobs

In your terminal simply type;

> pbook

This will open an interface from which you can find your job using its job id, i.e. JobID:40, and cancel it using the GUI. If instead of the GUI you get a command-line interface, then, entering your jobID number between the brackets, you can use the commands below:

>>> show()
- Shows your Job Details

>>> retry()
- Retries Failed Subjobs

>>> kill()
- Kills your Job

>>> help()
- Shows you the Help Commands


Apart from the tips mentioned in the grid sections, there are some other gems worth remembering!

  • If subjobs fail on your job, wait for the job to finish, then do pbook, retry(jobID). This will retry all failed subjobs.
  • If a build job fails, causing some "cancelled" subjobs, I wait until the rest of the job has finished, then retry any normal subjobs that have failed. This leaves you with your failed build job and the associated subjobs that were cancelled. You then resubmit your dataset to the grid in exactly the same way you originally sent your job, with exactly the same input and output dataset names etc. This will only resubmit the jobs that have failed, and won't create duplicates, don't worry!
  • Finally, a subjob may get stuck at a site in the "prerun" state, or may simply keep failing at one site (every time you retry failed subjobs it sends them to the same site). Either wait for the jobs to fail if they keep failing, or, if they are stuck in prerun, kill(jobID), and then resubmit your job with the same input/output etc. as described in the previous point. This should fix things most of the time... if it wants to send the job to the same site again even after resubmitting and you really don't want it to, you can use the --excludeSite option to ensure that it doesn't!

Running On The Grid (prun)

Most of the setup is the same here, but if you want to send python or C++ compiled ROOT code to the grid, prun might be more what you're looking for!

C++ Compiled Root Macro

prun --bexe "make clean; make" --exec "./run.exe %IN" --inDS data12Blabla/ --nFilesPerJob=10 --match "*root*" --athenaTag= --outDS user.username.data12_8TeV.Run

Most of this should be self-explanatory: running this command drags all of the files associated with your main .exe onto the grid. You need to make your code on the node, which is done by the --bexe command (if your build is more complex than "make", you can write a bash script and give that as your --bexe, and it will be run on the node). If any files your code needs are not dragged to the grid by the command (perhaps it does not realise something in your makefile is needed, for example), you can add them to the net being dragged via the --extFile option.

Python Script

prun --exec "python %IN" --inDS data12.blabla/ --nFilesPerJob=10 --match "*root*" --athenaTag= --outDS user.username.outputname --outputs "skim_thin.root"

Grid Certificate!

Request Certificate and Retrieve

First you go through the process described here to request a certificate:

When your request has been accepted, you will receive an e-mail informing you of such, with a link to your certificate. You must click on the link WITHIN the browser from which you requested it, i.e. if you were on Linappserv1, then there. Once you have done this you should make a backup of the certificate, so that you can use it on lxplus too. Note these instructions are for Firefox; instructions for Windows can be found on the certificate website if needed.

Backup Certificate

1. On Windows, open the "Tools" menu and click "Options". On Linux, open the "Edit" menu and click "Preferences". On Mac, click the 'Firefox' menu and select 'Preferences'.
2. Navigate to the "Advanced" section, the “Encryption” tab and then the "Certificates" area. Click on the "View Certificates" button and then the "Your Certificates" tab.
3. Select the certificate that you wish to backup and click "Export" in Windows, or "Backup..." in Linux.
4. Select where you wish to place the backup and click "OK".
5. You may be asked for your Master Password. This is a Firefox feature that is used to protect all the certificates and passwords your browser stores. Type it in and click "OK".
6. You will be asked for a password for the backup. Make the quality meter reach as high as it can go. Click "OK".
7. Your certificate has been backed up with its private key. Click "OK".

Importing Certificate into another Browser

1. On Windows, open the "Tools" menu and click "Options". On Linux, open the "Edit" menu and click "Preferences". On Mac, open the "Firefox" menu and select "Preferences".
2. Navigate to the "Advanced" section and find the "Certificates" area. Click on the "Manage Certificates" button and then the "Your Certificates" tab.
3. Click the "Import" button.
4. Select the certificate that you wish to restore and click "Open".
5. You may be asked for your Master Password (a Firefox feature that protects all the certificates and passwords stored by your browser). Type it in and click "OK".
6. You should see a message telling you that the certificate was imported successfully. Click "OK".

Getting Certificate Ready for the Grid

> openssl pkcs12 -in CertificateBackup.p12 -clcerts -nokeys -out usercert.pem

It will now ask you for a password; it is the one you used when making the certificate.
> openssl pkcs12 -in CertificateBackup.p12 -nocerts -out userkey.pem

It will now ask you to make, and confirm, a password, which is recommended to be at least 16 characters long, including capitals and numbers.
All being well, you should then move (mv) your CertificateBackup.p12, usercert.pem and userkey.pem into the directory ~/.globus/, remembering to back up any files currently in there in case something goes wrong; I usually copy the original files (if any) into a directory called ~/.globus_old/. Finally you must do:

> chmod 644 usercert.pem

> chmod 400 userkey.pem

To set the correct user rights.
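The two openssl steps can be rehearsed end-to-end with a throwaway self-signed certificate before you touch your real one. Everything below (the file names, the inline pass:secret password) is a stand-in for illustration only:

```shell
# Make a disposable self-signed cert+key and bundle them as a .p12,
# standing in for the real CertificateBackup.p12
openssl req -x509 -newkey rsa:2048 -keyout key.pem -out cert.pem \
        -days 1 -nodes -subj "/CN=throwaway"
openssl pkcs12 -export -inkey key.pem -in cert.pem \
        -out CertificateBackup.p12 -passout pass:secret

# The two extraction steps from the guide (passwords supplied inline
# here only so the rehearsal is non-interactive)
openssl pkcs12 -in CertificateBackup.p12 -clcerts -nokeys \
        -out usercert.pem -passin pass:secret
openssl pkcs12 -in CertificateBackup.p12 -nocerts -nodes \
        -out userkey.pem -passin pass:secret

# And the permissions step
chmod 644 usercert.pem
chmod 400 userkey.pem
```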


Lxplus Setup

source cmthome/ -tag=15.6.9,setup,32

source /afs/

voms-proxy-init --voms atlas

RHUL Setup

source cmthome/ -tag=15.6.9,setup,32

source /afs/


voms-proxy-init --voms atlas

Useful Commands

To list datasets matching the wildcard:
dq2-ls data10_7TeV*AOD*

To list files within a dataset:
dq2-ls -f datasetname/

To list the sites at which a dataset replica is available:
dq2-ls -r datasetname/

To get datasets you can dq2 the whole thing with:
dq2-get datasetname/

or dq2 individual files with:
dq2-get -f filename datasetname/

This means you can wildcard files within datasets, which you may for instance want to do to download the .root files but not the log files associated with them. This would look like:

dq2-get -f \*.root\* DataSetName/

The \ before the * is needed to make sure the wildcard is interpreted by dq2 rather than expanded by your shell.
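The effect of the backslashes is easy to see locally, without any grid tools: an unescaped pattern is expanded by your shell before dq2-get ever sees it, while an escaped one reaches the command intact. A quick sketch with dummy file names:

```shell
# Create a few dummy files mimicking grid output names
mkdir -p wildcard_demo && cd wildcard_demo
touch job.root job.root.1 job.log

echo *.root*     # the shell expands this itself: job.root job.root.1
echo \*.root\*   # the literal pattern survives: *.root*
```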

You can check the online/offline status of a panda site searching THIS Page.

Advanced Commands

If you would like to store a dataset or large files on the grid, you can use dq2-put. You set up your dq2 as normal, but using:

export DQ2_LOCAL_SITE_ID=*Name of Institute Local Site*

Your local institute site, where you want the files to go and be accessible on the grid, can be found by using the command:


Once you have entered this you can then use the command:

dq2-put -s Directory_to_be_copied/ Dataset_Name

For example:
dq2-put -s /home/mydataset/ user.username.mydataset

This will create a new dataset on your local site with the dataset name specified (has to follow normal conventions), with the directory pointed to contained within it.

Monte Carlo Production Request

Please see my Data Samples Twiki Page for an Up to Date Guide!

MC Production Request Guide

Downloading from Liverpool Grid

ssh linappserv1
cd /scratch0/hayden/mc08/LiverpoolSamples
source /afs/
voms-proxy-init -voms atlas

Then you can do things like:

lcg-ls srm://

If you want to search for files, syphon only the AOD files and download, use the following:

mkdir $DATASET
for SRCFILE in `lcg-ls $SRC | grep AOD`; do
  lcg-cp srm://$SRCFILE $DSTFILE
done

The Brute force way of doing this for one file is:

lcg-cp srm:// file:/scratch0/hayden/mc08/LiverpoolSamples/$DATASET/data.30801.reco.AOD.GRAV.v14022003._00392.pool.root.190209.176

NB to check the path is right for the syphon method, you can put echo in front of lcg-cp.

Atlantis

To start up atlantis, go to lxplus, set up your area and then simply use the command:

java -jar /afs/

You can also go to THIS website to download atlantis.jar locally if you so wish. The first part of this guide will talk about getting events from ESD files, but it should be noted that ESDs are being phased out in favour of the RAW format, which I have found needs a slightly different treatment, so that explanation will come after the ESD one. This is a really easy way to get an event you want to see into a format that Atlantis will accept. Take THIS jobOptions file and then run using the command line below.

pathena --eventPickEvtList Events.txt --eventPickDS $DataSet --eventPickDataType ESD --outDS user.$username.$outputname --extOutFile "JiveXML_*.xml"

To run on RAW format I managed to get everything to work with these steps:

  • Set up in release AtlasProduction-

  • Download the package graphics/JiveXML using cmt.
Note these pointers:

  • After having done gmake, you can use the jobOptions in the share directory running with the command such as below:

pathena --eventPickEvtList Events.txt --eventPickDataType RAW --outDS user.yourgridname.AtlantisData11RAW_test --extOutFile "JiveXML*.xml" --eventPickStreamName physics_Egamma

Things to Note:

  • Events.txt should be a file containing a list of run numbers and event numbers you want to capture and turn into .xml files. For example for just one event you would write:

180400 17126264

Where the first number is the run number and the second is the event number; each extra event you want is a new line in the same format.

  • $Dataset should be replaced with the dataset name that you want to find the event in, i.e. if it was a muon event, and you want it from the ESD, search on AMI and use something like:


  • $username and $outputname should be self explanatory to anyone who has used the grid before.

  • The output you get back from the grid will be a .tgz which you untar, revealing a .xml for each event you have asked for which can be plugged straight into Atlantis!!!

  • Sometimes if there is a problem sending a job to the grid along the lines of not being able to find a file, you can try dropping the --eventPickDS, using just the run number / run number + event number in Events.txt. The .py will understand and search for the events regardless.

  • ESD's are beginning to be phased out, so if you are getting errors still mentioning not being able to find the event, then try running with the second method described using RAW.


SVN

Set up your area with the latest version, i.e. just source 17.0.0 or whatever you have, then make sure to set up your SVN path correctly, i.e.


Create Directories

To submit an initial directory (you have to be librarian), do:

svn import . $SVNROOT/RHULEx/ -m "initial import"

CheckIn / CheckOut

To Check Out the Package you want do: svn co $SVNROOT/Bla/BlaBla/Folder

for example:

svn co svn+ssh://

Make your changes...

Then to check back in simply do: svn ci -m "some message"

Note to include any new files, before checking back in you should also do: svn add filename, for each new file.

Delete Files

To delete files (be careful), do:

svn delete $SVNROOT/RHULEx/ -m "Correction"

Castor

Setup your area etc.... and then make sure you export as below:

export STAGE_SVCCLASS=atlcal

Then you are free to search directories using nsls, like:

nsls /castor/

And Download to your area using xrdcp, like:

xrdcp root://castoratlas//castor/ .

Note that to download you need to include the root://castoratlas// part.
Equally you can copy things to your Castor Area if you have one i.e. for me:

xrdcp LocalFile root://castoratlas//castor/

You can even do this with full directories, or use nsmkdir. To delete files simply use:

nsrm /castor/
nsrmdir /castor/

Geneva Cluster

First of all you need to get access rights to the Geneva Cluster...

Then, to set up, you log into lxplus as you normally would. From here you do:


source /atlas/software/dpm-test/

export LD_LIBRARY_PATH=/atlas/software/dpm-test/lib:${LD_LIBRARY_PATH}

voms-proxy-init -voms atlas

Then, finally setup your usual athena area.

Now you are all setup you can use rfio, rfdir, rfcp etc.

Faraday Cluster

You must be on linappserv0 to use the RHUL Faraday Cluster. A trivial example of how to use the cluster would be to write a script in bash and then do:

> qsub
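As a sketch, the script you hand to qsub can be as small as the one below (myjob.sh is a hypothetical name); it is worth running it locally once before submitting:

```shell
# Write a trivial batch script
cat > myjob.sh <<'EOF'
#!/bin/bash
# Payload: report where the job ran
echo "Job ran on host: $(hostname)"
EOF
chmod 755 myjob.sh

./myjob.sh        # local test run; on the cluster: qsub myjob.sh
```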

To check the status of the job you do:

> qstat JobID

Once the job has finished it will place two files in the directory you submitted the job from, for example:

> ls *272538

The .o file contains the standard output from your job and the .e file contains any error log. Below are some useful commands borrowed from Simon's longer version of this page Here:

  • qsub <script> submit a job (described by the script)
    • qsub -N <name> -q long <script> submit script to long queue (default is short) with the specified name (default is the same as the script)
    • qsub -v VARIABLE=value <script> submit job script to be run with environment variable VARIABLE set to value. A comma-separated list is allowed, e.g. -v VAR1=a,VAR2=b,VAR3=x By default your shell environment at the time of running qsub is not replicated when the script is run.
  • qstat list information about queues and jobs.
    • qstat -Q show status of all queues, including no. of jobs queued and running
    • qstat -q show properties of queues (including time limits)
    • qstat list all jobs in standard format
    • qstat -a list all jobs in alternative format
    • qstat -u $USER list only your own jobs
    • qstat <jobid> show status of job with id <jobid> (can specify multiple job ids)
  • showq list jobs in the order that they are scheduled to run (note this is a maui command and does not have a man page)
  • qdel <jobid> delete job
    • To delete all your jobs, see the end of the advanced section below.

See the man pages for a full list of options. The pbs man page gives an overview of all pbs commands.

Note that these commands can sometimes be quite slow to return when the cluster is very busy.

Dataset Storage and Transfer

You may want to permanently store some datasets that you make, as if they are left on the grid they will eventually be deleted. To do this you can use the Data Transfer Request page here:

Note that you have to go through the process of applying for the correct permissions, which I will not go through here, but once you have them it is a simple matter of filling in the fields:

Data Pattern = Name of your Dataset with trailing forward slash.
Cloud Site = Country of Destination, Grid Site (for RHUL I use LOCALGROUPDISK).
Additional E-mails = This will contact you with the status of your request.
Comment = You are required to provide a comment, which is best just to describe the purpose of the dataset.

Next you click "Check" and if the dataset is found then you can proceed to make the transfer. Once you have made the request you will be e-mailed to say it has been received, and again when the transfer has been successful. You can check the status of your data transfer request Here.

Root (General)

Make a Macro from a Displayed Histogram

Save a histogram (or the like) that you have messed around with after plotting as a .C file, or do c1->SaveAs("Example.C") from the terminal.

Then you can see all of the juicy code that has made your plot so pretty!!!

Merge ROOT Files

Setup your area and then simply do:

hadd -f Result.root Rootfile1.root Rootfile2.root ... RootfileN.root

Where Result.root is the ROOT file you want to end up with and the others are the files to be merged! Easy! The -f is to force the creation of the Result.root, overwriting any previous instance.

NB If the files you want to merge are numbered consecutively, or are all similar in some way at least, then all you have to do is use the wildcard option like so (i.e. for the above case): Rootfile*.root

NBB If you have retried any jobs, then the grid will have appended numbers to the end of your root files such as .root.1, .root.2, etc. To counteract this with your wildcard option simply give your command like this:

hadd -f Result.root Rootfile*.root*
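Note that with hadd the expansion goes the other way round from dq2-get: here you do want your shell to expand the glob into the full file list before hadd runs. You can check what a pattern will match with echo first (the file names below are dummies):

```shell
# Dummy files standing in for merged grid output, one of them retried
touch Rootfile1.root Rootfile2.root Rootfile3.root.1

echo Rootfile*.root     # misses the retried file
echo Rootfile*.root*    # catches .root and .root.1 alike
```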

Root (Histograms)

Handling Multiple Histograms

TFile *file1 = TFile::Open("ExampleFile1.root");
TFile *file2 = TFile::Open("ExampleFile2.root");

Then use file1->cd() or file2->cd() to shift focus onto that file, .pwd to see where you are within the file, and .ls to show the contents of the current directory.

i.e. for plotting:
Mee_AOD.SetLineColor(2)

Adding Text to a Histogram

TText *text = new TText(0.0, 0.0, "Write Here");
text->Draw();

NB The first two fields are co-ordinates to place the text on the canvas.

Colours : Black = 1, Red = 2, Green = 3, Blue = 4, Yellow = 5, ...The List Goes On!

Also note the Equation Galore section below for how to add Latex text to your Histograms.

Adding A Legend

leg_hist = new TLegend(0.5,0.6,0.9,0.9);
leg_hist.SetHeader("Some Histograms");
leg_hist.AddEntry(Mee_AOD, "Zprime", "l");
leg_hist.Draw("same");

The "l" Draws a Line of the corresponding colour to the variable Mee_AOD.

Handling the Stats Box

Okay so first off, at the top of your file I always put:


To make sure that the Plot is nicely formatted and coloured!

Then if you like, here is how you set it to different degrees, including off!

gStyle->SetOptStat(0); // (OFF)

Also, you can set the default X&Y Position of the Stats Box using:


Now we go onto the more complicated part, where you can have two histograms on one plot, and want both of the Stats Boxes!!! I will also show how to colour them differently which is useful.

To start, and for completeness, the top of my file looks like this in total:

TCanvas *c1 = new TCanvas("c1", "c1",48,91,700,500);

Note the important bit here is to have the stats box turned on, and to set the default X position, which will make the extra coding shorter.


Draw all of your histograms however you like; after the first Draw(), remember to use Draw("sames") for every subsequent one so that the histograms, including their stats boxes, are drawn on top of each other.

Then all you have to do is:


TPaveStats * p1 = (TPaveStats *)ElectronPt->GetListOfFunctions()->FindObject("stats");

TPaveStats * p2 = (TPaveStats *)Mee_AOD->GetListOfFunctions()->FindObject("stats");

Where ElectronPt & Mee_AOD are just the two histograms I am drawing, and SetY1NDC & SetY2NDC set the start and finish positions in y of the stats box. It will take a little more fiddling to do this if the histograms are from different files, but it is still entirely possible; you will just have to remember to cd() through the files appropriately again when it comes to drawing the stats box.

Equations Galore! (LaTeX)

In a Root Macro, one can create equations anywhere!

TLatex* tl = new TLatex();
tl->DrawLatex(.7, 0.83, "#alpha #beta #gamma_{1}");

Even in your Legends!

leg_hist.AddEntry(sqrtE, "Expected #alpha_{1}", "l");

Root can interpret LaTeX code using # instead of the usual backslash.

(NB Thanks to Glen Cowan for this tip!!!)

MC Histogram Errors

You can simply add Errors to your Histograms with the Draw Option:


But this is just the sqrt of the bin entry; a more statistically meaningful way of doing this is to first use another command before you fill your histogram, something like this:


Data Point (Marker) Style

Setting your data point style couldn't be simpler; on your desired histogram just do:


Where # ranges from 1-30, See Here for specific marker styles.

Remove/Offset Labels & Scales

To completely remove the scale on an axis you can use:

Hist1->GetYaxis()->SetLabelOffset(999);
Hist1->GetYaxis()->SetTickLength(0);

Note that this also works for the X axis, simply using GetXaxis().

To merely Offset an axis label or title (in the case of axis label just use the above but not 999!):


Where the range is between 0 and 1.

Multiple Canvas And/Or Pads

For Multiple Canvas, simply declare more than one:

TCanvas *c1 = new TCanvas("c1", "c1",48,91,700,500);
TCanvas *c2 = new TCanvas("c2", "c2",48,91,700,500);

Use c1->cd() and c2->cd() to move between them.

However on a single canvas, you can still have Multiple Pads, splitting up your original canvas:


For example, c1->Divide(2,2) gives you a 2x2 canvas; simply use c1->cd(#) to navigate between pads, where # ranges in this case from 1-4. Note there can be any number of pads.

Root (Macros)

Macro with User Input Argument

- So inside your Macro, called Plots.C for example, you need to do something a little different than usual. Instead of just putting your code within the overall {..}, you need to put something before the initial open bracket, so that it looks like this:

using namespace std;

void Plots(double GChoice = 0, double CChoice = 0, double L = 0, double N = 0){...}

- Now within your code you can use these variables anywhere as you would usually, and in the command line when running the Macro:

> root -l
> .x Plots.C(800,0.01,1000,10000)

- This will run the macro with:

GChoice = 800
CChoice = 0.01
L = 1000
N = 10000

- Extremely useful!!!

Enter String/Numbers Anywhere in a Macro!

If you want to add text/numbers somewhere in a macro, be it for the naming of a histogram, a file that you want to .cd() to, or simply a variable to print in the text of a histogram label, this function is for you. (Note this works for variables passed from the command line as an argument too!)

In General:

string FirstWord = "Age";
int FirstNumber = 21;

Form("My %s is %d", FirstWord.c_str(), FirstNumber)

(Output: My Age is 21)


leg_hist.AddEntry(Signal, Form("Signal (G #rightarrow e^{+}e^{-} (%.0fGeV) @ k/Mpl=%.2f)",GChoice,CChoice), "l");

If GChoice = 300, and CChoice = 0.01, the Output (or what the Macro will see as it were) will be:

leg_hist.AddEntry(Signal, "Signal (G #rightarrow e^{+}e^{-} (300GeV) @ k/Mpl=0.01)", "l");

NB: For strings the correct place holder is %s

NBB: As pointed out by Ian Connelly, sometimes, perhaps when using input for filenames, you might have to use the form:

TString input = "Hello";
TFile *file = new TFile(Form("%s.root", input.Data()));

Root (D3PD)

If you are doing a ROOT based analysis on ntuples some of these tips may be useful!

MakeClass

To make your basic source and header file from a .root file (enabling you to get started) you can use MakeClass. Say you have a file called Data.root that you want to work on, and that it has a tree called egamma inside it that you want the variables from; in this case you do:

> root Data.root
> .ls (enables you to see what is inside)
> egamma->MakeClass("Analysis")

This will then produce Analysis.cxx and Analysis.h, where Analysis.h has all of the variable names stored in the egamma tree ready for you, and Analysis.cxx has a basic loop structure set up within which you can start to write your analysis code!

Linux / C++

Keeping Active!

Use the & symbol at the end of a command such as;

xemacs Sample.cxx &

To open/run a file and still keep the terminal active.

gmake Quickly

If you have already done a gmake to set up all of your files and you then change something small / modify some code, simply do:

gmake QUICK=1

This only gmake's files you have modified! If you have any problems, just do gmake again to be safe!

Library Dependencies

Through various debugging, I have found that when your code compiles but seems to be complaining about a library, you can easily check the dependencies for, say, run.exe by using:

ldd run.exe

Which displays the libraries run.exe uses/needs, or:

readelf -d run.exe

Which will check all of the dependencies of run.exe and its libraries in turn!

Startup Scripting

In your home directory, look for or create .bashrc (If you are using bash that is), and open it with xemacs.
From here you can set up shortcuts and personal key bindings! i.e.

alias PRINT='lpr -P PrinterName'

On Lxplus you can do the same with .bash_login, which is run at lxplus login (again assuming your login environment is bash).
NB, sh = .login, zsh = .zlogin .

Run Macros

If you want to speed things up, for instance not having to be there when one job finishes to start another, or to dq2 another dataset once the one before it has finished downloading... you can create simple little macros to do the job for you!!

To run multiple jobs just use THIS in place of your usual joboption, replacing the files with the ones you want to run over. To dq2 multiple datasets, set up dq2 and then just use THIS, it really couldn't be simpler.

The only caveat is that for the .sh file, you need to do:

chmod 777

To grant yourself execute privileges, and then run it using ./
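As a sketch of what such a run-macro might look like (the joboption file names here are hypothetical, and the echo stands in for the real command):

```shell
#!/bin/bash
# Hypothetical macro: run over several joboption files in sequence.
# Replace the echo with the real command, e.g. athena.py ${jo} >& ${jo%.py}.log
njobs=0
for jo in JobOptions_A.py JobOptions_B.py JobOptions_C.py; do
    echo "Running ${jo}"
    njobs=$((njobs + 1))
done
echo "Finished ${njobs} jobs"
```

After chmod-ing it as above, you would launch it with ./yourmacro.sh.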

Passing Arguments to C++ Compiled Root Macros

If you compile your Analysis.cxx and Analysis.h with g++ or similar, with a driver file run.cxx maybe so that you compile and execute ./run.exe

You may want to pass arguments at run time (especially useful when passing arguments when submitting to the grid for instance). You can do this, where inside run.cxx you have the main functions like:

int main(int argc, char **argv) { }

argc is the number of arguments passed to your program (including the program name itself), and argv is a vector of these, so that if you do say:

./run.exe 0 1

argc would be 3, and argv would be filled with three values, i.e.:

argv[0] = ./run.exe
argv[1] = 0
argv[2] = 1

So that you can use these arguments inside your code!

Active Terminals (Screen)

If you have ever wanted to be able to open a terminal, set something running, and then close the terminal, turn off your computer etc, but have the action you left running remain alive, then what you need is Screen!!!

All you need do in your current terminal is type:

> screen

Then after setting what ever you want running, to "detach" your screen press:

Ctrl+A, D.

That is to say you press Ctrl+A, then let go and press D. This will then seemingly take you back to where you started before you typed screen. Also note that inside a new screen you have to set up your area etc. as if you had just opened a new terminal. Now to "re-attach" your screen, first you can look at what screens you have running by typing:

> screen -ls

This will list your active screens, which may appear something like 6063.pts-5.lxplus314. To re-attach a screen simply type the command followed by the name of the screen you want to re-attach, i.e. in this case:

> screen -r 6063.pts-5.lxplus314

Finally, to close the screen permanently, while attached to the screen simply type exit (or press Ctrl+A, then K).

That's all there is to it!

For advanced options you can edit .screenrc from your home area (this may be a new file for some). If you add the lines for example:

defscrollback 10000
termcapinfo xterm* ti@:te@

The first line sets the scrollback buffer to 10000 lines; the second restores normal scrollbar behaviour in xterm-like terminals.

Separate from active terminal (without screen) and re-attach in new terminal

1) Start Session.
2) Start Process Running.
3) You realise you should have used "nohup" or "screen".
4) Press Ctrl+Z to suspend the active process.
5) Type "jobs" to see its jobID.
6) Type "bg %jobID && disown -h %jobID" to resume the process in the background, and then disown it from the terminal.
7) Open new terminal.
8) Use "top" to find the processID of your job.
9) Type "cat /proc/processID/fd/1" to view the active output of your job. Note that you will not be able to interact with the job, only view its continual output and result.

NB when you send a job to background using "bg", you can always bring it back to foreground using "fg", but when you disown, there is no way to reown, or adopt.
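A non-interactive sketch of steps 4-6 above (using sleep as a stand-in for your real job):

```shell
# Start a stand-in long-running job in the background.
sleep 60 &
pid=$!
# Disown it so it survives the terminal closing (equivalent to step 6).
disown -h "$pid"
echo "disowned process ${pid}"
# Clean up the demo job.
kill "$pid"
```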

Editing Multiple Files Recursively

There are two parts to this. The first is editing the contents of multiple files with a common search-and-replace command. The second is editing the names of files in bulk!

find . -name "*.py" -print | xargs sed -i 's/ee/mumu/g'

Will list all files with *.py in the name, and do an in-place replace of every ee with mumu.

rename ee mumu *ee*

Will rename all files with ee in the name to use mumu instead.
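A quick demonstration of the search-and-replace part (the file name and contents are invented):

```shell
# Make a scratch file containing "ee" in a demo directory.
mkdir -p sed_demo
echo "select ee events" > sed_demo/cuts.py
# In-place replace every ee with mumu across all .py files found.
find sed_demo -name "*.py" -print | xargs sed -i 's/ee/mumu/g'
cat sed_demo/cuts.py    # now reads: select mumu events
```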

Latex Commands

latex sample.tex (outputs .dvi)
dvipdf sample.dvi (outputs pdf)

However! You can also use (depending on what image types you are using);

pdflatex sample.tex (outputs pdf)

Advanced Copying Files!

Go to wherever the file you want to copy is....

Then do;

scp example.C username@server:/path/to/destination/

where username@server specifies the user and machine you want to send your file to,
and the information that comes after the : is the path you want to send the file to! Easy!

Note you may be asked for your password on the destination server; this is usual.

Also! If you want to copy a folder and all of its contents just add the -r option, i.e. scp -r .....

Deleting files, and folders!

rm example.C (Removes a file)
rmdir Examples (Removes an empty directory)
rm -fr TestArea (Removes the directory TestArea, and any files/subfolders contained within it)

Be Careful!!! You are not asked if you are sure you want to delete this file like Windows does!

Check Disk Quota

Wherever you are;

fs lq

To find out your Disk Quota!

At RHUL you can do:

df -h

To find out the space on all of the scratch spaces etc.

Printing from the Terminal?

lpr -P printername Example.pdf (Where printername is the name of your desired printer!!!)
lpr -P printername -o Sides=two-sided-long-edge Example.pdf (Same as before except here it forces the printer (if possible) to print on both sides!)

lpq -P printername (Shows you the current printer Queue!)
lprm jobno (Cancels a job! Where jobno is the JobNumber which can be found from the printer Queue)

Cut columns from a file

cat input.txt | cut -c5- > output.txt (strips the first 4 characters from every line of input.txt and writes the result to output.txt)
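For example (the file names here are invented for the demo):

```shell
# Create a file whose lines start with a 4-character prefix.
printf 'ID: alpha\nID: beta\n' > cols_in.txt
# Keep only character 5 onwards of each line.
cat cols_in.txt | cut -c5- > cols_out.txt
cat cols_out.txt    # prints: alpha, then beta
```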

Twiki / HTML

Stop WikiWord Links being made for non-WikiWords!

Simply put an exclamation mark like this ! directly before the falsely designated WikiWord in question!

LaTeX on Twiki Pages

Start with %BEGINLATEX{label="one" color="Green"}% and end with %ENDLATEX%. Then in between put what you like in LaTeX format, for example;

\begin{displaymath} \frac{-ik(\xi x + \eta y)}{z} = -2i\pi(\frac{mp}{M} + \frac{nq}{N}) \end{displaymath}

The result will be rendered on the page as the corresponding display equation, numbered (1).

Images in Twiki

See here for a list of possible image graphics you can simply add to your page.

Excluding Titles from Contents

After the title identifier "---++" place two exclamation marks so it looks like "---++!!". This will then remove that title from the contents list.

Skype Button

See here for how to simply add a Skype Status button to your page, all you do is take the code snippet and paste it in your page code. (NB some combinations such as Large + Button, do not work, i.e. it will always appear offline so check your status!).

Background Colour or Image

Simply use the tags

<div style="background-color:#1E90FF">

or

<div style="background-image:url('');">

Other Tips

Making Feynman Diagrams

For this I use a program called Jaxodraw. Follow the instructions ahead carefully as this can become complicated if you go off the beaten track.

The files can be found on the main JaxoDraw site, under the download section.

These instructions are for Linux systems with a Java Environment Installed.

Download the Binary Jaxo Draw, and unpack using:

tar -xzf jaxodraw-xxx_bin.tar.gz (where xxx is the version of JaxoDraw you just downloaded)

In the SAME directory as the one made from unpacking, i.e. /JaxoDraw-xxx/, download and unpack axodraw4j.

Then you can simply run jaxodraw from the main directory using:

java -jar jaxodraw-xxx.jar

You can now include LaTeX code in your Feynman diagrams, and save your diagrams using the Export function.

Happy Feynman Doodling!



Questions & Comments?

Please feel free to post questions & comments below, which I will endeavour to answer and add to the page.



Topic revision: r107 - 09 Apr 2013 - TomCraneAdmin
