########################################## Herwig to HepMC File Generator ###########################################################

Author: Skaydi
Last Update: 13 January 2026


This is a stopgap solution that allows HepMC files to be generated from a Herwig input.
It is a temporary fix until "PHHerwig" can be properly implemented.

The workflow: generate a .run file, then run the makeHerwigJobs.sh script with the parameters relevant to the run (run it with the help option first for an explanation of its behavior).
This generates a number of events broken into a set number of files with an equal number of events per file; jet triggers can optionally be applied.
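As a rough sketch of the bookkeeping (using the defaults stated in the options below, not output from the script itself), the total event count is split evenly across files, so the number of files is just the ratio, rounded up:

```shell
# Sketch of the event/file split, using the documented defaults
# (1M events total, 1k events per file); illustrative, not taken from the script.
TOTAL_EVENTS=1000000     # -N / --events default
EVENTS_PER_FILE=1000     # -n / --perfile default
# Ceiling division: number of output files (one condor job each).
NUM_FILES=$(( (TOTAL_EVENTS + EVENTS_PER_FILE - 1) / EVENTS_PER_FILE ))
echo "files to produce: ${NUM_FILES}"
# → files to produce: 1000
```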
# Directories

The directories are broken down as follows.

## config_files

This is where the Herwig input (.in) and run (.run) files live.

After adding a new ".in" file, simply run:

        /cvmfs/sphenix.sdcc.bnl.gov/gcc-12.1.0/opt/sphenix/core/Herwig/bin/Herwig read [.in file name]
0020 
## HepMCTrigger

This holds the jet trigger that is applied to the HepMC output; other triggers will eventually be added.
It is based on anti-kt R=0.4 jets from FastJet.

## HerwigHepMCFilter

A specific instance of running the filter that applies a trigger and outputs a set number of events per file.

## HepMC_scripts

This is where the main usage is.

"condor_blank.job" should be updated for the user.
0037 
"condor_file_dir" is where the condor files go.
0039 
"Herwig_run.sh" is what the condor job calls; it runs events with a random seed and optionally applies a trigger.
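A minimal sketch of the seed-per-job idea Herwig_run.sh relies on. The file name and flags below are illustrative assumptions (Herwig 7's run mode takes `-N` for the event count and `-s` for the random seed), and the command is echoed rather than executed, since Herwig is not assumed to be installed:

```shell
# Illustrative only: how a per-job random seed could be passed to Herwig.
# "example.run" is a placeholder; -N (event count) and -s (seed) follow
# Herwig 7's run-mode options. The command is echoed, not executed.
SEED=$(( RANDOM % 90000 + 10000 ))   # 5-digit seed, different for each job
echo Herwig run config_files/example.run -N 1000 -s "$SEED"
```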
0041 
"makeHerwigJobs.sh" is the script used to set the parameters of the run.
This script runs Herwig to create HepMC files given an input configuration.
Options:

 -h, --help             Display this help message
 -v, --verbose          Enable verbose job creation (Default false)
 -N, --events           Number of events to generate (Default 1M)
 -n, --perfile          Number of events per file (Default 1k)
 -s, --submit           Make and submit condor jobs (Default false)
 -t, --trigger          Input type (MB, Jet10, Jet20, Jet30, PhotonJet5, PhotonJet10, PhotonJet20) (Default MB)
 -j, --jetcut           Add a jet cut filter [integer GeV] (Default None)
 -p, --photoncut        Add a photon cut filter [integer GeV] (Default None)
 -i, --input            Specify a new input file (Default blank)


####################################### USAGE ##########################################

Using and understanding this package is relatively straightforward.

# -----------------STEPS----------------------

## Step 1

Activate the local LHAPATH (needed for the Nashville tune):

        export LHAPATH=$(pwd)/HerwigToHepMCProduction/config_files:$LHAPATH
        export LHAPDF_DATA_PATH=$(pwd)/config_files:$LHAPDF_DATA_PATH

## Step 2

Run the script:

        cd HepMC_scripts
        ./makeHerwigJobs.sh [options listed above]

## Step 3

Interpreting the cross section: the cross sections are output in a text file,

        "Cross_Section_[TAG].txt"

### A note on cross sections

In order to keep a fixed 1000 good events per DST for reconstruction, the filter requires generating far more events than are kept.
The cross section is calculated by taking the output cross section from the Herwig generator and scaling it:

        XS/N = XS_HERWIG / N_Generated * 1000 / N_passing_filter
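A worked instance of the formula above, with made-up numbers (the cross section, event count, and filter yield below are assumptions for illustration, not real generator output):

```shell
# Worked example of the cross-section scaling with made-up numbers:
# suppose Herwig reports a cross section of 2.5e-3 (in Herwig's units)
# after generating 50000 events, of which 1250 pass the filter.
awk 'BEGIN {
    xs_herwig = 2.5e-3   # generator cross section (assumed value)
    n_gen     = 50000    # total events generated (assumed value)
    n_pass    = 1250     # events passing the filter (assumed value)
    printf "XS/N = %g\n", xs_herwig / n_gen * 1000 / n_pass
}'
# → XS/N = 4e-08
```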