To use this code:

1. Make a build directory and compile the code
```
$ mkdir build
$ cd build
$ ../autogen.sh [install_lib_location]
$ make -j4 install
$ cd ..
```
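Whatever `[install_lib_location]` you choose, the built module library has to be visible on `LD_LIBRARY_PATH` when the macros run. A minimal sketch, assuming `$HOME/install` was the location passed to `autogen.sh` (substitute your own path):

```shell
# Assumption: $HOME/install was the [install_lib_location] given to autogen.sh.
MYINSTALL=$HOME/install
mkdir -p "$MYINSTALL/lib"                # in a real build, 'make install' creates this
export LD_LIBRARY_PATH=$MYINSTALL/lib:$LD_LIBRARY_PATH
# The install's lib directory should now be searched first:
echo "$LD_LIBRARY_PATH" | cut -d: -f1
```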
Steps 2-4 make the JetReco tree
2. Get the input files:
```
$ cd macro
$ mkdir full_lists
$ cd full_lists
$ CreateFileList.pl -type 11 -embed -n 100000000 DST_BBC_G4HIT DST_CALO_CLUSTER DST_TRUTH_JET DST_TRUTH_G4HIT DST_GLOBAL
$ cd ..
```
3. Make an output directory to test the files, and modify the running scripts
```
$ chmod a+x make_inp_lists.py test.sh condor.sh
[modify condor.sh to work with your file install location; condor.sh is called by test.sh and condor_r07.job]
[modify test.sh, probably to run just a few events]
$ ./make_inp_lists.py 2 1
$ cd jobs_2x1
$ ../test.sh
[inspect the generated *.root test file]
$ cd ..
```
4. When you are happy with step 3, make a directory to run over all the input file lists. If, for instance, you want jobs
that each run 50 input files, with as many jobs as it takes to process them all, then:
```
$ ./make_inp_lists.py 50 -1
[or, in this case, just `$ ./make_inp_lists.py 50` works fine, too]
$ cd jobs_50xAll
$ condor_submit ../condor_r04.job
```
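The two arguments to `make_inp_lists.py` are files-per-job and number-of-jobs (`-1`, or omitting it, meaning "as many jobs as needed"). Conceptually this is just chunking the combined file list; a rough stand-in using coreutils `split` (illustration only, not the actual script, which also creates the per-job directory):

```shell
# Stand-in demo: 120 fake input files chunked 50 per job, as in step 4.
seq 120 | sed 's|^|/fake/path/DST_CALO_CLUSTER_|' > all_inputs.txt
split -l 50 -d all_inputs.txt job_list_   # -> job_list_00, job_list_01, job_list_02
ls job_list_* | wc -l                     # -> 3 jobs cover all 120 files
```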
5. hadd together the output and process it
```
$ cd plot
$ hadd -r -k hadd_jobs_50xAll.root ../jobs_50xAll/*.root
$ ln -s hadd_jobs_50xAll.root hadd_Sub1rhoA.root
$ root -l doJES_Sub1rhoA.C
[your output file doJES_Sub1rhoA.root contains the desired output data]
```
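Before merging, it can save time to check for empty job outputs: a failed condor job typically leaves a zero-byte `.root` file, which `hadd -k` should then skip silently. A small sketch, using the `jobs_50xAll` directory from step 4:

```shell
mkdir -p jobs_50xAll                 # already exists after a real condor run
# Count zero-byte .root files -- each one is a job worth resubmitting.
bad=$(find jobs_50xAll -name '*.root' -size 0 | wc -l)
echo "$bad empty output files"
```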
Steps 6-9 make the rho fluctuation tree
6. Get the files for the HIJING run
```
$ mkdir rhofluct_lists
$ cd rhofluct_lists
$ CreateFileList.pl -type 4 -n 1000000 DST_BBC_G4HIT DST_CALO_CLUSTER DST_GLOBAL
$ cd ..
```
7. Make a small test run:
```
$ ./make_inp_lists.py 1 1 rhofluct_lists
$ chmod a+x rhofluct_test.sh rhofluct_condor.sh Fun4All_RhoFluct.C
[modify rhofluct_test.sh, rhofluct_condor.sh, and Fun4All_RhoFluct.C to use
your proper install directory; in the test script, probably run only a few events]
$ cd rhofluct_1x1
$ ../rhofluct_test.sh
[if the test runs well, also test condor on this small sample
before running a full set]
$ condor_submit ../condor_rhofluct.job
```
8. After the tests in step 7 work well, run the full statistics
```
$ ./make_inp_lists.py 250 -1 rhofluct_lists
$ cd rhofluct_250xAll
$ condor_submit ../condor_rhofluct.job
```
9. hadd together the output and process it
```
$ cd plot
$ hadd -r -k hadd_rhofluct_250xAll.root ../rhofluct_250xAll/*.root
$ ln -s hadd_rhofluct_250xAll.root hadd_RhoFluct.root
$ root -l dodecile_RhoFluct.C
[your output file dodecile_RhoFluct.root contains the desired data]
```