# Setup
Start here for setting up the environment: [Example_of_using_DST_nodes (Wiki)](https://wiki.bnl.gov/sPHENIX/index.php/Example_of_using_DST_nodes#Building%20a%20package)
- Set up local compilation for the bash shell:

```
source /opt/sphenix/core/bin/sphenix_setup.sh -n
export MYINSTALL=/sphenix/user/shulga/tpc2019_install
source /opt/sphenix/core/bin/setup_local.sh $MYINSTALL
```
**Do not forget to change the line** ```export MYINSTALL=/sphenix/user/shulga/tpc2019_install``` **to point to your own install directory.**

<!---
- Creating package files:
```
CreateSubsysRecoModule.pl --all --overwrite fillSpaceChargeMaps
```
(*very useful script providing all files needed for a new package*; `be careful with options`)
--->

- Compilation of the package:
```
mkdir build
cd build
../autogen.sh --prefix=$MYINSTALL
make -j 4
make install
```
- Reading the first file:
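A minimal sketch of a first test, reusing the run script described in the Workflow section below; this assumes the two arguments are the first and last file number, as in the example further down:
```
# Hedged example: process only the first G4Hits file
source macros/run_files_300evts_AA_MDC2.sh 0 1
```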
# Workflow:
- Files with G4Hits from Hijing events used for the analysis:
    - G4Hits_sHijing_0_20fm-0000000002-*

- File with bunch-crossing id and time (ns), assuming 106 ns between bunches and a 50 kHz collision rate: __timestamps_50kHz.txt__. The file is used to mimic the bunch crossings (at 50 kHz this corresponds to roughly one collision per ~190 bunch crossings);

- Running over the G4Hits containers in the files is performed within the Fun4All environment. The main code is Fun4All_FillChargesMap_300evts*.C; it is run with run_files.sh, which takes the first and last file number as input:
```
#This will run the first 5 files with G4Hits (100 events per file in the MDC2) and create 5 files
#with histograms:
source macros/run_files_300evts_AA_MDC2.sh 0 5
```

- As soon as the output files are available, they contain the histograms;
- To create the bunch of bash files and condor job files needed to start the condor jobs, scripts are available:
```
#Creating folders to store all the files:

mkdir Out
mkdir Files
mkdir condor_macros

#Creating 1000s of job files to run over G4Hits:
scripts/generate_run_files_300evts_AA_MDC2.py
```
**Do not forget to change the paths to point to your own repositories:**

```export MYINSTALL=/sphenix/user/shulga/tpc2019_install```

```/sphenix/user/shulga/Work/...```

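A quick, hedged way to locate every file that still references the original author's paths (this assumes the relevant files live under `scripts/`, `macros/`, and the freshly created `condor_macros/`):
```
# Locate hard-coded paths that need to be updated before submitting jobs
grep -rn "shulga" scripts/ macros/ condor_macros/
```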
- The scripts above will also generate bash files to submit all the jobs; *_all bash scripts created above should be given executable rights before they are run_* (one way to do this is shown below):
```
../run_all_jobs*
```

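A hedged way to grant those executable rights; the exact locations depend on where the generator put the scripts, and the `.sh` extension for the job scripts in `condor_macros/` is an assumption:
```
# Make the generated submission and condor job scripts executable
# (paths and the .sh extension are assumptions; adjust to your layout)
chmod +x ../run_all_jobs* condor_macros/*.sh
```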
# Adding histograms from all files:
Each file contains histograms for 100 events, while a full map corresponds to ~10000 events; thus, the maps have to be integrated.
To keep the files small, the Sumw2 arrays/matrices of the histograms should not be stored.
The tool that provides this functionality:
```
add_histos.py
```
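For comparison, ROOT's standard `hadd` utility would also merge the per-job files, but it keeps the Sumw2 arrays, which is exactly what `add_histos.py` avoids in order to keep the merged output small. A hedged example (the `Files/*.root` pattern and the output name `full_map.root` are assumptions for illustration):
```
# Merge all per-job outputs into a single file (note: hadd keeps Sumw2,
# unlike add_histos.py); file names here are illustrative assumptions
hadd -f full_map.root Files/*.root
```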