Simple Job Submission
Overview
Breaks up the dataset and submits it across multiple cores for faster processing.
The implementation is currently somewhat hacky, but it helps when fitting is slow, for example when extracting time constants from large datasets (e.g. the LOLX undershoot).
Includes a shell script to launch multiple jobs and a ROOT macro to merge the resulting ROOT files.
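The launcher script itself is not reproduced here; as a rough illustration of the fan-out pattern it implements, a minimal sketch might look like the following. The file names, output naming scheme, and the echo placeholder are all hypothetical and do not reflect the actual contents of scripts/vanwftk.sh.

```shell
#!/bin/sh
# Hypothetical fan-out launcher sketch; the real scripts/vanwftk.sh may differ.
# Usage: ./launch.sh <config.json> <njobs>
CONFIG="${1:-experiments/example.json}"
NJOBS="${2:-4}"

i=0
while [ "$i" -lt "$NJOBS" ]; do
    # Each background job would process its own slice of the dataset,
    # tagged by the job index, and write an index-suffixed output file.
    echo "job $i: processing $CONFIG -> part_$i.root" &
    i=$((i + 1))
done
wait   # block until every background job has finished
```

The key idea is simply that each job is identified by an index so its output file can be found again at the merge step.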
Tested
So far this has been tested on the example.json dataset; the merged output looks the same as a single-job run.
Note: replace XX with the desired number of jobs.
./scripts/vanwftk.sh experiments/example.json XX
root -l "macros/MergeTrees.C(\"ntp/TEST_FILE_DIGITIZER_SINGLE_CHANNEL/run00321.mid.gz.TTree.root\",XX)"
DataOutTree->Draw("ch0.pulse.fit.amp>>h(400,-2e6,0)")
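If the merge step needs to be adapted, ROOT's TChain can stitch per-job trees back into one file. Below is a minimal sketch only, assuming index-suffixed per-job files and the DataOutTree name used in the Draw command above; the actual macros/MergeTrees.C may work differently, and the naming scheme here is hypothetical.

```cpp
#include "TChain.h"
#include "TString.h"

// Hypothetical merge sketch; the real macros/MergeTrees.C may differ.
// base:  common file-name stem of the per-job output files (hypothetical)
// njobs: number of job files to merge
void MergeSketch(const char* base, int njobs) {
    TChain chain("DataOutTree");                 // tree name from the Draw command above
    for (int i = 0; i < njobs; ++i)
        chain.Add(Form("%s_%d.root", base, i));  // add each per-job file to the chain
    chain.Merge(Form("%s_merged.root", base));   // write one combined output file
}
```

Run it from the ROOT prompt in the usual way, e.g. root -l 'MergeSketch.C("ntp/run00321", 4)'.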
Edited by Iain