This commit updates submit-grid-jobs so that it keeps a database of jobs. This
allows the script to make sure that we only have a certain number of jobs in
the job queue at any one time and to automatically resubmit failed jobs. The
idea is that it can now be run once to add jobs to the database:
$ submit-grid-jobs ~/zdabs/SNOCR_0000010000_000_p4_reduced.xzdab.gz
and then be run periodically via crontab:
PATH=/usr/bin:$HOME/local/bin
SDDM_DATA=$HOME/sddm/src
DQXX_DIR=$HOME/dqxx
0 * * * * submit-grid-jobs --auto --logfile ~/submit.log
Similarly I updated cat-grid-jobs so that it uses the same database and can
also be run via a cron job:
PATH=/usr/bin:$HOME/local/bin
SDDM_DATA=$HOME/sddm/src
DQXX_DIR=$HOME/dqxx
0 * * * * cat-grid-jobs --logfile cat.log --output-dir $HOME/fit_results
I also updated fit so that it keeps track of the total time elapsed including
the initial fits instead of just counting the final fits.
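
For illustration, the job database described above could look something like
the sketch below. This is only a sketch under assumptions: the table layout,
the cap of 100 queued jobs, and the condor_submit call are not necessarily
what submit-grid-jobs actually does.

    # Rough sketch of a job database used to cap the number of queued jobs and
    # to resubmit failed ones. The schema, the job states, and the condor_submit
    # usage are assumptions for illustration only.
    import sqlite3
    import subprocess

    MAX_QUEUED = 100  # assumed cap on jobs in the queue at one time

    def open_db(path):
        conn = sqlite3.connect(path)
        conn.execute("""CREATE TABLE IF NOT EXISTS jobs (
                            id INTEGER PRIMARY KEY,
                            submit_file TEXT,
                            state TEXT DEFAULT 'new',  -- new, queued, done, failed
                            retries INTEGER DEFAULT 0)""")
        return conn

    def submit_pending(conn):
        # Top the queue back up with new jobs and with failed jobs to resubmit.
        queued = conn.execute(
            "SELECT COUNT(*) FROM jobs WHERE state = 'queued'").fetchone()[0]
        rows = conn.execute(
            "SELECT id, submit_file FROM jobs WHERE state IN ('new', 'failed') LIMIT ?",
            (max(MAX_QUEUED - queued, 0),)).fetchall()
        for job_id, submit_file in rows:
            subprocess.check_call(["condor_submit", submit_file])
            conn.execute(
                "UPDATE jobs SET state = 'queued', retries = retries + 1 WHERE id = ?",
                (job_id,))
        conn.commit()
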
This commit fixes a small bug in cat-grid-jobs which caused it to print the
wrong filename when there was no git_sha1 attribute in the HDF5 file.
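
For reference, reading an attribute that may never have been written usually
looks something like the following; the use of h5py and the fallback value are
assumptions, and the actual fix in cat-grid-jobs may differ.

    # Sketch of defensively reading the git_sha1 attribute from an HDF5 file;
    # h5py and the fallback value here are assumptions for illustration only.
    import h5py

    def get_git_sha1(filename):
        with h5py.File(filename, "r") as f:
            # attrs.get() avoids a KeyError when the attribute was never written
            return f.attrs.get("git_sha1", b"unknown")
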
This commit updates cat-grid-jobs to only warn once about a mismatch between
the SHA1 of the current zdab-cat program and the grid results, and also cleans
up some of the output.
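
The warn-once behaviour amounts to remembering that the message has already
been printed, along the lines of the sketch below; the names and the message
text are illustrative, not cat-grid-jobs' actual code.

    # Sketch of warning about the SHA1 mismatch only once instead of per file;
    # the function and variable names are hypothetical.
    import warnings

    _warned_sha1_mismatch = False

    def check_sha1(current_sha1, results_sha1):
        global _warned_sha1_mismatch
        if current_sha1 != results_sha1 and not _warned_sha1_mismatch:
            warnings.warn("SHA1 of zdab-cat (%s) does not match grid results (%s)"
                          % (current_sha1, results_sha1))
            _warned_sha1_mismatch = True
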
This commit updates the cat-grid-jobs script to call zdab-cat on the zdab file
first to get the data cleaning words and SNOMAN fitter results for every single
event (regardless of whether it has more than 100 nhit, for example), and then
add the fit results from the grid jobs' output.
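
Conceptually the merge looks something like the sketch below. The (run, gtid)
key and the dictionary layout are assumptions; the point is just that every
event from zdab-cat is kept, and a grid fit result is attached where one
exists.

    # Sketch of combining the per-event zdab-cat output (data cleaning words and
    # SNOMAN fitter results for every event) with the fit results produced by
    # the grid jobs. The data layout and (run, gtid) key are assumptions.
    def merge_results(zdab_cat_events, grid_fit_results):
        merged = {}
        for key, event in zdab_cat_events.items():   # key = (run, gtid), assumed
            merged[key] = dict(event)
        for key, fit in grid_fit_results.items():
            if key in merged:
                merged[key]["fit"] = fit             # attach the grid fit result
        return merged
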