This commit adds an sddm python package to the utils/ folder. This allows me to
consolidate code used across all the various scripts. This package is now
installed by default to /home/tlatorre/local/lib/python2.7/site-packages so you
should add the following to your .bashrc file:
export PYTHONPATH=$HOME/local/lib/python2.7/site-packages/:$PYTHONPATH
before using the scripts installed to ~/local/bin.
|
This commit updates submit-grid-jobs so that jobs are now submitted first in
order of priority and then by timestamp. Any jobs with a higher priority are
submitted first, and within each priority level jobs are submitted in the
order they were added to the database.
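The ordering amounts to a single query against the job database; a sqlite sketch (the table and column names here are assumptions, not the script's actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE state (id INTEGER PRIMARY KEY, priority INTEGER, timestamp REAL, state TEXT)")
conn.executemany(
    "INSERT INTO state (id, priority, timestamp, state) VALUES (?, ?, ?, ?)",
    [(1, 0, 100.0, 'NEW'),    # low priority, added first
     (2, 1, 200.0, 'NEW'),    # high priority, added last
     (3, 1, 150.0, 'NEW')])   # high priority, added earlier
# higher priority first; within a priority level, earlier timestamp first
jobs = [row[0] for row in conn.execute(
    "SELECT id FROM state WHERE state = 'NEW' ORDER BY priority DESC, timestamp ASC")]
# jobs == [3, 2, 1]
```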
|
|
This commit updates submit-grid-jobs to look for jobs in the RETRY state and to
retry them if the number of retries is less than --max-retries. This way if you
ever want to retry a bunch of jobs you can update the database:
$ sqlite3 ~/state.db
sqlite> update state set state = 'RETRY' where state == 'FAILED';
And then rerun submit-grid-jobs with more retries:
$ submit-grid-jobs --max-retries 10 --auto
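The retry pickup described above boils down to one query; a sketch with an assumed schema (the retry-count column name is made up for illustration):

```python
import sqlite3

MAX_RETRIES = 2  # would come from the --max-retries argument

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE state (id INTEGER PRIMARY KEY, state TEXT, nretry INTEGER)")
conn.executemany("INSERT INTO state VALUES (?, ?, ?)",
                 [(1, 'RETRY', 0),    # picked up again
                  (2, 'RETRY', 5),    # already over the retry limit
                  (3, 'FAILED', 0)])  # stays failed until manually reset to RETRY
rows = conn.execute("SELECT id FROM state WHERE state = 'RETRY' AND nretry < ?",
                    (MAX_RETRIES,)).fetchall()
# rows == [(1,)]
```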
|
multi-particle fits
|
more than one fit per particle combo
|
This commit adds four scripts:
1. calculate-atmospheric-oscillations
This script uses an independent python package called nucraft to calculate the
neutrino oscillation probabilities for atmospheric neutrinos. These
probabilities are calculated as a function of energy and cosine of the zenith
angle and stored in text files.
2. apply-atmospheric-oscillations
This script combines the high energy 2D atmospheric neutrino flux from Barr and
the low energy 1D flux from Battistoni and then applies neutrino oscillations
to them. The results are then stored in new flux files that can be used with a
modified version of GENIE to simulate the oscillated atmospheric neutrino flux
from 10 MeV to 10 GeV.
3. plot-atmospheric-fluxes
This is a simple script to plot the atmospheric flux files produced by
apply-atmospheric-oscillations.
4. plot-atmospheric-oscillations
This is a simple script to plot the 2D neutrino oscillation probabilities.
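The core of apply-atmospheric-oscillations is a weighted sum of fluxes and oscillation probabilities on a common (energy, cos zenith) grid. A toy numpy sketch (the real script reads the Barr/Battistoni flux files and the nucraft output; all numbers here are made up):

```python
import numpy as np

# toy flux and oscillation-probability grids in (energy, cos(zenith))
phi_e, phi_mu = np.ones((3, 4)), 2*np.ones((3, 4))
p_ee  = np.full((3, 4), 0.6)   # nu_e survival probability
p_mue = np.full((3, 4), 0.2)   # nu_mu -> nu_e transition probability

# oscillated nu_e flux at the detector
phi_e_osc = p_ee*phi_e + p_mue*phi_mu
```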
|
This commit adds the --save command line argument to plot-energy to save either
the corner plots or the energy distribution plots. It also updates the code to
make plots similar to plot-fit-results.
In addition there are a bunch of other small changes:
- plot the theoretical Michel spectrum for free muons
- energy plots now assume at most 2 particles are fit for each event
- display particle IDs as letters instead of numbers, i.e. 2022 -> eu
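The id -> letter translation can be done by splitting the combined particle ID into two-digit codes; a sketch where the 20 -> e, 22 -> u mapping is inferred from the 2022 -> eu example and not taken from the actual code:

```python
PARTICLE_LETTER = {20: 'e', 22: 'u'}  # mapping inferred from the 2022 -> eu example

def combo_name(particle_id):
    """Translate a combined particle ID like 2022 into letters like 'eu'."""
    s = str(particle_id)
    return ''.join(PARTICLE_LETTER[int(s[i:i+2])] for i in range(0, len(s), 2))
```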
|
the database
Also add a -r or --reprocess command line option to reprocess runs which are
already in the database.
|
|
results
|
This commit updates the plot-energy script to plot the energy bias and
resolution for stopping muons by computing the expected kinetic energy from
the muon range.
To select stopping muons we simply look for the muons before a Michel event.
The muon distance is calculated by first projecting the muon fit back to the
PSUP along the fitted direction and then taking the distance between this point
and the fitted position of the Michel event. I then calculate the expected
kinetic energy of the muon by using the muon CSDA range lookup tables to
convert the distance to an energy.
I also changed a few other things, like changing as_index=False ->
group_keys=False when grouping events. The reason is just that this way we
don't have to reset the index and drop the new index after calling apply().
I also fixed a small bug I had introduced recently where I selected only prompt
events before finding the michel events and atmospheric events.
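The backwards projection to the PSUP is just a ray-sphere intersection; a sketch (the PSUP radius here is an assumed placeholder, not the value the script uses):

```python
import numpy as np

R_PSUP = 840.0  # assumed PSUP radius for illustration

def project_to_psup(pos, dir):
    """Project the fitted position back along the fitted direction to the PSUP sphere."""
    dir = np.asarray(dir, dtype=float)
    dir /= np.linalg.norm(dir)
    b = np.dot(pos, dir)
    # positive root of |pos - t*dir|^2 = R_PSUP^2 (backwards projection)
    t = b + np.sqrt(b**2 - np.dot(pos, pos) + R_PSUP**2)
    return pos - t*dir
```

The muon distance is then the norm of `project_to_psup(mu_pos, mu_dir) - michel_pos`, and the expected kinetic energy an interpolation into the CSDA range table, e.g. `np.interp(distance, csda_range, csda_ke)`.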
|
|
This commit updates the dc script to treat all 4 high level variables as
correlated for muons when calculating the instrumental contamination.
Previously I had assumed that the reconstructed radius was independent from
udotr, z, and psi, but based on the corner plots it seems like the radius is
strongly correlated with udotr.
I also updated the plotting code when using the save command line argument to
be similar to plot-fit-results.
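The kind of correlation visible in the corner plots can be quantified directly; a toy check (made-up numbers, not detector data):

```python
import numpy as np

rng = np.random.RandomState(0)
udotr = rng.uniform(-1.0, 1.0, 10000)
# toy model: reconstructed radius depends linearly on udotr plus noise
r = 600.0 + 100.0*udotr + rng.normal(0.0, 20.0, 10000)
rho = np.corrcoef(r, udotr)[0, 1]
# rho is large, so r and udotr should not be treated as independent
```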
|
|
This commit adds a script to calculate the background contamination using a
method inspired by the bifurcated analysis used in SNO. The method works by
looking at the distribution of several high level variables (radius, udotr,
psi, and reconstructed z position) for events tagged by the different data
cleaning cuts, and assuming that any background events which sneak past the
data cleaning cuts will have a similar distribution (for certain backgrounds
this is assumed, and for others I will actually test this assumption; for more
details see the unidoc). Then, by looking at the distribution of these high
level variables for all the untagged events, we can use a maximum likelihood
fit to determine the residual contamination.
There are also a few other updates to the plot-energy script:
- add a --dc command line argument to plot corner plots for the high level
variables used in the contamination analysis
- add a fudge factor of 100 per extra particle to the Ockham factor
- fix a bug by correctly setting the final kinetic energy to the sum of the
individual kinetic energies instead of just the first particle
- fix calculation of prompt events by applying at the run level
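The maximum likelihood step can be sketched with a toy one-dimensional mixture: fit the contamination fraction by maximizing the likelihood of the untagged sample under signal + background shapes (all shapes and numbers here are made up for illustration):

```python
import numpy as np

def norm_pdf(x, mu):
    return np.exp(-(x - mu)**2/2)/np.sqrt(2*np.pi)

rng = np.random.RandomState(0)
# toy untagged sample: 90% "signal" centered at 0, 10% contamination at 3
data = np.concatenate([rng.normal(0, 1, 900), rng.normal(3, 1, 100)])

# scan the contamination fraction and maximize the log likelihood
fracs = np.linspace(0.001, 0.5, 500)
logl = [np.sum(np.log(f*norm_pdf(data, 3) + (1 - f)*norm_pdf(data, 0)))
        for f in fracs]
best = fracs[np.argmax(logl)]   # close to the true 10%
```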
|
|
This commit updates submit-grid-jobs so that it keeps a database of jobs. This
allows the script to make sure that we only have a certain number of jobs in
the job queue at a single time and to automatically resubmit failed jobs. The
idea is that it can now be run once to add jobs to the database:
$ submit-grid-jobs ~/zdabs/SNOCR_0000010000_000_p4_reduced.xzdab.gz
and then be run periodically via crontab:
PATH=/usr/bin:$HOME/local/bin
SDDM_DATA=$HOME/sddm/src
DQXX_DIR=$HOME/dqxx
0 * * * * submit-grid-jobs --auto --logfile ~/submit.log
Similarly I updated cat-grid-jobs so that it uses the same database and can
also be run via a cron job:
PATH=/usr/bin:$HOME/local/bin
SDDM_DATA=$HOME/sddm/src
DQXX_DIR=$HOME/dqxx
0 * * * * cat-grid-jobs --logfile cat.log --output-dir $HOME/fit_results
I also updated fit so that it keeps track of the total time elapsed including
the initial fits instead of just counting the final fits.
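The queue throttling amounts to counting how many jobs are currently queued and submitting only the difference; a sqlite sketch with an assumed schema and state names:

```python
import sqlite3

MAX_QUEUED = 2  # assumed cap on jobs in the queue at once

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE state (id INTEGER PRIMARY KEY, state TEXT)")
conn.executemany("INSERT INTO state (state) VALUES (?)",
                 [('RUNNING',), ('NEW',), ('NEW',), ('NEW',)])

queued = conn.execute(
    "SELECT count(*) FROM state WHERE state IN ('SUBMITTED', 'RUNNING')").fetchone()[0]
slots = max(0, MAX_QUEUED - queued)
to_submit = conn.execute(
    "SELECT id FROM state WHERE state = 'NEW' ORDER BY id LIMIT ?", (slots,)).fetchall()
# one slot free, so only one NEW job is submitted this pass
```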
|
This commit updates plot-fit-results to use the median when plotting the energy
and position bias and the interquartile range (divided by 1.35, the ratio of
the IQR to sigma for a Gaussian) when plotting the energy and position
resolution. The reason is that single large outliers for higher energy muons
were causing the energy bias and resolution to no longer represent the central
part of the distribution well.
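A sketch of the robust estimators with a single outlier (toy numbers):

```python
import numpy as np

# toy reconstructed/true energy ratios with one large outlier
ratio = np.array([0.9, 0.95, 1.0, 1.05, 1.1, 5.0])

bias = np.median(ratio) - 1.0             # the outlier barely moves the median
q75, q25 = np.percentile(ratio, [75, 25])
resolution = (q75 - q25)/1.35             # IQR/1.349 equals sigma for a Gaussian
```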
|
|
This commit fixes a small bug in cat-grid-jobs which was causing it to print
the wrong filename when there was no git_sha1 attrs in the HDF5 file.
|
|
I noticed that many of my jobs were failing with the following error:
module: command not found
My submit description files *should* only be selecting nodes with modules because of this line:
requirements = (HAS_MODULES =?= true) && (OSGVO_OS_STRING == "RHEL 7") && (OpSys == "LINUX")
which I think I got from
https://support.opensciencegrid.org/support/solutions/articles/12000048518-accessing-software-using-distributed-environment-modules.
I looked up what the =?= operator does: it's ClassAd's "meta-equal" operator, a
case-sensitive comparison that evaluates to TRUE or FALSE even when one side is
UNDEFINED. I also found another site
(https://support.opensciencegrid.org/support/solutions/articles/5000633467-steer-your-jobs-with-htcondor-job-requirements)
which uses the normal == operator, so I'm going to switch to the == operator
and hope that fixes the issue.
|
|
This commit fixes a bug in gen-dark-matter which was causing the positions to
all be generated with positive x, y, and z values. Doh!
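For reference, uniform positions in a sphere need cos(theta) sampled over the full [-1, 1] range and the radius weighted by the cube root; a numpy sketch (not the actual gen-dark-matter code, and the radius is a placeholder):

```python
import numpy as np

rng = np.random.RandomState(42)
R, n = 840.0, 1000

u = rng.uniform(-1.0, 1.0, n)             # cos(theta) over the FULL range, not [0, 1]
phi = rng.uniform(0.0, 2*np.pi, n)        # azimuth over the full circle
r = R*rng.uniform(0.0, 1.0, n)**(1.0/3)   # uniform in volume
s = np.sqrt(1 - u**2)
xyz = np.column_stack([r*s*np.cos(phi), r*s*np.sin(phi), r*u])
```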
|
This commit updates the zdab-reprocess script with the following changes:
- don't run RAA
- don't run a bunch of the other fitters like FTI, etc.
- run the path and RSP fitters
- add --lower-nhit and --upper-nhit command line arguments to control the
primary nhit cut range
|
This commit updates plot-energy to select prompt events before applying the
data cleaning cuts. This fixes an issue where we might accidentally classify an
event as a prompt event even if it came after an event that was flagged by data
cleaning. For example, suppose there was a breakdown but for whatever reason
the event immediately after the breakdown wasn't tagged (ignoring the fact that
we apply a breakdown follower cut). If we apply the data cleaning first and
then the prompt event selection, that event would be a part of the prompt
events.
There are several other small updates to plot-energy:
- fix a bug in the 00-orphan cut
- make the michel event selection a separate function
- make the atmospheric tagging a separate function
|
|
This commit adds the sub_run variable to the ev array in the HDF5 output file
and updates plot-energy to order the events using the run and sub_run
variables. This fixes a potential issue where I was sorting by GTID before, but
the GTID can wrap around and so isn't guaranteed to put the events in the right
order.
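The ordering fix in pandas terms (keeping gtid as the within-sub_run tiebreaker is my assumption):

```python
import pandas as pd

# a wrapped GTID would sort a later event first if we sorted on gtid alone
ev = pd.DataFrame({'run':     [100, 100, 100],
                   'sub_run': [1,   0,        0],
                   'gtid':    [5,   16777000, 10]})
ev = ev.sort_values(['run', 'sub_run', 'gtid'])
# sub_run order is now 0, 0, 1 regardless of the wrapped gtid
```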
|
This commit updates cat-grid-jobs to only warn once about a mismatch between
the SHA1 of the current zdab-cat program and the grid results, and also cleans
up some of the output.
|
|
This commit updates the submit-grid-jobs script to use my version of splitext()
which removes the full extension from the filename. This fixes an issue where
the output HDF5 files had xzdab in the name whenever the input file had the
file extension .xzdab.gz.
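The described behavior (strip the full, possibly compound, extension) can be sketched by applying os.path.splitext repeatedly; this is a reimplementation of the behavior, not necessarily the script's actual code:

```python
import os

def splitext(path):
    """Like os.path.splitext, but removes the full extension:
    'foo.xzdab.gz' -> ('foo', '.xzdab.gz')."""
    full_ext = ''
    root, ext = os.path.splitext(path)
    while ext:
        full_ext = ext + full_ext
        root, ext = os.path.splitext(root)
    return root, full_ext
```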
|
This commit fixes the FTS cut so that it returns 1 when the event is flagged as
failing the cut. Previously, the function returned the following:
not enough PMT pairs: 0 (fail)
median time > 6.8 ns: 0 (fail)
otherwise: 1 (pass)
This had two issues: the return value wasn't consistent with the rest of the
data cleaning cuts and it should pass if there aren't enough PMT pairs.
Now, I fixed the not enough PMT pairs case and made the return value consistent
with the rest of the data cleaning cuts:
not enough PMT pairs: 0 (pass)
median time > 6.8 ns: 1 (fail)
otherwise: 0 (pass)
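The fixed return convention as a sketch (the pair-count threshold is an assumed placeholder, not the cut's actual value):

```python
import numpy as np

def fts_flag(pair_times, max_median=6.8, min_pairs=2):
    """Return 1 if the event fails the FTS cut, 0 if it passes,
    matching the convention of the other data cleaning cuts.
    min_pairs is an assumed threshold for illustration."""
    if len(pair_times) < min_pairs:
        return 0   # not enough PMT pairs: pass
    return 1 if np.median(pair_times) > max_median else 0
```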
|