This commit updates dm-search to fix a bug where I was returning lists
from get_limits() but then comparing the best fit and discovery lists as
if they were numpy arrays.
I also updated how I calculate the expected number of events based on
the results of a toy MC study.
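The bug class is easy to reproduce. A minimal sketch (the names below are illustrative, not the actual get_limits() return values): comparing two plain Python lists with `<` does a single lexicographic comparison, while the limit code needs the elementwise comparison that numpy arrays give.

```python
import numpy as np

# Hypothetical stand-ins for the lists get_limits() was returning:
best_fit = [1.0, 2.0, 3.0]
discovery = [0.5, 2.5, 3.5]

# Comparing the lists directly yields ONE bool (lexicographic order),
# not the per-element comparison the limit-setting code expects:
lexicographic = best_fit < discovery

# Converting to numpy arrays gives the intended elementwise result:
elementwise = np.asarray(best_fit) < np.asarray(discovery)
```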
This commit updates the python code to work with python 3 and with a
newer version of matplotlib.
- izip_longest -> zip_longest
- fix tick marks for log plots
- scipy.misc -> scipy.special
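For the itertools rename, a try/except keeps a script importable on both interpreter versions (Python 2's itertools.izip_longest became itertools.zip_longest in Python 3); the scipy move is analogous, e.g. scipy.misc.comb -> scipy.special.comb:

```python
# izip_longest (Python 2) was renamed to zip_longest in Python 3:
try:
    from itertools import zip_longest                   # Python 3
except ImportError:
    from itertools import izip_longest as zip_longest  # Python 2

# Similarly, several functions moved out of scipy.misc in newer scipy,
# e.g. scipy.misc.comb -> scipy.special.comb (not imported here).
pairs = list(zip_longest([1, 2, 3], ["a"], fillvalue=None))
```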
This commit adds a script called print-event-rate which calculates the
event rate per year from the GENIE MCPL files. The livetime comes from
the autosno run_info.log file. The output of the script is a LaTeX
table that I included in my thesis.
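A sketch of the core arithmetic, under stated assumptions: the channel names, the run_info.log parsing, and the table layout are hypothetical; only the counts-over-livetime scaling to events/year is shown.

```python
# Hypothetical sketch of the rate calculation in print-event-rate:
# given a livetime in seconds and an event count per channel, scale to
# events/year and emit one LaTeX table row per channel.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def rate_table(counts, livetime):
    rows = []
    for channel, n in sorted(counts.items()):
        rate = n / livetime * SECONDS_PER_YEAR  # events/year
        rows.append(r"%s & %.2f \\" % (channel, rate))
    return "\n".join(rows)
```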
This commit updates the chi2 and dm-search scripts to add the ability to
pass a run list on the command line.
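A minimal sketch of the option, assuming the scripts use argparse; the flag name --runs is illustrative, not necessarily the actual spelling in chi2/dm-search.

```python
import argparse

# nargs="+" accepts one or more run numbers after the flag;
# default=None lets the script fall back to processing all runs.
parser = argparse.ArgumentParser()
parser.add_argument("--runs", type=int, nargs="+", default=None,
                    help="runs to process (default: all runs)")
args = parser.parse_args(["--runs", "10000", "10001"])
```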
This commit updates get_events() to calculate the livetime based on
both the number of pulse GT events and the 10 MHz clock, and to return
it in a dictionary stored with the dataframe.
I also updated dm-search so that the results are now reported as a
function of events/cm^3/s.
The radius cut was also updated to be the AV radius.
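The two estimates can be sketched as follows. Assumptions are flagged in the comments: the pulse GT rate of 5 Hz is an assumed nominal trigger rate, and the dictionary keys are illustrative, not the actual names stored with the dataframe.

```python
PULSE_GT_RATE = 5.0   # Hz, assumed nominal pulse GT trigger rate
CLOCK_FREQ = 10e6     # Hz, the 10 MHz clock

def livetimes(n_pulse_gt, clock_ticks):
    # Two independent livetime estimates for the run: one from counting
    # pulse GT triggers, one from the span of 10 MHz clock ticks.
    return {
        "livetime_pulse_gt": n_pulse_gt / PULSE_GT_RATE,
        "livetime_clock": clock_ticks / CLOCK_FREQ,
    }
```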
This commit updates both the chi2 and dm-search scripts to run nlopt at
the end of the MCMC to find the best fit point.
The reason for this is that for the dm-search script, we rely on the
best fit point being correct when doing the discovery threshold
analysis. To really get the best fit point with the MCMC we would need
to run a ridiculous number of steps, so it's better to run a fewer
amount of steps to get close and then run the minimization from there.
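The pattern can be sketched as below. The scripts use nlopt; this sketch substitutes scipy's Nelder-Mead purely to illustrate the idea of polishing the chain's best sample with a derivative-free local minimizer, and all names are illustrative rather than the actual dm-search/chi2 code.

```python
import numpy as np
from scipy.optimize import minimize

def polish_best_fit(nll, chain, chain_nlls):
    # Seed the local minimizer at the best point the MCMC visited,
    # then refine it with a derivative-free method.
    x0 = chain[np.argmin(chain_nlls)]
    res = minimize(nll, x0, method="Nelder-Mead")
    return res.x, res.fun

# Toy negative log likelihood with its minimum at (1, 2):
nll = lambda x: (x[0] - 1.0)**2 + (x[1] - 2.0)**2
chain = np.array([[0.0, 0.0], [0.9, 1.8], [2.0, 3.0]])
xbest, fbest = polish_best_fit(nll, chain,
                               np.array([nll(x) for x in chain]))
```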
This commit removes an unnecessary call to set the index on the ev
dataframe. This was causing issues when trying to process run 11903
since it didn't have any fits to merge into the ev dataframe.
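The failure mode can be sketched with pandas: a left merge handles a run whose fits dataframe is empty gracefully (the fit columns just come back as NaN), so no extra index manipulation is needed afterwards. Column names here are illustrative.

```python
import pandas as pd

# Two triggered events from a run, and an empty fits dataframe (the
# run-11903 case: no fits to merge in). Dtypes are set explicitly so
# the merge keys match.
ev = pd.DataFrame({"run": [11903, 11903], "gtid": [1, 2]})
fits = pd.DataFrame({"run": pd.Series(dtype="int64"),
                     "gtid": pd.Series(dtype="int64"),
                     "energy": pd.Series(dtype="float64")})

# A left merge keeps every ev row and fills missing fits with NaN:
merged = ev.merge(fits, on=["run", "gtid"], how="left")
```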
This commit updates the retrigger cut to cut events where the previous
event is missing, so that even if I forget to run the analysis with all
the orphan events included, we will cut events potentially coming after
an instrumental event or muon that got cut by the junk cut.
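A sketch of the updated logic, with assumed column names (gtid, and a timestamp t in ns) and an assumed follower window; an event is cut either if it follows its predecessor too closely or if its predecessor's GTID is absent from the dataframe, since that predecessor may have been an instrumental removed earlier.

```python
import pandas as pd

def retrigger_cut(ev, window=1000.0):
    # Sort by GTID so diffs compare each event with its predecessor.
    ev = ev.sort_values("gtid").reset_index(drop=True)
    close = ev["t"].diff() < window        # follower within the window
    prev_missing = ev["gtid"].diff() > 1   # preceding GTID was removed
    return ev[~(close | prev_missing)]
```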
This commit updates get_events() to only cut orphans instead of all JUNK
events before calculating time differences. The reason is that some
large instrumental events (or muons) can get flagged as JUNK events
since they sometimes have multiple hits from the same PMT. If we remove
them before calculating the time difference, the follower might not get
cut. See run 10040 GTID 349491 for an example.
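The cut ordering described above can be sketched as follows, with assumed boolean flag columns (orphan, junk) and timestamp t: time differences are computed after removing only the orphans, so a large instrumental flagged as JUNK still tags its retriggers via the diff, and the JUNK cut is applied only afterwards.

```python
import pandas as pd

def get_dt(ev):
    # Drop only orphans before computing time differences, so JUNK
    # events still contribute to dt for their followers...
    clean = ev[~ev["orphan"]].sort_values("t").copy()
    clean["dt"] = clean["t"].diff()
    # ...and only then remove the JUNK events themselves.
    return clean[~clean["junk"]]
```

If the JUNK events were dropped first, the follower of a flagged instrumental would see a large dt to the event before it and survive the retrigger cut.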
This commit fixes a typo in submit-grid-jobs-queue by changing
get_job() to get_entry(). I also decided not to remove held jobs for
now, because
it's easier for me to see how many are held when running condor_q.
This commit updates the chi2 script to add scale factors for the
atmospheric neutrino scale and muon scale parameters. The reason for
this is that these parameters should now have a much easier
interpretation than before. Now, the atmospheric neutrino scale
parameter is relative to the expected atmospheric neutrino flux (i.e. we
expect the fit to return something close to 1), and the muon scale
parameter is the total number of expected muons in our sample, which is
exactly what the data cleaning analysis gives us.
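The reparameterisation can be sketched as below. All names and the reference number are hypothetical: the point is only that the atmospheric parameter multiplies the expected atmospheric count (so the fit should return roughly 1 at nominal), while the muon parameter is directly the expected number of muons.

```python
ATM_EXPECTED = 100.0  # assumed expected number of atmospheric events

def expected_events(atm_scale, mu_scale, atm_expected=ATM_EXPECTED):
    # atm_scale is relative to the expected atmospheric flux (~1 at
    # nominal); mu_scale is itself the expected number of muons, the
    # quantity the data cleaning analysis provides.
    return atm_scale * atm_expected + mu_scale
```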