Age | Commit message | Author |
|
stopping muons
|
- use pd.Series.where() instead of DataFrame.loc[] to speed things up in
tag_michels
- don't set y limits when plotting bias and resolution for stopping
muons
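The first bullet's pattern can be sketched like this (the column name `ke`, the threshold, and the `michel` flag are illustrative, not the actual tag_michels logic):

```python
import pandas as pd

# Hypothetical sketch of the speed-up: build the boolean column in one
# vectorized Series.where() call instead of a row-wise .loc assignment.
df = pd.DataFrame({"ke": [10.0, 55.0, 30.0, 70.0]})

# slower pattern: df.loc[df.ke > 50, "michel"] = True (plus filling NaNs)
df["michel"] = pd.Series(True, index=df.index).where(df.ke > 50, False)
```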
|
- add get_multinomial_prob() function to stats.py
- add plot_hist2_data_mc() function to do the normal particle ID plot
but also print p-values
- other small bug fixes
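A sampling version of get_multinomial_prob() might look like this — a sketch of the idea (the fraction of histograms drawn from the MC expectation that are at least as unlikely as the data), not necessarily the actual stats.py implementation:

```python
import numpy as np
from math import lgamma

def multinomial_logpmf(x, n, p):
    # log P(x | n, p) of a multinomial histogram
    return (lgamma(n + 1)
            - sum(lgamma(xi + 1) for xi in x)
            + sum(xi * np.log(pi) for xi, pi in zip(x, p) if xi > 0))

def get_multinomial_prob(data, mc, n_samples=10_000, rng=None):
    # p-value: fraction of sampled histograms at least as unlikely as data
    rng = rng or np.random.default_rng(0)
    N = int(np.sum(data))
    p = np.asarray(mc, float) / np.sum(mc)
    obs = multinomial_logpmf(data, N, p)
    sampled = rng.multinomial(N, p, size=n_samples)
    logpmfs = np.array([multinomial_logpmf(s, N, p) for s in sampled])
    return float(np.mean(logpmfs <= obs))
```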
|
This commit adds the new file sddm/stats.py, which contains a function to
correctly sample a Monte Carlo histogram when computing p-values. In
particular, I now take into account the uncertainty on the total number
of expected events by drawing from a gamma distribution, which is the
posterior of the Poisson likelihood function with a prior of 1/lambda.
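The sampling scheme described above can be sketched as follows (the function name is a placeholder, not the real sddm/stats.py API; with a 1/lambda prior, the posterior for N observed MC events is a gamma distribution with shape N and scale 1):

```python
import numpy as np

def sample_mc_hist(hist, rng=None):
    # Sketch: sample a Monte Carlo histogram including the uncertainty
    # on the total number of expected events.
    rng = rng or np.random.default_rng(0)
    hist = np.asarray(hist, float)
    N = hist.sum()
    total = rng.gamma(N, 1.0)       # posterior draw for the total
    p = hist / N                    # observed MC bin fractions
    return rng.poisson(total * p)   # fluctuate each bin
```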
|
- only look at muons with nhit < 4000 and udotr < -0.5
- switch from energy1 -> ke
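The selection in the first bullet can be sketched as a pandas mask (column names are taken from the bullet; the sample data is made up):

```python
import pandas as pd

# Hypothetical sketch of the muon selection described above
ev = pd.DataFrame({"nhit": [100, 5000, 200], "udotr": [-0.9, -0.9, 0.1]})
muons = ev[(ev.nhit < 4000) & (ev.udotr < -0.5)]
```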
|
This commit adds a first draft of a script to plot the Michel energy
distribution and particle ID histograms for data and Monte Carlo and to
plot the energy bias and resolution for stopping muons.
|
This commit adds a first draft of a script called chi2. This script calculates
a chi2 statistic for the null hypothesis test of whether the events in the
energy range 20 MeV - 10 GeV match what we expect from atmospheric neutrino
events.
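The statistic itself is presumably along these lines (a minimal Pearson chi2 sketch; the actual chi2 script may define its statistic and binning differently):

```python
import numpy as np

def chi2(data, expected):
    # Pearson chi2 for the null hypothesis that the observed energy
    # spectrum matches the atmospheric-neutrino expectation
    data = np.asarray(data, float)
    expected = np.asarray(expected, float)
    return float(np.sum((data - expected) ** 2 / expected))
```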
|
This commit updates get_events() to require at least 1 NHIT trigger to fire.
The reason for this is that after looking at a small fraction of the data I
noticed a bunch of instrumental events that weren't getting tagged in run
10141. They looked sort of like neck events and were surrounded by hundreds of
orphaned PMT hits. My best guess is that they really were neck events but the
neck PMT hits and the hits in the lower hemisphere were erroneously not getting
built into the events.
Luckily, all of these events failed the psi cut, but it's not great to rely on
such a high-level cut to remove them. One other thing I noticed was that
these events were triggered mostly by MISS, OWLEL, and OWLEH. Therefore I
thought it might be a good idea to require events to have at least 1 NHIT
trigger. To test whether the NHIT triggers were reliably firing before the
other triggers I looked at all muon events which *didn't* have an NHIT trigger
fire. All of them appeared to be falsely tagged neck events so I'm fairly
confident that the NHIT triggers do reliably fire before the other triggers for
physics events.
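The NHIT requirement amounts to testing the trigger word against a mask, something like this (the bit positions below are placeholders, not the real SNO trigger map):

```python
# Hypothetical sketch: require at least one NHIT trigger bit to be set
NHIT_BITS = [0, 1, 2, 3, 4]   # e.g. N100LO, N100MED, N100HI, N20, N20LB
NHIT_MASK = sum(1 << b for b in NHIT_BITS)

def has_nhit_trigger(trg_word):
    return (trg_word & NHIT_MASK) != 0
```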
|
This commit updates cat-grid-jobs to just add all the fits at once at the end
instead of continuously resizing the fits dataset. The reason for this is that
I noticed that several fit results files would occasionally have a large block
of the fits be set to all zeros. I have no idea how this happened, but I
suspect it might have been a bug with resizing the dataset so many times.
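The write-once approach reduces to something like this (a generic sketch of accumulating rows and building the final array in one operation; the real script writes the result to an HDF5 dataset):

```python
import numpy as np

def gather_fits(job_arrays):
    # Accumulate every job's fit rows in a list and build the final
    # dataset in one shot, instead of resizing the dataset once per job
    # (the repeated resizes were the suspected source of zeroed rows).
    rows = [np.atleast_2d(a) for a in job_arrays]
    return np.concatenate(rows) if rows else np.empty((0, 0))
```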
|
This commit updates cat-grid-jobs to change the reprocessed attribute to be 1
instead of True since the previous value was giving the following warning:
/usr/lib64/python2.7/site-packages/tables/attributeset.py:298: DataTypeWarning: Unsupported type for attribute 'reprocessed' in node '/'. Offending HDF5 class: 8
value = self._g_getattr(self._v_node, name)
when opening the files with pandas' read_hdf() function.
|
instrumentals
This commit updates the contamination analysis scripts to take into account the
fact that we only fit a fraction of some of the instrumental events.
Based on the recent rate at which my jobs have been running on the grid,
fitting all the events would take *way* too long. Therefore, I'm now planning
to only fit 10% of muon, flasher, and neck events. With this commit the
contamination analysis will now correctly take into account the fact that we
aren't fitting all the instrumental events.
|
- label instrumentals in get_events()
- tag michels and stopping muons in get_events()
- update submit-grid-jobs to use get_events()
|
- print out mean acceptance fraction and autocorrelation time
- plot the standard normal distribution pdf with the pull plot histograms
|
This commit updates the step size used for the MCMC in the contamination
analysis to 0.5 times the error returned by scanning near the minimum. I ran
some tests and this seemed to be pretty efficient compared to either the full
error or 0.1 times the error. I also reduced the number of workers to 10.
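The role of the step size can be sketched with a single Metropolis update whose Gaussian proposal width is the scan error times the chosen scale (illustrative only, not the actual MCMC code):

```python
import numpy as np

def metropolis_step(x, logp, err, rng, scale=0.5):
    # One Metropolis update; the commit settled on scale=0.5 times the
    # errors returned by scanning near the minimum.
    proposal = x + rng.normal(0.0, scale * np.asarray(err))
    if np.log(rng.uniform()) < logp(proposal) - logp(x):
        return proposal, True
    return x, False
```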
|
This commit updates the plot_energy.py module to not import from plot globally
since that means matplotlib gets pulled in by submit-grid-jobs and that causes
errors on the grid login nodes.
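The fix amounts to deferring the import into the function body, so that merely importing the module never pulls in matplotlib (`plot_hist` is a made-up name for illustration):

```python
import sys

def plot_hist(values, bins=50):
    # deferred import: only triggered when a plot is actually made,
    # so grid login nodes that import this module are unaffected
    import matplotlib.pyplot as plt
    plt.hist(values, bins=bins)
    plt.show()
```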
|
- fix Constraint.renormalize_no_fix() which could enter an infinite loop if the
fixed parameter was greater than 1 - EPSILON
- don't divide by psi twice in get_events()
- only use prompt events and cut on nhit_cal < 100
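The infinite-loop fix in the first bullet can be sketched like this (a guess at the shape of the guarded renormalization, not the real Constraint implementation):

```python
EPSILON = 1e-10

def renormalize_no_fix(params, fix):
    # Scale every parameter except `fix` so the total is 1, bailing out
    # when the fixed parameter already exceeds 1 - EPSILON (the case
    # that used to loop forever).
    if params[fix] >= 1 - EPSILON:
        return list(params)
    other = sum(p for i, p in enumerate(params) if i != fix)
    if other == 0:
        return list(params)
    scale = (1 - params[fix]) / other
    return [p if i == fix else p * scale for i, p in enumerate(params)]
```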
|
This commit contains the following updates:
- remove hack to get rid of low energy events in plot-energy since while
writing the unidoc I realized it's not necessary now that we add +100 to
multi-particle fits
- update Ockham factor to use an energy resolution of 5%
- update submit-grid-jobs to submit jobs according to the following criteria:
- always submit prompt events with no data cleaning cuts
- submit 10% of prompt flasher events
- submit all other prompt events
- submit followers only if they have no data cleaning cuts
- update submit-grid-jobs to place the nhit cut of 100 on the calibrated nhit
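The submission criteria above can be condensed into a predicate like this (argument names and structure are my assumptions, not the real submit-grid-jobs interface):

```python
import random

def should_submit(prompt, flasher, has_dc_cuts, rng=random.Random(0)):
    # Sketch of the criteria: all prompt events except flashers,
    # 10% of prompt flashers, and followers only with no DC cuts
    if prompt:
        if flasher:
            return rng.random() < 0.1
        return True
    return not has_dc_cuts
```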
|
This commit contains the following small updates:
- create a setup_matplotlib() function to set up matplotlib correctly depending
on whether we are saving the plots or just displaying them
- change default font size to 12 when displaying plots
- switch to using logarithmic bins in plot-energy
- fix despine() function when x axis is logarithmic
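The idea behind setup_matplotlib() can be sketched as a choice of rc settings (the exact settings here are illustrative; saved plots would use the non-interactive Agg backend, displayed plots the 12 pt font from the bullet above):

```python
def matplotlib_rc(save=False):
    # Hypothetical helper returning the settings to apply in each mode
    if save:
        return {"backend": "Agg"}
    return {"font.size": 12}
```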
|
This commit updates utils/sddm/__init__.py to not import everything by default.
The reason is that the Open Science Grid login machine doesn't have the
scipy.stats module available by default.
|