2018-11-26  update electron range tables  (tlatorre)
This commit also adds a script to calculate the CSDA range for electrons from a table of the stopping power as a function of energy. We need this script since the NIST ESTAR website will only output the stopping power above a certain energy threshold (1 GeV for electrons). See https://physics.nist.gov/PhysRefData/Star/Text/ESTAR.html.
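
As a rough illustration of the calculation described above (not the actual script in the repository; the file name, column layout, and units below are assumptions), the CSDA range can be obtained by integrating the reciprocal of the stopping power over kinetic energy with the trapezoidal rule:

    import numpy as np

    def csda_range(energy, stopping_power):
        """Cumulative CSDA range [g/cm^2]: integral of dE/(dE/dx),
        measured from the lowest tabulated energy."""
        # trapezoidal rule applied to 1/(dE/dx) on the tabulated energy grid
        inv = 1.0/stopping_power
        steps = np.diff(energy)*0.5*(inv[1:] + inv[:-1])
        return np.concatenate(([0.0], np.cumsum(steps)))

    if __name__ == "__main__":
        # assumed two-column table: kinetic energy [MeV], total stopping power [MeV cm^2/g]
        energy, dedx = np.loadtxt("e_stopping_power.txt", unpack=True)
        for T, r in zip(energy, csda_range(energy, dedx)):
            print("%10.3f %12.4f" % (T, r))
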
2018-11-25  add a separate `dx_shower` parameter for the spacing of the shower track integral  (tlatorre)
2018-11-25  add a script to plot the fit results as a function of energy  (tlatorre)
2018-11-25  speed up fast likelihood calculation  (tlatorre)
This commit speeds up the fast likelihood calculation by only computing the time PDF for a single photon. Since the majority of the time in the fast likelihood calculation is spent computing the time PDF, this should speed things up by quite a bit. I suspect this won't have a big effect on the likelihood value, but I should do some more testing.
2018-11-25  speed up particle init  (tlatorre)
2018-11-25  update likelihood to make sure we integrate over at least 100 points  (tlatorre)
2018-11-25  add shower photons to fast likelihood calculation  (tlatorre)
2018-11-21  update TODO  (tlatorre)
2018-11-21  add tests for norm() and norm_cdf()  (tlatorre)
2018-11-21  speed up normalize() and interp[12]d()  (tlatorre)
2018-11-21  update TODO  (tlatorre)
2018-11-17  add some comments  (tlatorre)
2018-11-17  add guess_time() function to approximate the PMT hit time  (tlatorre)
This function is only used when the expected number of photons reaching a PMT is *very* small. In this case, we still need to estimate the PMT hit time PDF for indirect light, which is modelled as a flat distribution starting at the time where the PMT is most likely to be hit by direct light. We compute that most likely time by integrating the expected charge times the time and dividing by the total charge, so when the total charge is very small this ratio can introduce large errors. Note that this code already existed but was computed in the likelihood function; this commit just moves it to its own function to make things look nicer.
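
For reference, the charge-weighted mean time described above has the general shape sketched below (a hedged Python sketch with made-up names, not the actual guess_time() code). When the total charge in the denominator is tiny, the ratio becomes numerically unreliable, which is the regime guess_time() is meant to handle:

    import numpy as np

    def mean_direct_hit_time(t, expected_charge):
        """Charge-weighted mean of the sampled times: integral(q*t)/integral(q)."""
        total = np.trapz(expected_charge, t)
        if total <= 0.0:
            # degenerate case: essentially no expected charge anywhere along the track
            return t[np.argmax(expected_charge)]
        return np.trapz(t*expected_charge, t)/total
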
2018-11-17  speed up likelihood function and switch to using fixed dx  (tlatorre)
This commit speeds up the likelihood function by roughly 20% by using the precomputed track positions, directions, times, etc. instead of interpolating them on the fly. It also switches to computing the number of points to integrate along the track by dividing the track length by a specified spacing, currently set to 1 cm. This should hopefully speed things up at lower energies and result in more stable fits at high energies.
2018-11-16  update README with instructions for installing gsl and nlopt  (tlatorre)
2018-11-14  update plot.py to plot the distribution of fit times  (tlatorre)
2018-11-14  update TODO  (tlatorre)
2018-11-14  update TODO and small updates to likelihood calculation  (tlatorre)
2018-11-14  speed up get_path_length() by not computing the norm twice  (tlatorre)
2018-11-14  update plot.py  (tlatorre)
2018-11-14  update plot.py  (tlatorre)
2018-11-14  update plot.py to plot the log likelihood ratio  (tlatorre)
2018-11-14  update TODO  (tlatorre)
2018-11-14  update yaml format to store fit results as a dictionary indexed by the particle id  (tlatorre)
2018-11-14  update README and TODO  (tlatorre)
2018-11-14  fix some compiler warnings  (tlatorre)
2018-11-14  speed things up again  (tlatorre)
This commit speeds up the likelihood calculation by returning zero early if the angle between the PMT and the track is far from the Cerenkov angle. Specifically, we return zero if the angle is more than 5 "standard deviations" away, where the standard deviation is taken to be the RMS width of the angular distribution.
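
A minimal sketch of that early-exit test in Python (hypothetical names; the real code is in the C likelihood function, and whether the cut is applied in angle or cos(angle) space is my assumption):

    import numpy as np

    def near_cerenkov_cone(cos_theta, theta_c, rms_width, nsigma=5.0):
        """True if the PMT-track angle is within `nsigma` RMS widths of the
        Cerenkov angle theta_c, i.e. close enough to be worth the full calculation."""
        return abs(np.arccos(cos_theta) - theta_c) <= nsigma*rms_width
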
2018-11-14  speed things up by skipping zero values  (tlatorre)
2018-11-14  initialize static arrays  (tlatorre)
2018-11-11  update likelihood function to fit electrons!  (tlatorre)
To characterize the angular distribution of photons from an electromagnetic shower I came up with the following functional form:

    f(cos_theta) ~ exp(-abs(cos_theta - mu)^alpha/beta)

and fit this to data simulated using RAT-PAC at several different energies. I then fit the alpha and beta coefficients as a function of energy, each to the functional form

    c0 + c1/log(c2*T0 + c3)

where T0 is the initial energy of the electron in MeV and c0, c1, c2, and c3 are parameters which I fit.

The longitudinal distribution of the photons generated from an electromagnetic shower is described by a gamma distribution:

    f(x) = x**(a-1)*exp(-x/b)/(Gamma(a)*b**a)

This parameterization comes from the PDG "Passage of particles through matter" section 32.5. I also fit this to the data from my RAT-PAC simulation, but currently I am not using it; instead I use a simpler form from the PDG to calculate the coefficients (although I estimated the b parameter from the RAT-PAC data).

I also sped up the calculation of the solid angle by making a lookup table, since it was taking a significant fraction of the time to compute the likelihood function.
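
The functional forms quoted above can be written down directly; here is a short Python sketch (the coefficient values are placeholders and the numerical normalization of the angular form is my own addition, not something stated in the commit):

    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import gamma

    def angular_pdf(cos_theta, mu, alpha, beta):
        """Unnormalized angular distribution: exp(-|cos_theta - mu|^alpha / beta)."""
        return np.exp(-np.abs(cos_theta - mu)**alpha/beta)

    def energy_coefficient(T0, c0, c1, c2, c3):
        """Energy dependence fit for both alpha and beta: c0 + c1/log(c2*T0 + c3), T0 in MeV."""
        return c0 + c1/np.log(c2*T0 + c3)

    def longitudinal_pdf(x, a, b):
        """Gamma-distributed longitudinal profile: x^(a-1) exp(-x/b) / (Gamma(a) b^a)."""
        return gamma.pdf(x, a, scale=b)

    # normalize the angular form numerically over cos_theta in [-1, 1]
    # (mu, alpha, beta values here are placeholders, not fitted numbers)
    norm, _ = quad(angular_pdf, -1.0, 1.0, args=(0.75, 2.0, 0.1))
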
2018-11-04  delete solid_angle_fast since it wasn't working  (tlatorre)
2018-10-22  update plot.py  (tlatorre)
2018-10-22  update plot.py to plot the angular resolution  (tlatorre)
2018-10-21  set default number of points to 500  (tlatorre)
2018-10-21  fix a typo in snogen  (tlatorre)
2018-10-21  add a fast solid angle approximation to speed up the fast likelihood calculation  (tlatorre)
2018-10-21  speed up get_total_charge_approx() by precomputing some variables  (tlatorre)
2018-10-21  fix use of uninitialized variables  (tlatorre)
2018-10-19  don't call path_init() when doing the fast likelihood calculation to speed things up  (tlatorre)
2018-10-19  add MIN_RATIO_FAST to speed up the "fast" likelihood calculation  (tlatorre)
2018-10-19  speed up get_total_charge_approx()  (tlatorre)
2018-10-19  epsrel -> npoints  (tlatorre)
2018-10-19  add interp2d() for fast bilinear 2D interpolation  (tlatorre)
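
Bilinear interpolation on a regularly spaced grid looks roughly like the Python sketch below (the project's interp2d() is C and its argument conventions are unknown to me, so the signature here is an assumption):

    import numpy as np

    def interp2d(x, y, x0, dx, y0, dy, z):
        """Bilinear interpolation of the table z[i, j], tabulated at
        (x0 + i*dx, y0 + j*dy). No bounds checking: (x, y) must lie inside the grid."""
        i = int((x - x0)/dx)
        j = int((y - y0)/dy)
        tx = (x - (x0 + i*dx))/dx
        ty = (y - (y0 + j*dy))/dy
        return ((1 - tx)*(1 - ty)*z[i, j] + tx*(1 - ty)*z[i + 1, j] +
                (1 - tx)*ty*z[i, j + 1] + tx*ty*z[i + 1, j + 1])
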
2018-10-19  speed up sin_theta calculation by replacing sin(acos(cos_theta)) with sqrt(1-pow(cos_theta,2))  (tlatorre)
2018-10-19  update path integral to use a fixed number of points  (tlatorre)
I noticed when fitting electrons that the cquad integration routine was not very stable, i.e. it would return different results for *very* small changes in the fit parameters which would cause the fit to stall. Since it's very important for the minimizer that the likelihood function not jump around, I am switching to integrating over the path by just using a fixed number of points and using the trapezoidal rule. This seems to be a lot more stable, and as a bonus I was able to combine the three integrals (direct charge, indirect charge, and time) so that we only have to do a single loop. This should hopefully make the speed comparable since the cquad routine was fairly effective at only using as many function evaluations as needed. Another benefit to this approach is that if needed, it will be easier to port to a GPU.
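
In outline, the fixed-grid scheme described above amounts to something like the following Python sketch (function names are placeholders, not the real API; the actual code evaluates everything in a single C loop):

    import numpy as np

    def integrate_along_track(track_length, npoints, direct_q, indirect_q, time_q):
        """Evaluate all three integrands on the same fixed grid and integrate
        each with the trapezoidal rule, so the track is only walked once."""
        s = np.linspace(0.0, track_length, npoints)
        fd = np.array([direct_q(si) for si in s])    # direct charge integrand
        fi = np.array([indirect_q(si) for si in s])  # indirect charge integrand
        ft = np.array([time_q(si) for si in s])      # time PDF integrand
        return np.trapz(fd, s), np.trapz(fi, s), np.trapz(ft, s)
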
2018-10-18  fix a bug in get_total_charge_approx()  (tlatorre)
This commit fixes a bug which was double counting the pmt response when computing the direct charge and incorrectly multiplying the reflected charge by the pmt response. I think this was just a typo left in when I added the reflected charge.
2018-10-18  update path_init() to check for a divide by zero  (tlatorre)
This commit updates path_init() to check that beta > 0 before dividing by it to compute the time. Previously when fitting electrons it would occasionally divide by zero which would cause the inf to propagate all the way through the likelihood function.
2018-10-18  make sure that the kinetic energy is zero at the last step  (tlatorre)
Occasionally when fitting electrons the kinetic energy at the last step would be high enough that the electron never crossed the BETA_MIN threshold which would cause the gsl routine to throw an error. This commit updates particle_init() to set the kinetic energy at the last step to zero to make sure that we can bisect the point along the track where the speed drops to BETA_MIN.
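
A toy Python illustration of why the last step must be forced to zero kinetic energy (using scipy's brentq in place of the GSL root finder; BETA_MIN, the mass, and the energy profile are all made up): a bracketing root finder needs beta - BETA_MIN to change sign across the interval, which is only guaranteed if the endpoint is pushed below the threshold.

    import numpy as np
    from scipy.optimize import brentq

    BETA_MIN = 0.8       # assumed threshold, purely for illustration

    def beta(T, mass=0.511):
        """Speed v/c for kinetic energy T [MeV] and rest mass [MeV]."""
        g = 1.0 + T/mass
        return np.sqrt(1.0 - 1.0/g**2)

    L = 10.0             # toy track length

    def T_of_s(s):
        """Toy kinetic energy profile that drops to exactly zero at the end of the track."""
        return 2.0*(1.0 - s/L)

    # because beta(T_of_s(L)) = 0 < BETA_MIN, the bracket [0, L] always contains a root
    s_threshold = brentq(lambda s: beta(T_of_s(s)) - BETA_MIN, 0.0, L)
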
2018-10-18  hardcode the density when computing dE/dx  (tlatorre)
Since we only have the range and dE/dx tables for light water for electrons and protons it's not correct to use the heavy water density. Also, even though we have both tables for muons, currently we only load the heavy water table, so we hardcode the density to that of heavy water. In the future, it would be nice to load both tables and use the correct one depending on if we are fitting in the heavy or light water.
2018-10-18  fix the likelihood function to return the *negative* log likelihood of the path coefficients  (tlatorre)
Previously I was adding the log likelihood of the path coefficients instead of the *negative* log likelihood! When fitting electrons this would sometimes cause the fit to become unstable and continue increasing the path coefficients without bound since the gain in the likelihood caused by increasing the coefficients was more than the loss caused by a worse fit to the PMT data. Doh!