path: root/src
Age         Commit message                                              Author
2018-12-14  add a function to compute combinations with replacement  (tlatorre)
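Combinations with replacement can be enumerated by treating each combination as a non-decreasing index array and advancing it like an odometer. A minimal sketch of that idea (this is an illustration of the standard algorithm, not the repository's actual implementation; the function name is hypothetical):

```c
#include <assert.h>

/* Advance c[0..r-1], a non-decreasing array of indices in [0, n), to the
 * next combination with replacement. Returns 1 if a new combination was
 * produced, 0 when all C(n+r-1, r) combinations are exhausted. */
static int next_comb_rep(int *c, int n, int r)
{
    int i = r - 1;
    /* Find the rightmost index that can still be incremented. */
    while (i >= 0 && c[i] == n - 1)
        i--;
    if (i < 0)
        return 0;
    c[i]++;
    /* Reset everything to the right to the new value to keep the array
     * non-decreasing (this is what allows repetition). */
    for (int j = i + 1; j < r; j++)
        c[j] = c[i];
    return 1;
}
```

Starting from an all-zero array and calling this in a loop visits every multiset exactly once, e.g. for n = 3, r = 2: {0,0}, {0,1}, {0,2}, {1,1}, {1,2}, {2,2}.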
2018-12-14  fix help string  (tlatorre)
2018-12-13  add some more comments and fix a memory leak  (tlatorre)
2018-12-13  add -fdiagnostics-color to Makefile  (tlatorre)
2018-12-13  add some comments  (tlatorre)
2018-12-13  update fit.c to fit multiple vertices  (tlatorre)
This commit adds a new function fit_event2() to fit multiple vertices. To
seed the fit, fit_event2() does the following:

- use the QUAD fitter to find the position and initial time of the event
- call find_peaks() to find possible directions for the particles
- loop over all possible unique combinations of the particles and
  direction vectors and do a "fast" minimization

The best minimum found from the "fast" minimizations is then used to
start the fit.

This commit has a few other updates:

- adds a hit_only parameter to the nll() function. This was necessary
  since previously PMTs which weren't hit were always skipped for the
  fast minimization, but when fitting for multiple vertices we need to
  include PMTs which aren't hit since we float the energy.
- add the function guess_energy() to guess the energy of a particle
  given a position and direction. This function estimates the energy by
  summing up the QHS for all PMTs hit within the Cerenkov cone and
  dividing by 6.
- fixed a bug which caused the fit to freeze when hitting ctrl-c during
  the fast minimization phase.
2018-12-13  update find_peaks_array() to return peaks in sorted order  (tlatorre)
2018-12-13  add function to compute unique direction vectors for a multi particle fit  (tlatorre)
2018-12-11  fix some compiler warnings  (tlatorre)
2018-12-11  add a function to find peaks using a Hough transform  (tlatorre)
2018-12-07  add the QUAD fitter  (tlatorre)
2018-12-04  don't quit when maxtime is reached  (tlatorre)
2018-12-04  fix bug  (tlatorre)
2018-12-04  add a command line parameter to control the maximum time of the fit  (tlatorre)
This commit adds a parameter to stop the fit if it takes longer than a
certain period of time in seconds. This parameter can be set on the
command line. For example, to limit fits to 10 minutes:

    $ ./fit FILENAME --max-time 600.0
2018-12-04  set a stopping criterion of 1% for the fit parameters  (tlatorre)
2018-12-03  update run-fit  (tlatorre)
2018-12-03  fix test-zebra  (tlatorre)
2018-12-03  add script to run multiple fits  (tlatorre)
2018-12-03  add a goodness of fit parameter psi to the fit  (tlatorre)
2018-11-30  sizeof()/sizeof() -> LEN()  (tlatorre)
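The LEN() macro referred to above is the conventional idiom for the element count of a statically sized array (this exact definition is an assumption about the repository, but it is the standard form of the idiom):

```c
#include <assert.h>

/* Number of elements in a statically sized array. Only valid on true
 * arrays, not on pointers (where sizeof gives the pointer size). */
#define LEN(x) (sizeof(x)/sizeof((x)[0]))
```

Replacing bare sizeof()/sizeof() divisions with a named macro makes the intent obvious and keeps the subscripted element in one place.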
2018-11-30  nll_muon -> nll and nll -> nopt_nll  (tlatorre)
2018-11-30  add ability to fit for multiple vertices  (tlatorre)
2018-11-28  update sno_charge.c  (tlatorre)
This commit adds lots of comments to sno_charge.c and makes a couple of
other changes:

- use interp1d() instead of the GSL interpolation routines
- increase MAX_PE to 100

I increased MAX_PE because I determined that it had a rather large
impact on the likelihood function for 500 MeV electrons. This
unfortunately slows down the initialization by a lot. I think I could
speed this up by convolving the single PE charge distribution with a
gaussian *before* convolving the charge distributions to compute the
charge distributions for multiple PE.
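The proposed speedup rests on convolution being associative: smearing the single-PE charge PDF with the Gaussian once, then self-convolving the smeared PDF, gives the same n-PE distributions as smearing each n-PE distribution separately. A plain discrete convolution on a uniform charge grid is enough to illustrate it (a sketch; all names are illustrative, not the sno_charge.c interface):

```c
#include <assert.h>
#include <math.h>

/* Discrete convolution out = a * b on a uniform grid with spacing dx,
 * assuming both PDFs are zero below the first bin. O(n^2), which is
 * fine for a one-time initialization table. */
static void convolve(const double *a, const double *b, double *out,
                     int n, double dx)
{
    for (int i = 0; i < n; i++) {
        out[i] = 0.0;
        for (int j = 0; j <= i; j++)
            out[i] += a[j]*b[i-j]*dx;
    }
}
```

With this, the fast path would be: smear the 1-PE PDF once, then iterate `convolve(smeared, npe_pdf, next, n, dx)` up to MAX_PE, rather than smearing all MAX_PE distributions individually.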
2018-11-27  add separate CHARGE_FRACTION variables for electrons and muons  (tlatorre)
2018-11-27  change PSUP_REFLECTION_TIME to 80 ns  (tlatorre)
See Bryce Moffat's thesis page 64.
2018-11-27  add Rayleigh scattering  (tlatorre)
This commit adds Rayleigh scattering to the likelihood function. The
Rayleigh scattering lengths come from rsp_rayleigh.dat from SNOMAN,
which only includes photons which scattered +/- 10 ns around the prompt
peak.

The fraction of light which scatters is treated the same in the
likelihood as reflected light, i.e. it is uniform across all the PMTs
in the detector and the time PDF is assumed to be a constant for a
fixed amount of time after the prompt peak.
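The flat time PDF described above for scattered/reflected light can be written out directly. A sketch under stated assumptions: the 80 ns window is borrowed from the PSUP_REFLECTION_TIME commit above, and the function name is illustrative:

```c
#include <assert.h>

#define SCATTER_WINDOW 80.0 /* ns, assumed, cf. PSUP_REFLECTION_TIME */

/* Constant time PDF for indirect (scattered or reflected) light:
 * uniform over a fixed window after the prompt peak time t_prompt,
 * zero elsewhere. Integrates to 1 over the window. */
static double scatter_time_pdf(double t, double t_prompt)
{
    if (t < t_prompt || t > t_prompt + SCATTER_WINDOW)
        return 0.0;
    return 1.0/SCATTER_WINDOW;
}
```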
2018-11-27  a bunch of small changes to speed things up  (tlatorre)
2018-11-27  update dx_shower to 10 cm to speed things up  (tlatorre)
2018-11-26  update .gitignore  (tlatorre)
2018-11-26  update electron range tables  (tlatorre)
This commit also adds a script to calculate the CSDA range for electrons from a table of the stopping power as a function of energy. We need this script since the NIST ESTAR website will only output the stopping power above a certain energy threshold (1 GeV for electrons). See https://physics.nist.gov/PhysRefData/Star/Text/ESTAR.html.
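The CSDA range the script computes is the integral of the reciprocal stopping power over energy, R(T0) = ∫ dE/(dE/dx). A minimal sketch with the trapezoid rule (the table contents in the test are illustrative, not real ESTAR values; the actual script may differ):

```c
#include <assert.h>
#include <math.h>

/* Continuous-slowing-down-approximation range from a stopping power
 * table: energy[] in MeV (ascending), dedx[] in MeV/cm, result in cm.
 * Integrates 1/(dE/dx) dE with the trapezoid rule. */
static double csda_range(const double *energy, const double *dedx, int n)
{
    double range = 0.0;
    for (int i = 1; i < n; i++)
        range += 0.5*(1.0/dedx[i] + 1.0/dedx[i-1])*(energy[i] - energy[i-1]);
    return range;
}
```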
2018-11-25  add a separate `dx_shower` parameter for the spacing of the shower track integral  (tlatorre)
2018-11-25  speed up fast likelihood calculation  (tlatorre)
This commit speeds up the fast likelihood calculation by only computing the time PDF for a single photon. Since the majority of the time in the fast likelihood calculation is spent computing the time PDF this should speed things up by quite a bit. I suspect this won't have a big effect on the likelihood value, but I should do some more testing.
2018-11-25  speed up particle init  (tlatorre)
2018-11-25  update likelihood to make sure we integrate over at least 100 points  (tlatorre)
2018-11-25  add shower photons to fast likelihood calculation  (tlatorre)
2018-11-21  add tests for norm() and norm_cdf()  (tlatorre)
2018-11-21  speed up normalize() and interp[12]d()  (tlatorre)
2018-11-17  add some comments  (tlatorre)
2018-11-17  add guess_time() function to approximate the PMT hit time  (tlatorre)
This function is only used when the expected number of photons reaching
a PMT is *very* small. In this case, we still need to estimate the PMT
hit time PDF for indirect light, which is modelled as a flat
distribution starting at the time when the PMT is most likely to be hit
from direct light.

Since we compute the most likely time for a PMT to be hit from direct
light by computing the integral of the expected charge times the time
and then dividing by the total charge, when the total charge is very
small this can introduce large errors.

Note that this code already existed, but it was computed in the
likelihood function. This commit just moves it to its own function to
make things look nicer.
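The charge-weighted mean time described above can be sketched as a discrete sum over precomputed track points (a sketch only; the array-based signature and fallback behavior are assumptions, not the repository's actual guess_time() interface):

```c
#include <assert.h>

/* Most likely direct-light hit time for a PMT: the integral of expected
 * charge times time, divided by the total charge, approximated as sums
 * over n precomputed track points. t[i] is the arrival time and mu[i]
 * the expected charge from point i. Falls back to t[0] when the total
 * charge is zero (the regime where this estimate becomes unreliable). */
static double mean_hit_time(const double *t, const double *mu, int n)
{
    double num = 0.0, den = 0.0;
    for (int i = 0; i < n; i++) {
        num += mu[i]*t[i];
        den += mu[i];
    }
    return den > 0.0 ? num/den : t[0];
}
```

When `den` is tiny the ratio amplifies noise in `mu`, which is exactly the large-error regime the commit message describes guess_time() as handling.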
2018-11-17  speed up likelihood function and switch to using fixed dx  (tlatorre)
This commit speeds up the likelihood function by about ~20% by using the precomputed track positions, directions, times, etc. instead of interpolating them on the fly. It also switches to computing the number of points to integrate along the track by dividing the track length by a specified distance, currently set to 1 cm. This should hopefully speed things up for lower energies and result in more stable fits at high energies.
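The fixed-dx point count described above is a one-liner. A sketch (names assumed; the 1 cm spacing is from this commit, and the 100-point floor is from the 2018-11-25 commit above):

```c
#include <assert.h>

#define FIT_DX       1.0  /* cm, integration step along the track */
#define MIN_NPOINTS  100  /* floor so short tracks still integrate well */

/* Number of points to integrate along a track of the given length (cm):
 * track length divided by the step size, with a minimum floor. */
static int npoints_for_track(double length)
{
    int n = (int)(length/FIT_DX) + 1;
    return n < MIN_NPOINTS ? MIN_NPOINTS : n;
}
```

Tying the point count to a physical distance rather than a fixed count is what makes low-energy (short) tracks cheap while keeping high-energy (long) tracks well sampled.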
2018-11-14  update TODO and small updates to likelihood calculation  (tlatorre)
2018-11-14  speed up get_path_length() by not computing the norm twice  (tlatorre)
2018-11-14  update yaml format to store fit results as a dictionary indexed by the particle id  (tlatorre)
2018-11-14  fix some compiler warnings  (tlatorre)
2018-11-14  speed things up again  (tlatorre)
This commit speeds up the likelihood calculation by returning zero early
if the angle between the PMT and the track is far from the Cerenkov
angle. Specifically, we check whether the angle is more than 5 "standard
deviations" away, where the standard deviation is taken to be the RMS
width of the angular distribution.
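The early return described above amounts to a cheap guard at the top of the expected-charge calculation. A sketch (function and parameter names are illustrative; the placeholder return stands in for the full calculation):

```c
#include <assert.h>
#include <math.h>

/* Expected charge at a PMT as a function of the cosine of the angle
 * between the track and the PMT. Returns zero immediately when the
 * angle is more than 5 RMS widths from the Cerenkov angle, skipping
 * the expensive part of the calculation. */
static double expected_charge(double cos_theta, double cos_theta_cerenkov,
                              double rms_width)
{
    if (fabs(cos_theta - cos_theta_cerenkov) > 5.0*rms_width)
        return 0.0;
    /* ... full (expensive) charge calculation would go here ... */
    return 1.0; /* placeholder for the sketch */
}
```

Because the angular distribution falls off rapidly, the charge beyond 5 RMS widths is negligible, so the cutoff trades essentially no accuracy for a large fraction of skipped work.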
2018-11-14  speed things up by skipping zero values  (tlatorre)
2018-11-14  initialize static arrays  (tlatorre)
2018-11-11  update likelihood function to fit electrons!  (tlatorre)
To characterize the angular distribution of photons from an
electromagnetic shower I came up with the following functional form:

    f(cos_theta) ~ exp(-abs(cos_theta - mu)^alpha/beta)

and fit this to data simulated using RAT-PAC at several different
energies. I then fit the alpha and beta coefficients as a function of
energy to the functional form:

    alpha = c0 + c1/log(c2*T0 + c3)
    beta = c0 + c1/log(c2*T0 + c3)

where T0 is the initial energy of the electron in MeV and c0, c1, c2,
and c3 are parameters which I fit.

The longitudinal distribution of the photons generated from an
electromagnetic shower is described by a gamma distribution:

    f(x) = x**(a-1)*exp(-x/b)/(Gamma(a)*b**a)

This parameterization comes from the PDG "Passage of particles through
matter" section 32.5. I also fit the data from my RAT-PAC simulation,
but currently I am not using it, and instead am using a simpler form to
calculate the coefficients from the PDG (although I estimated the b
parameter from the RAT-PAC data).

I also sped up the calculation of the solid angle by making a lookup
table since it was taking a significant fraction of the time to compute
the likelihood function.
2018-11-04  delete solid_angle_fast since it wasn't working  (tlatorre)
2018-10-21  set default number of points to 500  (tlatorre)