To characterize the angular distribution of photons from an electromagnetic
shower I came up with the following functional form:
f(cos_theta) ~ exp(-abs(cos_theta-mu)^alpha/beta)
and fit this to data simulated using RAT-PAC at several different energies. I
then fit the alpha and beta coefficients as a function of energy to the
functional form:
alpha = c0 + c1/log(c2*T0 + c3)
beta  = c0 + c1/log(c2*T0 + c3)
where T0 is the initial energy of the electron in MeV and c0, c1, c2, and c3
are parameters which I fit (a separate set for alpha and for beta).
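For reference, a minimal sketch (in C) of how this angular PDF and the energy
dependence of its shape parameters could be evaluated; the function names are
illustrative and the coefficient values come from the fits, nothing here is
the fitted result itself:

    #include <math.h>

    /* Unnormalized angular PDF f(cos_theta) ~ exp(-|cos_theta - mu|^alpha/beta). */
    static double angular_pdf(double cos_theta, double mu, double alpha, double beta)
    {
        return exp(-pow(fabs(cos_theta - mu), alpha)/beta);
    }

    /* Energy dependence c0 + c1/log(c2*T0 + c3) of a shape parameter, where T0
     * is the initial electron energy in MeV. The same form is used for both
     * alpha and beta, each with its own fitted coefficients. */
    static double shape_param(double T0, double c0, double c1, double c2, double c3)
    {
        return c0 + c1/log(c2*T0 + c3);
    }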
The longitudinal distribution of the photons generated from an electromagnetic
shower is described by a gamma distribution:
f(x) = x**(a-1)*exp(-x/b)/(Gamma(a)*b**a).
This parameterization comes from the PDG "Passage of particles through matter"
section 32.5. I also fit the data from my RAT-PAC simulation, but I am not
currently using that fit; instead I calculate the coefficients with the
simpler form from the PDG (although I estimated the b parameter from the
RAT-PAC data).
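The longitudinal profile is just a gamma PDF, which GSL already provides; a
sketch, with a and b taken from the PDG formulas or from the fit:

    #include <gsl/gsl_randist.h>

    /* Longitudinal profile f(x) = x^(a-1) exp(-x/b) / (Gamma(a) b^a), i.e. a
     * gamma distribution in the distance x along the shower axis. */
    static double longitudinal_pdf(double x, double a, double b)
    {
        return gsl_ran_gamma_pdf(x, a, b);
    }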
I also sped up the calculation of the solid angle by making a lookup table
since it accounted for a significant fraction of the time spent computing the
likelihood function.
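The lookup table is conceptually just a precomputed grid plus interpolation; a
rough sketch, assuming for illustration a 1D table in the distance to the PMT
(the real table may be binned differently):

    #include <stddef.h>

    #define N_TABLE 1000
    #define R_MAX   2000.0 /* maximum distance covered by the table (illustrative) */

    static double solid_angle_table[N_TABLE];

    /* Fill the table once at startup using the exact (slow) calculation. */
    void init_solid_angle_table(double (*exact_solid_angle)(double r))
    {
        for (size_t i = 0; i < N_TABLE; i++)
            solid_angle_table[i] = exact_solid_angle(i*R_MAX/(N_TABLE - 1));
    }

    /* Fast lookup with linear interpolation between table entries. */
    double solid_angle_fast(double r)
    {
        double x = r*(N_TABLE - 1)/R_MAX;
        size_t i = (size_t) x;
        if (i >= N_TABLE - 1)
            return solid_angle_table[N_TABLE - 1];
        double f = x - i;
        return (1 - f)*solid_angle_table[i] + f*solid_angle_table[i+1];
    }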
To fit the path of muons and electrons I use the Karhunen-Loeve expansion of a
random 2D walk in the polar angle in x and y. This allows you to decompose the
path into a sum over sine functions whose coefficients become random variables.
The nice thing about fitting the path in this way is that you can capture
*most* of the variation in the path with a small number of variables by
summing over only the first N terms in the expansion. It is also easy to
calculate the probability of the coefficients since they are all
uncorrelated.
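For the standard Karhunen-Loeve expansion of a Wiener process on [0,1] the
basis functions are sines, so one angle coordinate of the path looks
schematically like this (the fitter's exact normalization and number of terms
may differ):

    #include <math.h>

    #define KL_TERMS 5 /* number of terms kept in the truncated expansion */

    /* Deviation of the polar angle at normalized path position s in [0,1],
     * using the first KL_TERMS terms of the Karhunen-Loeve expansion of a
     * Wiener process. The coefficients z[k] are independent Gaussian random
     * variables, so their joint probability factorizes and is easy to
     * evaluate. */
    static double path_angle(double s, const double z[KL_TERMS])
    {
        double theta = 0.0;
        for (int k = 0; k < KL_TERMS; k++) {
            double w = (k + 0.5)*M_PI;
            theta += z[k]*M_SQRT2*sin(w*s)/w;
        }
        return theta;
    }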
This commit contains code to fit for the energy, position, and direction of
muons in the SNO detector. Currently, we read events from SNOMAN ZEBRA files,
fill an event struct containing the PMT hits, and fit it with the Nelder-Mead
simplex algorithm from GSL.
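For reference, a stripped-down version of what the GSL simplex driver looks
like; nll() here is just a placeholder for the actual event likelihood, and
the parameter layout is illustrative:

    #include <gsl/gsl_errno.h>
    #include <gsl/gsl_vector.h>
    #include <gsl/gsl_multimin.h>

    /* Placeholder negative log likelihood; the real function integrates the
     * expected charge and time over the track for every PMT hit. */
    static double nll(const gsl_vector *x, void *event)
    {
        (void) event;
        double sum = 0.0;
        for (size_t i = 0; i < x->size; i++)
            sum += gsl_vector_get(x, i)*gsl_vector_get(x, i);
        return sum;
    }

    /* Minimize nll() over the fit parameters x (e.g. energy, position, and
     * direction) with the Nelder-Mead simplex algorithm from GSL. */
    int fit_event(void *event, gsl_vector *x, gsl_vector *step)
    {
        gsl_multimin_function f = {nll, x->size, event};
        gsl_multimin_fminimizer *s = gsl_multimin_fminimizer_alloc(
            gsl_multimin_fminimizer_nmsimplex2, x->size);
        gsl_multimin_fminimizer_set(s, &f, x, step);

        int status;
        do {
            status = gsl_multimin_fminimizer_iterate(s);
            if (status) break;
            /* Stop when the simplex has shrunk below a characteristic size. */
            status = gsl_multimin_test_size(gsl_multimin_fminimizer_size(s), 1e-3);
        } while (status == GSL_CONTINUE);

        gsl_vector_memcpy(x, s->x);
        gsl_multimin_fminimizer_free(s);
        return status;
    }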
I've also added code to read ZEBRA title bank files, which is used to load the
DQXX files for a specific run. Any problems with channels in the DQCH and DQCR
banks are flagged in the event struct by masking in a bit in the flags
variable, and these PMT hits are not included in the likelihood calculation.
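The bad-channel flagging is just a bitmask on each hit; schematically (the bit
names here are illustrative, not the actual ones used in the code):

    /* Illustrative flag bits for a PMT hit. */
    #define HIT_FLAG_DQCH 0x01 /* channel flagged as bad in the DQCH bank */
    #define HIT_FLAG_DQCR 0x02 /* channel flagged as bad in the DQCR bank */

    /* A hit only enters the likelihood if no bad-channel bit is set. */
    #define HIT_IS_GOOD(flags) (((flags) & (HIT_FLAG_DQCH | HIT_FLAG_DQCR)) == 0)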
The likelihood for an event is calculated by integrating along the particle
track for each PMT and computing the expected number of PE. The charge
likelihood is then calculated by summing over all possible numbers of PE:
P(q|n)*P(n|mu)
where q is the calibrated QHS charge, n is the number of PE, and mu is the
expected number of photoelectrons. P(n|mu) is calculated assuming the number
of PE at a given PMT follows a Poisson distribution (which I think should be
correct given the track, but is probably not perfect for tracks which scatter
a lot).
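Schematically the charge likelihood is the following sum, where charge_pdf()
is a stand-in for P(q|n), the charge response for exactly n PE:

    #include <gsl/gsl_randist.h>

    #define MAX_PE 100 /* where to truncate the sum over the number of PE */

    /* Stand-in for P(q|n): the probability density of measuring calibrated
     * charge q given exactly n photoelectrons. */
    double charge_pdf(double q, int n);

    /* P(q|mu) = sum_n P(q|n) P(n|mu), with P(n|mu) taken to be Poisson.
     * The sum starts at n = 1 since the PMT registered a hit; the real code
     * may handle n = 0 and noise hits differently. */
    double charge_likelihood(double q, double mu)
    {
        double p = 0.0;
        for (int n = 1; n <= MAX_PE; n++)
            p += charge_pdf(q, n)*gsl_ran_poisson_pdf(n, mu);
        return p;
    }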
The time part of the likelihood is calculated by integrating over the track
for each PMT and calculating the average time at which the PMT is hit. We then
assume the PDF for the photon arrival times is approximately a delta function
and use the first order statistic to compute the probability that the first
photon arrived at a given time. So far I've only tested this with single
tracks, but the method was designed to be easy to use when fitting for
multiple particles.
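The key ingredient is the first order statistic: if n photons arrive
independently with PDF f(t) and CDF F(t), the earliest arrival has density
n f(t) (1 - F(t))^(n-1). A sketch, using a narrow Gaussian around the mean hit
time as a stand-in for the nearly delta-function arrival time PDF:

    #include <math.h>
    #include <gsl/gsl_randist.h>
    #include <gsl/gsl_cdf.h>

    /* Density of the earliest of n photon arrival times,
     * f1(t) = n f(t) (1 - F(t))^(n-1), with the single-photon arrival time
     * modelled as a Gaussian of width sigma (e.g. the PMT time resolution)
     * around the mean hit time t_mean computed from the track. */
    double first_arrival_pdf(double t, double t_mean, double sigma, int n)
    {
        double f = gsl_ran_gaussian_pdf(t - t_mean, sigma);
        double F = gsl_cdf_gaussian_P(t - t_mean, sigma);
        return n*f*pow(1.0 - F, n - 1);
    }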
circular disk