5.4.2 Determination of the correction $R_{ts}$
$R_{ts}$ is the ratio of the amplitude of the empirical, high-energy
Michel tail spectrum to that of the simulated $\pi \rightarrow e\nu(\gamma)$ spectrum.
The correction $R_{ts}$ accounts for the arbitrary number of events in the
simulated $\pi \rightarrow e\nu(\gamma)$ spectrum. $R_{ts}$ is determined by a
weighted least-squares fit in which the sum of the simulated
$\pi \rightarrow e\nu(\gamma)$ spectrum and the experimentally measured background
spectrum of Michel events is fit to the experimentally measured spectrum,
which contains both $\pi \rightarrow e\nu(\gamma)$ and Michel events.
and Michel events. Initially, three parameters were allowed to vary in the fit.
These were the overall normalization, the relative amplitudes of the simulated
signal to measured background spectra Rts, and a scaling
factor for the x- axis. After the initial fit, the x-axis scaling factor
was held constant and only the first two parameters were allowed to vary.
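The structure of this two-step fit can be illustrated with a minimal sketch.
It assumes binned spectra evaluated on a common set of energy bin centres; the
model form $N\,(\mathrm{sig} + R_{ts}\,\mathrm{bkg})$, the function and variable
names, and the use of scipy.optimize.curve_fit are illustrative assumptions,
not the CERNLIB procedure actually used in the analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical inputs (assumed, for illustration only):
#   bin_centres   : bin centres of the positron energy spectra
#   data, data_err: measured spectrum (pi -> e nu(gamma) + Michel) and its errors
#   sim_signal    : simulated pi -> e nu(gamma) spectrum (arbitrary normalization)
#   michel_bkg    : empirical high-energy Michel tail spectrum

def model(energy, norm, r_ts, x_scale, sim_signal, michel_bkg, bin_centres):
    """Sum of simulated signal and measured Michel background,
    evaluated on a rescaled energy axis."""
    scaled = energy * x_scale
    sig = np.interp(scaled, bin_centres, sim_signal)
    bkg = np.interp(scaled, bin_centres, michel_bkg)
    return norm * (sig + r_ts * bkg)

def fit_r_ts(bin_centres, data, data_err, sim_signal, michel_bkg):
    # Step 1: three-parameter weighted fit
    # (normalization, relative amplitude R_ts, x-axis scale).
    f3 = lambda e, norm, r_ts, s: model(e, norm, r_ts, s,
                                        sim_signal, michel_bkg, bin_centres)
    p3, _ = curve_fit(f3, bin_centres, data, p0=[1.0, 0.05, 1.0],
                      sigma=data_err, absolute_sigma=True)
    x_scale = p3[2]

    # Step 2: refit with the x-axis scale held constant; only the
    # normalization and R_ts are allowed to vary.
    f2 = lambda e, norm, r_ts: model(e, norm, r_ts, x_scale,
                                     sim_signal, michel_bkg, bin_centres)
    p2, cov2 = curve_fit(f2, bin_centres, data, p0=[p3[0], p3[1]],
                         sigma=data_err, absolute_sigma=True)
    r_ts, r_ts_err = p2[1], np.sqrt(cov2[1, 1])
    return r_ts, r_ts_err
```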
Figure 5.10 shows the experimental spectrum overlaid with the combined
simulated/empirical spectrum after the fit. The correction factor
$R_{ts}$ was determined by the fit to be $(5.87 \pm 0.11) \times 10^{-2}$,
where the statistical uncertainty is that returned by the CERNLIB fitting
routine.