Radioxenon measurements use the net-count method to determine the activity of the radioxenon isotopes of interest. Detected decay events are plotted on a beta-gamma coincidence histogram, and events are tallied inside regions of interest specific to each radioxenon isotope. The boundaries of these regions are based on both the resolution of the detector and the physics of the radiation emitted by the radioisotope. Gain drift in the detector invalidates the energy calibration, so decay events migrate outside their regions of interest; if the drift is large enough, the resulting activity measurements become inaccurate. One way to mitigate this effect is to enlarge the regions of interest. This presentation will demonstrate the effect gain shifts have on activity calculations, how larger regions reduce this effect, and the sensitivity cost that larger regions impose on the measurement.
By offering alternative activity-analysis techniques, this presentation supports improved nuclear test monitoring and verification.
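The region-of-interest tally and the gain-drift effect described above can be sketched numerically. The following is a minimal illustration only: the event distributions, region boundaries, and 5% drift value are assumed for demonstration and are not real radioxenon parameters or the analysis method of this work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated (beta, gamma) coincidence energies in keV, clustered near an
# illustrative coincidence peak (values are hypothetical, not Xe-133 data).
beta = rng.normal(100.0, 15.0, 10_000)
gamma = rng.normal(81.0, 5.0, 10_000)

def roi_counts(beta_e, gamma_e, beta_lo, beta_hi, gamma_lo, gamma_hi):
    """Tally events falling inside a rectangular region of interest."""
    mask = ((beta_e >= beta_lo) & (beta_e <= beta_hi) &
            (gamma_e >= gamma_lo) & (gamma_e <= gamma_hi))
    return int(mask.sum())

# Nominal region of interest, roughly +/-3 sigma around the peak.
roi = (55.0, 145.0, 66.0, 96.0)
baseline = roi_counts(beta, gamma, *roi)

# A 5% gain drift scales the recorded energies, pushing events past the
# region boundary and lowering the tallied count.
drift = 1.05
drifted = roi_counts(beta * drift, gamma * drift, *roi)

# A widened region recovers most of the drifted events, at the cost of
# accepting more background (i.e., reduced measurement sensitivity).
wide_roi = (45.0, 160.0, 61.0, 101.0)
drifted_wide = roi_counts(beta * drift, gamma * drift, *wide_roi)

print(baseline, drifted, drifted_wide)
```

Running the sketch shows the drifted tally falling below the baseline and the widened region recovering most of the lost events, mirroring the trade-off the presentation examines.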