The duration of infrasonic signals recorded by sensors on the Earth’s surface increases with source-to-receiver distance, owing to the growing separation between the fastest and slowest portions of the wavefield. Forty-three infrasound signals exhibiting high signal-to-noise ratios within the 0.32 to 1.28 Hz passband, and generated by well-constrained ground-truth events, have been analyzed using an algorithm developed to measure signal durations consistently. The signal database comprises recordings from microbarograph arrays of the International Monitoring System at distances of between 20 and 6300 km from the source. The results indicate that signal duration depends upon both source-to-receiver distance and the along-path variability in the strength of the stratospheric waveguide, as measured by the ratio of the effective sound speeds in the stratosphere and at the ground. For paths with low along-path waveguide variability, the signal duration, D (s), exhibits a weak linear relationship with source-to-receiver distance, r (km), such that D can be estimated as D = 0.278r + 122. Signal durations are reduced when the propagation paths exhibit waveguide strength variability. Ray-tracing simulations indicate that horizontal gradients in waveguide strength act to restrict the range of signal celerities that can be supported by stratospheric waveguide propagation.
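A minimal sketch of how the reported empirical relationship could be applied, assuming only the linear fit stated above (D = 0.278r + 122 for low along-path waveguide variability) and the 20–6300 km distance range of the database; the function name and the range check are illustrative, not part of the study's code.

```python
def estimate_signal_duration(range_km: float) -> float:
    """Estimate infrasound signal duration (s) from source-to-receiver distance (km).

    Uses the linear relationship D = 0.278*r + 122 reported for propagation
    paths with low along-path stratospheric waveguide variability.
    """
    # The relationship was derived from signals recorded at 20-6300 km;
    # restricting to that range is an assumption made here for safety.
    if not 20.0 <= range_km <= 6300.0:
        raise ValueError("Distance outside the 20-6300 km range of the source data.")
    return 0.278 * range_km + 122.0


# Example: an array roughly 1000 km from the source
print(f"Estimated duration: {estimate_signal_duration(1000.0):.0f} s")  # ~400 s
```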