The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) relies on innovation to enhance the capabilities of the Treaty’s verification regime and to help move the Treaty closer to universalization and entry into force. As the seventh event in the CTBT: Science and Technology conference series, SnT2023 will bring together well over 1000 scientists, technologists, academics, students, and representatives of the CTBTO’s policy-making organs. In addition, representatives from the fields of research and development, science diplomacy, science advisory bodies, media and advocacy are invited to attend the conference.
SnT2023 is scheduled to take place at the Hofburg Palace in Vienna, Austria, featuring virtual components for active online participation to support broader outreach and global inclusiveness. While restrictions on physical attendance at SnT2023 due to COVID-19 are not currently foreseen, the structure of the conference will be hybrid and remain flexible in order to adapt to any circumstances as needed.
The programme of the High-Level Opening (HLO) is under preparation.
Interpretation in all UN languages will be available for in-person attendance in the Festsaal.
Welcome video
Opening
• Dr Robert Floyd, CTBTO Executive Secretary
High Level Remarks
• Ms Izumi Nakamitsu, United Nations Under-Secretary-General and High Representative for Disarmament Affairs (video message)
• His Excellency Mr Abshir Omar Jama Huruse, Minister of Foreign Affairs and International Cooperation of Somalia
• Mr Rafael Grossi, Director General of the International Atomic Energy Agency
• Mr Mayen Dut Wol, Undersecretary, Ministry of Foreign Affairs and International Cooperation of South Sudan
• Ms Alinne Olvera Martinez, Early-Career Women in STEM CTBTO Mentoring Programme
Keynote Address
• Ms Jill Hruby, Under Secretary of Energy for Nuclear Security and Administrator of the National Nuclear Security Administration of the United States
Musical Interlude
• Trio KlaVis
High Level Panel
“CTBT Science and Technology: Benefitting us all”
• His Excellency Mr Li Song, Permanent Representative of the People's Republic of China
• Ms Dwikorita Karnawati, Director of the Indonesian Agency for Meteorology, Climatology and Geophysics and Chair of the Intergovernmental Coordination Group of Indian Ocean Tsunami Warning and Mitigation System
• Mr Geoff Brumfiel, Senior Editor and Correspondent, Science Desk at NPR
• Ms Antonietta Rizzo, Head of Laboratory for Methods and Techniques for Nuclear Safety, Monitoring and Traceability, Italian National Agency for New Technologies, Energy and Sustainable Economic Development
• Ms Xyoli Perez Campos, Director of the International Monitoring System Division of the CTBTO
Master of Ceremonies: Ms Monika Jones, Freelance TV Anchor
Interpretation in all UN languages will be available for in-person attendance in the Festsaal.
The objective of this high-level round table is to highlight the role of the French-speaking community and of multilingualism as facilitators of international discussions. Ambassadors well versed in such exchanges will share their perspectives on the subject.
Interpretation in all UN languages will be available for in-person attendance in the Festsaal.
The European Union (EU) remains a staunch supporter of the CTBT and its Organisation, both politically and financially. Promoting the universalisation and entry into force of the CTBT is a top priority for the EU, in line with its non-proliferation and disarmament policies and objectives.
In the framework of the EU Strategy against the Proliferation of Weapons of Mass Destruction, the EU has provided voluntary contributions to the CTBTO through eight EU Council Decisions to date, amounting to almost 30 million euros, in order to strengthen its monitoring and verification capabilities. The extensive support provided by the European Union for the Treaty and its verification regime will be the focus of this panel discussion. In this regard, the benefits of the technical and capacity-building programmes funded by the European Union will be presented by those who benefit from the implementation and outputs of the projects.
To be updated
This panel discusses the mutual benefits between the CTBTO and Arabic-speaking countries. For many local experts engaged in the operation of IMS stations and serving at National Data Centres (NDCs), having training and capacity-building materials in Arabic is beneficial. An excellent example was set with the NDC training for Arabic-speaking NDCs held in January 2020 at the Red Sea in Egypt. The NDCs in Arabic-speaking countries actively monitor for nuclear explosions and prepare for On-site Inspections. Regional experts have made many significant contributions in using IMS data and advancing analysis methods for events in the region, such as the investigation of the devastating atmospheric explosion in Beirut in 2020 or the study of natural disasters such as the severe earthquakes in the Türkiye - Syria region in February 2023. Civil applications are also important for the region; for example, IMS data are used for tsunami early warning in the Mediterranean. The organisation of the Integrated Field Experiment 2014 (IFE14) in Jordan provided important knowledge for preparing On-site Inspections, which will be relevant for future experiments.
The panel will be conducted in Arabic, giving an overview of recent experiences in the use of IMS data for verification purposes. The potential benefits for civil and scientific applications will also be discussed, in particular how the IMS technologies (waveform and radionuclide networks) can complement other observation methods. The discussion will also cover the importance of regional cooperation between NDCs.
English title: Regional capacity building for IMS station managers and operators in French-speaking countries in Africa
This panel discusses operational strategies and challenges faced by managers and operators of International Monitoring System stations in French-speaking countries in Africa. The objective is to strengthen the capabilities of operators in order to overcome the challenges encountered in their countries.
Admittedly, the hurdles confronted by station operators and managers may differ from one country to another or even between station locations. Yet there may be common challenges for which exchanges of experience and synergies could offer solutions. One of the questions that will be addressed by the panel is how to initiate new operational strategies to improve the activities of station managers and operators in the region. Such strategies could focus on various elements, for example digital connectivity, energy supply, equipment transportation or the security context. The panel could also investigate synergies between National Data Centres (NDCs) in the region as a means to strengthen the capabilities of stakeholders and the operation of IMS stations.
The panel will be conducted in French, with experienced station managers and operators from French-speaking countries in Africa, some of them also staff of NDCs.
Disasters affect populations on a global scale. The Latin American and Caribbean (LAC) region is frequently impacted by large earthquakes, tsunamis, volcanic eruptions, and extreme weather events (droughts and tropical storms). Rising awareness of and concern about the need for disaster risk mitigation highlight the necessity of operating multi-technology platforms to better understand the hazards posed by natural phenomena.
Currently, IMS data are provided to 20 Tsunami Warning Centres (TWCs) in 19 countries that have signed a Tsunami Warning Agreement with the CTBTO, including two TWCs in the LAC region (Chile and Honduras). IMS data are delivered reliably, timely and securely, and have contributed to the work of TWCs since 2005. As this civil application evolves, progress is also being made on scientific applications relevant to earthquake monitoring and to volcanic eruption monitoring for aviation and maritime safety.
The panel will be conducted in Spanish, giving an overview of recent experiences in the use of IMS data by TWCs in the LAC region and Spain. The potential benefits for civil and scientific applications will be discussed, in particular how the IMS technologies (waveform and radionuclide networks) can complement other observation methods. The dialogue will also assess how regional cooperation among National Data Centres (NDCs) and with disaster warning centres may offer perspectives to further mitigate the risk of disasters.
In shallow-water environments, long range propagation proceeds by repeated reflections from the surface and the bottom of the channel; this is the case for underwater sound over a wide spectral range, whose very low frequencies may propagate over large distances without significant losses. In this paper, a 3-D benchmark model of a fluid wedge over an elastic bottom is applied to explain low frequency long range propagation in shallow water overlying a sloping elastic-type seabed, such as a sedimentary rock exhibiting rigidity. The modeling approach is based on the modified method of generalized ray, which furnishes a complete acoustic signal comprising contributions from all of the wave motions typical for the model (not only from the source signal and the regularly [specularly] reflected waves but also from the refracted [lateral] waves and the pseudo-Rayleigh and Scholte interface waves), received in order of their arrivals at a large distance from the point source. When the source emits signals of low frequency content, the contribution from the Scholte waves becomes dominant at large distances. Hence, low frequency long range propagation in a shallow-water wedge (coastal wedge) may indeed be governed by the Scholte waves.
Long term observations using hydrophones installed by the CTBTO in the Indian Ocean, suggest that noise levels increased from 2002 to around 2012. Since then they have been decreasing. While the increase in noise levels was related to growth in ship traffic, the reasons for the decrease are not known. This paper investigates the reasons for the decrease in two steps. The first step builds an acoustic model that uses shipping densities from the Automatic Identification System (AIS) for several years to simulate ship positions in the Indian Ocean. The ship positions are then used to model the low frequency noise levels due to cargo and tanker traffic. The second step further incorporates other inputs in the model, such as potential changes in ship velocities, and observations of long term oceanographic changes to investigate the drift in noise levels. Preliminary results suggest that a potential decrease in ship speeds is the reason for the reduction in noise levels.
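As a rough illustration of the first modelling step described above, the sketch below samples ship positions from a gridded AIS density map and sums their low frequency contributions at a single hydrophone using a very simple spreading law. The grid, source level, spreading assumptions and coordinates are placeholders, not the parameters used in the study.

```python
# Minimal sketch (not the authors' model): sample ship positions from a gridded
# AIS density map and sum their low-frequency contributions at a hydrophone
# using a simple spherical-then-cylindrical spreading law.
import numpy as np

rng = np.random.default_rng(42)

def sample_ship_positions(density, lon_edges, lat_edges, n_ships):
    """Draw n_ships (lon, lat) pairs with probability proportional to AIS density."""
    p = density.ravel() / density.sum()
    cells = rng.choice(density.size, size=n_ships, p=p)
    iy, ix = np.unravel_index(cells, density.shape)      # rows = latitude bins
    lons = rng.uniform(lon_edges[ix], lon_edges[ix + 1])
    lats = rng.uniform(lat_edges[iy], lat_edges[iy + 1])
    return lons, lats

def received_level_db(source_level_db, range_km):
    """Very simplified transmission loss: spherical to 1 km, cylindrical beyond."""
    r_m = np.maximum(range_km, 1e-3) * 1000.0
    tl = np.where(r_m <= 1000.0, 20 * np.log10(r_m), 60.0 + 10 * np.log10(r_m / 1000.0))
    return source_level_db - tl

def great_circle_km(lon1, lat1, lon2, lat2):
    lon1, lat1, lon2, lat2 = map(np.radians, (lon1, lat1, lon2, lat2))
    d = np.arccos(np.clip(np.sin(lat1) * np.sin(lat2) +
                          np.cos(lat1) * np.cos(lat2) * np.cos(lon2 - lon1), -1, 1))
    return 6371.0 * d

# Hypothetical inputs: a coarse density grid over part of the Indian Ocean and a
# hydrophone location (placeholder values, not real station coordinates).
lon_edges = np.linspace(60, 100, 41)
lat_edges = np.linspace(-40, 0, 41)
density = rng.random((40, 40))              # stand-in for an AIS ship-density map
hydro_lon, hydro_lat = 72.5, -7.5

lons, lats = sample_ship_positions(density, lon_edges, lat_edges, n_ships=500)
ranges = great_circle_km(lons, lats, hydro_lon, hydro_lat)
rl = received_level_db(source_level_db=185.0, range_km=ranges)   # nominal cargo/tanker level
total_db = 10 * np.log10(np.sum(10 ** (rl / 10)))                # incoherent power sum
print(f"Modelled shipping noise level: {total_db:.1f} dB (toy units)")
```

In the study itself this kind of forward model would be repeated year by year, with ship speed and oceanographic changes added in the second step, to compare the modelled trend with the observed drift in noise levels.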
The eastern Mediterranean Basin has always been the main route of trade and cultural exchange between the Old World continents. Centred between the three continents, the region is often referred to as the cradle of civilization. At the same time, it has suffered several catastrophic events, including earthquakes and tsunamis. Because of its densely populated coasts, where damage and casualties were recorded, the region provides one of the longest historical tsunami and earthquake catalogues in the world. In this paper, we investigate coastal deposits to find high energy deposits and correlate them with the historical tsunami catalogue, providing insights into the size and impact of such events. The results showed a strong impact of large tsunamis from the Hellenic arc on the Egyptian coasts but also left open questions about possible unrecorded tsunamis in our historical catalogues.
Anthropogenic noise pollution may mask natural sounds, which are fundamental to the survival and reproduction of wildlife, especially marine cetaceans, as they are highly dependent on underwater sounds for basic life functions. In the 21st century, shipping in the ocean has increased significantly and causes low frequency (10–100 Hz) noise which affects or hinders the vital communication of large baleen whales at 15 to 30 Hz. Noise in the ocean has been monitored as a byproduct by International Monitoring System stations of the CTBTO. However, elsewhere, for example at ocean gateways or in marginal seas, little is known about the soundscape. Here, we report long term and short term low frequency noise measurements from Gibraltar, the gateway into the Mediterranean Sea, and from the Pelagos Sanctuary, a Marine Protected Area in the Ligurian Sea, Mediterranean. Ambient noise is derived from calibrated moored ocean-bottom hydrophones deployed for earthquake monitoring and seismic campaign work. Observations are compared to noise levels in the range of 1 to 100 Hz as revealed at CTBTO monitoring sites in the Atlantic, Indian and Pacific Oceans. Most notably, noise levels in the Mediterranean and near Gibraltar are significantly higher, by up to 20 dB at 40 Hz, when compared to the open oceans.
The Arctic Ocean is rapidly warming. The hydroacoustic environment will be impacted by the changing thermohaline structure, increased marine traffic, changes in sea ice coverage, and possible increases in microseism/storm noise. This will inevitably lead to obsolescence for today’s ocean acoustic models. The only sophisticated way to make predictions is using decadal to centennial integrations of fully coupled Earth system models (ESMs). At the Los Alamos National Laboratory we are working to create the first version of an ESM, the Department of Energy’s Energy Exascale Earth System Model (E3SM), capable of driving an acoustics model in a rapidly evolving Arctic Ocean. Here we analyse existing E3SM simulations to investigate how well we currently capture water properties for the upper 1000 m of the Arctic water column, including changes that could impact acoustic wave propagation. We present preliminary results of model validation against temperature and salinity observations from ice-tethered ocean profilers. The goal of this effort is to provide boundary ocean conditions to the acoustic model, to enable quantification of the ocean acoustic implications of climate change as well as to create a climate aware atlas of global acoustic noise and propagation adjustments for non-proliferation signal detection.
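A minimal sketch of the kind of profile comparison mentioned above is given below: a modelled and an observed temperature profile are interpolated onto a common depth grid over the upper 1000 m, and bias and RMSE are computed. The profiles are invented placeholders, not E3SM output or ice-tethered profiler data.

```python
# Minimal validation sketch (assumed inputs, not the E3SM workflow itself):
# compare modelled and observed upper-ocean profiles on a common depth grid.
import numpy as np

def profile_stats(depth_obs, value_obs, depth_mod, value_mod, z_max=1000.0, dz=10.0):
    """Interpolate observation and model to a common grid and return (bias, rmse)."""
    z = np.arange(0.0, z_max + dz, dz)
    obs = np.interp(z, depth_obs, value_obs)
    mod = np.interp(z, depth_mod, value_mod)
    diff = mod - obs
    return diff.mean(), np.sqrt(np.mean(diff ** 2))

# Hypothetical ice-tethered profiler and model output (placeholder values).
depth_obs = np.array([5, 50, 100, 250, 500, 750, 1000], dtype=float)
temp_obs  = np.array([-1.6, -1.4, -0.8, 0.6, 0.9, 0.7, 0.4])
depth_mod = np.linspace(0, 1000, 51)
temp_mod  = -1.7 + 2.4 / (1.0 + np.exp(-(depth_mod - 200.0) / 80.0))  # toy profile

bias, rmse = profile_stats(depth_obs, temp_obs, depth_mod, temp_mod)
print(f"Temperature bias: {bias:+.2f} degC, RMSE: {rmse:.2f} degC over the upper 1000 m")
```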
Seismic sensors have traditionally been largely restricted to on-land installations, yet the oceans comprise roughly 70% of the Earth's surface. Coupled with heterogeneous distribution of seismicity, this results in many regions being poorly sampled for Earth model development, and poorly monitored for detection of natural and anthropogenic seismic sources. To mitigate this deficiency, extending our seismic measurements and observations into the oceans is an important future direction in seismology. Sea floor seismic sensing may mitigate many monitoring issues from a geometric perspective, but the ambient noise level recorded by ocean-bottom seismometers (OBS) is in general much higher than the ambient noise levels on land. Details of the sea floor noise regime are not well described, and to exploit the broadest regions for potential instrumentation, we must characterize the noise environment to aid in selection of sites for permanent sensors. We present a preliminary comparison of seafloor seismic noise at several temporary sea floor deployments, located in different parts of the global ocean. We detail how noise varies as a function of oceanographic, meteorological, and other dynamic variables at these sites. Our aim is to eventually develop a worldwide dynamic sea floor seismic noise model to guide us in optimizing future deployments for seismic monitoring.
Recently, efforts have been made to apply the Boundary Element Method (BEM) to propagation problems in shallow waters, motivated by its neat mathematical formulation and the fact that, unlike other 3-D methods, only 2-D boundary integration is required. However, its chances of becoming a real competitor in the area are undermined by the large sizes of the discretized computational boundaries and by the inhomogeneities in the wave equation resulting from the non-uniform sound speeds that any realistic ocean description requires. Here, we present and discuss results concerning a BEM formulation aimed at predicting sound propagation in an ocean environment of arbitrary bathymetry. A marching scheme is considered so that only a fraction of the total computational domain (the bathymetry) needs to be handled at once, thus providing an affordable way to tackle long range propagation problems. The parameters required for an appropriate domain partition depend crucially on the features of the bathymetry, involving in general an irregular step size. Additionally, a set of theoretical integral equations is presented that takes into account the sound speed inhomogeneities by using surface and volume potentials. Finally, the numerical schemes obtained from this theoretical formulation are shown, as well as some implementation details.
Crustal attenuation structures obtained at high frequencies (>1 Hz) are important for seismic risk assessment and geodynamics studies in stable continents. However, it is difficult to infer attenuation in low seismicity regions using body and surface waves. In this study, we explore the potential of using seismic T phases to constrain crustal attenuation. We analysed the characteristics of T waves recorded on the seismic array deployed in southern Africa. The converted station-side T-P and T-S phases were identified by analysing the waveform, travel time, polarization, and frequency-wavenumber features. The distinct differences in polarization and slowness are used to quantify contributions from T-P and T-S conversions. The inverted frequency dependent Qp and Qs (attenuation factors of local P and S waves, respectively) in southern Africa are found to be 204f^1.48 and 685f^0.53, respectively. Our method could be extended to infer crustal attenuation features near the coasts of other continents.
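For readers less familiar with power-law Q models, the reported relations can be written out explicitly; the coefficients and exponents are taken from the abstract, while the 5 Hz evaluation is added here purely for illustration.

```latex
Q_P(f) = 204\, f^{1.48}, \qquad Q_S(f) = 685\, f^{0.53}
% Examples: at f = 1\ \mathrm{Hz},\ Q_P = 204,\ Q_S = 685;
% at f = 5\ \mathrm{Hz},\ Q_P \approx 204 \times 5^{1.48} \approx 2.2\times 10^{3},
%                         Q_S \approx 685 \times 5^{0.53} \approx 1.6\times 10^{3}.
```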
International Monitoring System hydrophone stations are useful for the monitoring of submarine volcanic activity, especially in the western Pacific Ocean. Therefore, we started to routinely analyse the signals of the triplet H11S, located off the coast of Wake Island, from July 2022 onwards. The SeedLink server of the Incorporated Research Institutions for Seismology (IRIS) provides the data; the SeisComP5 software is used for access and storage in daily mseed files. We process one-day hydrophone data as follows: (1) removal of the instrument response; (2) band-pass filtering of the waveforms (4-8 Hz); (3) semblance analysis to determine the incoming direction and apparent velocity across the triplet per 10-s time window, using intervals of 1° and 0.002 km/s (1.45 to 1.55 km/s), respectively; (4) posting of the results on our internal website. The obtained maximum semblance values typically vary from 0.5 to 0.9, and only values ≥ 0.7 are interpreted as significant.
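As a rough illustration of steps (2) and (3), the sketch below band-pass filters three traces and grid-searches back-azimuth and apparent velocity for the plane wave that maximizes semblance across a triplet. It is only a minimal stand-in for the described processing: the sampling rate, triplet geometry and synthetic data are hypothetical placeholders, and the real analysis runs on calibrated H11S records.

```python
# Minimal sketch of band-pass filtering plus a semblance grid search over
# back-azimuth (1 deg steps) and apparent velocity (1.45-1.55 km/s, 0.002 km/s).
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250.0                                   # assumed sampling rate (Hz)
win = int(10 * fs)                           # one 10-s analysis window

def bandpass(x, fs, fmin=4.0, fmax=8.0, order=4):
    b, a = butter(order, [fmin / (fs / 2), fmax / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def semblance(traces, xy_m, fs, baz_deg, v_kms):
    """Semblance of a plane wave with the given back-azimuth and apparent velocity."""
    az = np.radians(baz_deg)
    d = np.array([-np.sin(az), -np.cos(az)])       # propagation direction (east, north)
    delays = xy_m @ d / (v_kms * 1000.0)           # seconds relative to the array origin
    shifts = np.round(delays * fs).astype(int)
    rel = shifts - shifts.min()
    n = min(len(t) for t in traces) - rel.max() - 1
    aligned = np.stack([t[r: r + n] for t, r in zip(traces, rel)])
    num = np.sum(np.sum(aligned, axis=0) ** 2)
    den = aligned.shape[0] * np.sum(aligned ** 2)
    return num / den if den > 0 else 0.0

# Placeholder triplet geometry (metres east/north) and synthetic band-passed data.
xy_m = np.array([[0.0, 0.0], [2000.0, 0.0], [1000.0, 1732.0]])
rng = np.random.default_rng(0)
traces = [bandpass(rng.standard_normal(win), fs) for _ in range(3)]

best = max(((semblance(traces, xy_m, fs, baz, v), baz, v)
            for baz in range(0, 360)
            for v in np.arange(1.45, 1.551, 0.002)), key=lambda t: t[0])
print(f"max semblance {best[0]:.2f} at back-azimuth {best[1]} deg, v = {best[2]:.3f} km/s")
```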
Surface ocean currents are important maritime weather parameters because they influence both human activity and the global climate. In Indonesia, real time observations of surface ocean currents are currently made using HF radar installed in two locations, one of which is the Flores Sea. Because observational data are still scarce, efforts to provide surface ocean current data are required. One of the techniques used is the application of Himawari-8 geostationary satellite data on Sea Surface Temperature (SST). Particle Image Velocimetry (PIV), which is based on the cross-correlation technique, was used to calculate surface ocean currents from Himawari-8 SST data. The short term analysis of Himawari-8 SST in the Flores Sea during the northern summer shows a high SST, with values reaching 31°C, especially around the north coast of Flores Island. Meanwhile, the south coast of Flores Island shows a low SST, with values of less than 26°C. The direction of the surface ocean currents in the Flores Sea varies, with the dominant direction towards the west, consistent with the synoptic wind direction during the Australian monsoon. Validation with HF radar data shows a similar pattern of ocean currents, with a correlation value of up to 0.6.
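The cross-correlation step behind PIV can be sketched as follows: find the displacement that maximizes the 2-D cross-correlation between an SST sub-window at one time and the same window a short time later, then convert that displacement into a velocity. This is a minimal illustration with a synthetic warm patch, not the Himawari-8 processing chain; the grid spacing and time step are assumptions.

```python
# Minimal PIV-style sketch: estimate a surface-current vector from the
# displacement that maximizes the cross-correlation of two SST sub-windows.
import numpy as np
from scipy.signal import correlate2d

def piv_velocity(sst_t0, sst_t1, dt_s, pixel_km):
    """Return (u, v) in m/s from the peak of the 2-D cross-correlation."""
    a = sst_t0 - sst_t0.mean()
    b = sst_t1 - sst_t1.mean()
    corr = correlate2d(b, a, mode="full")
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    dy -= a.shape[0] - 1                      # displacement of field b relative to a
    dx -= a.shape[1] - 1                      # (in pixels)
    u = dx * pixel_km * 1000.0 / dt_s         # eastward component (m/s)
    v = -dy * pixel_km * 1000.0 / dt_s        # northward (rows increase southward)
    return u, v

# Synthetic example: a warm patch advected 2 pixels east and 1 pixel north in 1 h.
y, x = np.mgrid[0:64, 0:64]

def patch(cx, cy):
    return 28.0 + 3.0 * np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / 50.0)

sst_t0 = patch(30, 32)
sst_t1 = patch(32, 31)                        # +2 px east, -1 px in row index (north)
u, v = piv_velocity(sst_t0, sst_t1, dt_s=3600.0, pixel_km=2.0)
print(f"u = {u:.2f} m/s east, v = {v:.2f} m/s north")
```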
We present a recent example for the civil and scientific application of International Monitoring System (IMS) hydroacoustic data. On 24 April 2022, IMS stations detected a cluster of impulsive, seismoacoustic events in the South Sandwich Arc, a volcanically active chain of remote islands and seamounts in the southern Atlantic Ocean. Preliminary results from automated and interactive analysis of hydrophone and T phase station data suggest that these events, which occurred over a period of less than two hours and which were preceded by at least two hour-long episodes of low level tremor, are likely volcanic in nature, and originated at or near one of several recently discovered seamounts offshore Candlemas and Saunders Island. Our findings are consistent with waveform data gathered by non-IMS stations as well as satellite observations and further highlight the potential of the IMS hydroacoustic network for detecting and studying submarine volcanic activity in some of the most remote regions on the planet.
In support of the Comprehensive Nuclear-Test-Ban Treaty, the International Monitoring System (IMS) has implemented a set of deep water, open ocean hydroacoustic stations for the monitoring (detection and localization) of any nuclear tests. As acoustic propagation satisfies the acoustic wave equation, sound is subject to three-dimensional effects (refraction, diffraction, reflections) in the presence of horizontal gradients due to bathymetry, oceanography or the presence of continents. The current processing system ignores these effects (using in-plane propagation for all acoustic paths) and has large error bars in azimuth for event association. In this work, a set of acoustic propagation codes is being used to integrate 3-D propagation into the automated event localization algorithm as well as into the IMS analyst workflow. A summary of the acoustic models and demonstrations of observed 3-D phenomena for large underwater events will be presented, as well as a programme plan to update the IMS processing approach.
Time-space variations of infrasound source locations from 2019 to 2021 were studied using a combination of two local arrays in the Lützow-Holm Bay (LHB), Antarctica. The local arrays deployed at two outcrops clearly detected temporal variations in frequency content as well as in propagation directions during the three years. A large number of infrasound sources were detected, many of them located in N-NW directions from the arrays. These source events were generated between the Southern Indian Ocean and the northern part of LHB, with periods of a few seconds; that is, microbaroms from oceanic swells. From the austral summer to the fall season, many infrasound sources were determined to lie in a north-eastward direction. These sources might be related to the effect of katabatic winds in the continental coastal area. Furthermore, several sporadic infrasound events during winter seasons had predominant frequency content of a few Hz, clearly higher than microbaroms. On the basis of a comparison with sea-ice and glacier distributions from MODIS satellite images, these high frequency sources were considered to be cryoseismic signals associated with cryosphere dynamics. In this regard, infrasound could be a useful tool for monitoring the surface environment in the context of climate change in the coastal area of Antarctica.
Aerodynamic infrasonic signals generated by large wind turbines can be detected by highly sensitive micro-barometers, showing spectral peaks at the blade passing harmonics that are above the background noise level. As infrasound is one of the four verification technologies for compliance with the Comprehensive Nuclear-Test-Ban Treaty, decreases in the detection capability of dedicated infrasound arrays have to be avoided. For BGR, preventing such decreases is therefore particularly important for IS26 and IS27, which are International Monitoring System infrasound arrays and have to meet stringent specifications with respect to their infrasonic ambient noise levels.
In 2004, infrasonic signatures of a single horizontal-axis wind turbine were measured during a field experiment. As one of the results, a minimum distance to wind turbines for undisturbed recording conditions at infrasound array IS26 was estimated based on numerical modelling validated with this dataset. Nevertheless, to broaden the dataset, further infrasound measurements at two wind parks with modern large wind turbines have recently been carried out. Here, various instruments (micro-barometers, microphones, pressure sensors) have been deployed in a comparative manner. An overview of these campaigns is given, followed by first results of analysis and interpretation.
Simulations of a 1 to 2 km comet striking Earth on solid ground and on ocean waters have been conducted. The models include hydrostatic equilibrium profiles of temperature, pressure, and density for the atmosphere and ocean, in order to accurately predict impact consequences. Phase changes, such as melting and vaporization, combined with the material’s tensile, shear, and compressive strength responses under high temperature and high strain conditions, along with anisotropic responses, are considered in greater detail than in previous work. Our simulations illustrate the erosion of the comet and the creation of a dusty tail, the breakup of the comet into fragments, the erosion of some of the fragments, and the survival of others. The reentry of fragments into the stratosphere and their flattening is also demonstrated. For the first time, we present results of an integrated simulation of water and ground impact, the resulting crater formation, fireball evolution and cloud generation, and the transport of dust and debris to hundreds of kilometers from the impact site. We illustrate the seismoacoustic signature of an asteroid impacting Earth. Computations were conducted on LLNL HPC systems using the SME++ framework developed over the last 7 years.
The Hunga Tonga - Hunga Ha’apai (HTHH) eruption of 15 January 2022 was an exceptional event in terms of the period, magnitude, and duration of propagation of the atmospheric waves it generated, which circled the globe multiple times. Even though several volcanic eruptions during the roughly 150-year era of scientific instrumentation generated notable barometric disturbances, the HTHH eruption is comparable only to the Krakatoa eruption of 1883 in the magnitude of the atmospheric pressure waves it generated. The very energetic Mt Pinatubo eruption of 1991 did not produce pressure waves of the same period and magnitude as the Krakatoa or HTHH eruptions. An analysis of the timing of the multiple passes at barometric stations is reported in Symons (1888) for Krakatoa. Since the HTHH event gave rise to the only pressure wave to have circled the Earth’s atmosphere multiple times in the last 139 years, it is of interest to perform similar timing statistics on the multiple passages at the stations that recorded them. A review of the Krakatoa analysis and a comparison with HTHH are presented, with possible implications for changes in the global state of the atmosphere during the interval between the two events.
The entry of meteoroids and meteorites into the Earth's atmosphere is a powerful source of infrasonic waves. The generated infrasound can be recorded at the ground and, using an array of sensors, characterized in terms of wave parameters indicative of the source and its position. This study presents an integrated analysis of ~15 small fireball events, based both on images from the all-sky cameras of the PRISMA meteor surveillance network and on the infrasonic data recorded at two infrasonic arrays deployed by the University of Florence in Northern and Central Italy. Thanks to the analysis of the light signals recorded by multiple all-sky cameras, the detected events were physically characterized in terms of time of occurrence, observed optical trajectory and pre-atmospheric speed, mass, dimensions and kinetic energy. Using the occurrence time provided by the all-sky cameras, the recorded signals, analysed through array processing, are back-propagated using ray-tracing techniques to reconstruct the infrasound propagation, locate the fireball event and reconstruct its trajectory.
Finally, the optically derived parameters are compared with the infrasonic amplitude retrieved at the source and with the frequency content of the recorded infrasonic signals, to investigate the potential of using infrasound to reconstruct the energy of the meteoroid events.
The earthquake swarm accompanying the January 2022 Hunga Tonga-Hunga Ha’apai (HTHH) volcanic eruption includes many post-eruptive moderate magnitude seismic events and presents a unique opportunity to use remote monitoring methods to characterize this seismic activity and compare it to other historical caldera-forming eruptions. We compute improved epicentroid locations, magnitudes, and regional moment tensors of seismic events from this earthquake swarm using regional to teleseismic surface wave cross correlation and waveform modeling. Precise relative locations of 91 seismic events, derived from 59047 intermediate period Rayleigh and Love wave cross correlation measurements, collapse into a small area surrounding the volcano and exhibit a time dependent southeastern migration. Regional moment tensors and observed waveforms suggest that these events share a similar mechanism and exhibit a strong positive CLVD component. Precise relative magnitudes agree with regional moment tensor MW estimates, while also showing that event sizes and frequency increased during the days after the eruption, followed by a period of several weeks of less frequent seismicity of a similar size. Our analysis of the HTHH eruption sequence demonstrates the potential value of teleseismic surface wave cross correlation and waveform modeling methods for the detailed analysis of remote volcanic eruption sequences.
The hydroacoustic International Monitoring System (IMS) network was designed to detect underwater nuclear explosions. Two types of stations belong to this network. Six of them record signals with hydrophone triplets placed in the Sound Fixing and Ranging (SOFAR) channel. The remaining five are T phase stations: seismic stations which detect hydroacoustic signals converted to seismic waves at a steep shore slope. T phase stations measure arrival time but, unlike hydrophone stations, do not provide other detection parameters (e.g. azimuth estimates). Analysts consider correlation with signals at other stations or observations for similar events to include T phase station detections in event solutions. Hydroacoustic signals generated by large magnitude seismic events are also observed at coastal seismic or infrasound IMS stations, which are configured to measure detection parameters such as azimuth or slowness. Correct identification of such signals will help analysts to include them in International Data Centre bulletins. Investigation of these additional hydroacoustic data will also allow an assessment of whether detection parameters can increase confidence in the correct association of T phase station detections.
Wind turbines emit vibrations due to the rotation of the blades and the movement of the tower. Vibrations radiated from wind turbines are known to interfere with operational seismoacoustic monitoring of natural and induced seismicity. Additionally, the contribution of such vibrations to the ambient seismoacoustic noise field can significantly hinder the performance of sensitive optical systems. One such example is the Einstein Telescope, a subsurface gravitational-wave detector currently under development. The sensitivity of the Einstein Telescope is strongly affected by the ambient seismoacoustic noise field. To characterize the noise generated by wind turbines in the region of interest, we deployed five seismic mini-arrays west of the Aachen wind park and recorded the ambient seismic field over the course of 35 days. In addition to analysing the power spectral density at each station, we employ array-processing techniques to identify and characterize various sources in the region. Our analysis indicates that the amplitude of distinct spectral peaks decreases as a function of distance from the wind park. Additionally, we find that the amplitude of the entire spectrum, and specifically of these spectral peaks, correlates with wind speed.
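A minimal sketch of the spectral-peak versus wind-speed analysis described above is given below: power spectral densities are estimated with Welch's method, the amplitude at one assumed blade-passing harmonic is extracted, and it is correlated with wind speed. The sampling rate, peak frequency and data are synthetic placeholders, not values from the Aachen deployment.

```python
# Minimal sketch (synthetic data): PSD estimation, extraction of the amplitude at
# an assumed blade-passing harmonic, and correlation with wind speed.
import numpy as np
from scipy.signal import welch
from scipy.stats import pearsonr

fs = 100.0                       # assumed sampling rate (Hz)
f_peak = 1.2                     # hypothetical blade-passing harmonic (Hz)
rng = np.random.default_rng(1)

wind_speed = rng.uniform(2.0, 12.0, size=48)          # stand-in wind speeds (m/s)
peak_amplitude = []
for w in wind_speed:
    # Ten minutes of synthetic data: background noise plus a harmonic whose
    # amplitude grows with wind speed (the relationship we then try to recover).
    t = np.arange(0, 600, 1 / fs)
    x = rng.standard_normal(t.size) + 0.05 * w * np.sin(2 * np.pi * f_peak * t)
    f, pxx = welch(x, fs=fs, nperseg=4096)
    peak_amplitude.append(pxx[np.argmin(np.abs(f - f_peak))])

r, p = pearsonr(wind_speed, peak_amplitude)
print(f"Correlation of wind speed with the {f_peak} Hz spectral peak: r = {r:.2f} (p = {p:.1e})")
```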
The PUFF model is a volcanic ash dispersion model used in Indonesia to help predict the distribution of volcanic ash for aviation safety purposes. This model uses a Lagrangian method taking into account wind, diffusion, and gravity parameters. The eruption of the Hunga Tonga volcano on 15 January 2022, with a volcanic explosivity index (VEI) of 5, exceeding those of Galunggung in 1982 and Kelud in 2014, is an interesting phenomenon to study with the PUFF model because of the relatively large distribution of the erupted ash. Volcanic eruptions with VEI >= 3 have different characteristics from those with VEI < 3, which often occur in Indonesia. The simulation is carried out by changing the value of the diffusion parameter in the model and comparing the results with satellite imagery to find a suitable value for a large type of eruption. It was found that increasing the value of the diffusion coefficient can increase the accuracy of the modelled distribution of volcanic ash. The accuracy of the volcanic ash distribution area is very significant for flight safety.
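For readers unfamiliar with Lagrangian ash dispersion, the sketch below advances a cloud of particles under the three ingredients named above: wind advection, random-walk diffusion, and gravitational settling. It is a toy illustration in the spirit of a PUFF-type model, not the PUFF code itself; the wind field, diffusion coefficient and settling velocity are invented for the example.

```python
# Toy Lagrangian dispersion step: advection + random-walk diffusion + settling.
import numpy as np

rng = np.random.default_rng(7)

def step(particles, wind_uvw, dt, diffusion, settling):
    """Advance particle positions (x, y, z in metres) by one time step."""
    advection = wind_uvw * dt
    random_walk = rng.normal(scale=np.sqrt(2 * diffusion * dt), size=particles.shape)
    gravity = np.array([0.0, 0.0, -settling * dt])
    return particles + advection + random_walk + gravity

# Release 10,000 particles from a 15-20 km high plume and integrate for 6 hours.
n = 10_000
particles = np.column_stack([np.zeros(n), np.zeros(n),
                             rng.uniform(15_000, 20_000, n)])
wind = np.array([15.0, -3.0, 0.0])           # illustrative wind (m/s, east/north/up)
for _ in range(6 * 60):                      # 6 h with 60-s steps
    particles = step(particles, wind, dt=60.0, diffusion=500.0, settling=0.02)

extent_km = (particles[:, :2].max(axis=0) - particles[:, :2].min(axis=0)) / 1000.0
print(f"Horizontal cloud extent after 6 h: {extent_km[0]:.0f} km x {extent_km[1]:.0f} km")
```

In the study, the free parameter being tuned is the diffusion coefficient (here `diffusion`), with the simulated cloud compared against satellite imagery for each trial value.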
The eruption of the Hunga-Tonga-Hunga-Ha’apai volcano on 15 January 2022 was the largest recorded since the eruption of Krakatoa in 1883. The eruption triggered tsunami waves of up to 15 m which struck the west coast of Tongatapu, ‘Eua and Ha’apai.
In this work we analyse data from this event. With a magnitude of mb 4.2 at 04:14:59 UTC, the eruption was detected by three International Monitoring System (IMS) technologies. This work includes analysis of data from the HA11 and HA3 hydroacoustic stations, integrating the location of the event with the seismic and infrasound data of stations close to the event. In addition, we analyse signals related to the Mauna Loa eruption from infrasound, seismic and hydroacoustic IMS stations, which detected an event located in Hawaii, USA, on 28 November 2022 at 08:56 UTC.
Understanding and predicting seismoacoustic waveforms may be a real conundrum, depending on the complexity of the source term but also on the geological features a wavefield is propagating through. Transmission of elastic energy (i.e. coupling) between the solid Earth and fluid layers (atmosphere, ocean) can bring a wealth of information about the characteristics of a source, but may be highly dependent on the geological setting (topography). To overcome these challenges, two numerical tools are presented that can deal efficiently with the propagation of elastic wavefields in complex media. The first is an extension of the now classical spectral-element method (SEM) to the so-called hybridizable discontinuous Galerkin (HDG) model. The HDG formulation makes it possible to mitigate meshing constraints inherent to the SEM at a reasonable numerical cost by introducing hybrid and nonconforming meshes. The second tool, also implemented in the HDG software, goes further in the sense that, without any loss of numerical convergence, material interfaces need not fit element boundaries, which is the classical paradigm in finite element methods; these are termed unfitted or cut (finite) elements. Both tools are applied to simulate the accidental Beirut explosion and help to understand seismic conversions in the near field of the source and receivers.
Underwater volcanic eruptions are one source of acoustic signals. The climactic phase of such underwater eruptions results in an ash cloud being ejected into the stratosphere, generating infrasound signals in the atmosphere while at the same time generating acoustic signals underwater. The active Hunga Tonga-Hunga Ha'apai (HTHH) volcano, in one of the Tonga island groups, erupted violently on 15 January 2022, triggering tsunamis. International Monitoring System (IMS) infrasound and hydroacoustic stations recorded the event from foreshocks to aftershocks. Submarine and subaerial components were identified in the study of acoustic signals obtained from these IMS stations. A complex source sequence was detected by the stations analysed during the eruption phase. DTK-GPMCC was used to study the wave properties in time-frequency bands at low frequencies (0.001-1 Hz). The wave parameters of back-azimuth, frequency, root-mean-square amplitude, and apparent velocity were calculated. The results showed acoustic-gravity, Lamb wave and infrasonic features. This study is an indication of the importance of IMS stations for the understanding of submarine volcanic eruptions.
The 15 January 2022 eruption of the Hunga volcano (Tonga) generated a rich spectrum of waves, some of which achieved global propagation. Among the numerous platforms monitoring the event, two stratospheric balloons flying over the tropical Pacific provided unique observations of infrasonic wave arrivals, detecting five complete revolutions. Combined with ground measurements from the infrasound network of the International Monitoring System, balloon-borne observations may provide additional constraints on the scenario of the eruption, as suggested by the correlation between bursts of acoustic wave emission and peaks of maximum volcanic plume top height. Balloon records also highlight previously unobserved long range propagation of infrasound modes and their dispersion patterns. A comparison between ground-based and balloon-based measurements emphasizes the superior signal to noise ratios onboard the balloons and further demonstrates their potential for infrasound studies.
There is a heated debate in the scientific community about the possible effects of global-warming-driven changes at the planet's surface on the frequency and magnitude of earthquakes and of the tsunamis they trigger. Some studies point to seasonal modulations of deep slow slip and earthquakes in the main thrust of the Himalayas, or to increased risks at coastlines. One aspect that everyone seems to agree on is the instantaneity and level of destruction. Adding to the danger is that each State is responsible for issuing warnings to its population through National Tsunami Warning Centres or similar authorities, and the decision to issue an alert is based on its own analysis of the situation. Along with earthquake alerts, the CTBTO has provided valuable tools and knowledge to improve research and development in this area.
This study evaluates the progress made by Caribbean coastal States that are affected by earthquakes and that could be negatively affected by the foreseeable effects of climate change, in order to guide prospective technological actions in the creation of their own alert centres for climate change, earthquakes, tsunamis and the interaction of these, as a large network that might be led by the CTBTO.
Indonesia is a region of relatively high seismic activity, and one of the most active areas is West Papua. TOAST is a software package used for tsunami modeling simulations that can provide results quickly and in real time. Analysis of the impact of the tsunami using TOAST shows that, based on the earthquake scenario of 10 October 2002 with magnitudes varying from 7.0 to 8.0, the highest tsunami wave was recorded in the northern Sorong area with a height of 0.066 to 3.345 meters. Meanwhile, for magnitudes from 8.5 to 9.0, a tsunami was recorded in the Manokwari area with a height of 1.976 to 2.906 meters. The fastest estimated time of arrival (ETA) was recorded in the northern Sorong area, at about 3 minutes. For the earthquake scenario of 3 January 2009 with magnitudes between 7.0 and 8.5, the highest run-up was recorded in the Wondama Bay area with a height of 0.178 to 5.546 meters. For a magnitude of 9.0, it was recorded in Manokwari with a wave height of 2.906 meters, and the fastest ETA was recorded in the Wondama Bay area, at 8.25 to 8.5 minutes.
The activity of the subduction zone south of Java island has generated severe earthquake and tsunami events in the past. This natural hazard makes the southern region of Java, including the Yogyakarta Special Region (DIY), prone to incoming tsunamis. Kalurahan Glagah, a village in the southern part of DIY, is one of 1,013 villages in Indonesia with high tsunami vulnerability, so it is necessary to prepare its community for a tsunami that can occur at any time.
We have assisted the local government in increasing the preparedness of its people in dealing with tsunamis through capacity building programmes such as the Earthquake Field School and by providing detailed tsunami hazard maps based on the worst-case scenario. Based on numerical modeling results, we identify that an Mw 8.8 earthquake can generate a tsunami wave as high as 22 m, with a wave arrival time of 38 minutes at the coastline of DIY. The existence of Yogyakarta International Airport in Glagah Village as an earthquake-resistant structure means that this airport can be utilized as a vertical evacuation site. The synergy between stakeholders has received national recognition from Indonesia's National Tsunami Ready Board and is seeking international recognition from the UNESCO Intergovernmental Oceanographic Commission.
The Instituto Geofísico of the Escuela Politécnica Nacional (IGEPN) is in charge of the monitoring and study of seismic and volcanic activity in Ecuadorean territory. Its networks include seismic, volcanic and geodetic monitoring networks with 105 seismic stations, 11 infrasound sensors, 67 strong motion sensors, 65 high-accuracy GPS receivers and more than 56 stations for lahar detection, gas analysis and visual monitoring at active volcanoes, all of them with real time transmission provided by different and independent technologies: satellite, microwave links owned by IGEPN, fiber-optic with 16 nodes in different places in Ecuador, Internet, and the institute's own subnets with analog or digital radios, which together represent 90% of all instrumentation installed in Ecuador. IGEPN also maintains the National Processing Center for Issuing Seismic and Volcanic Alerts, which operates 24/7 and automatically supplies information by email, fax, radio, Telegram, Facebook, Twitter and website three minutes after a seismic event occurs. This article describes the current network coverage, performance, evaluation and optimization of the real time transmission network, with analysis from 2016 to the present.
Scientific data show that glaciers are melting due to climate change driven by global warming, with some results indicating an impact on the amount of seismic activity. If glaciers thaw, the enormous weight bearing down on the Earth's crust is reduced, and the crust experiences what geologists refer to as “post-glacial rebound.” This process might reactivate faults and relieve pressure on the subterranean reservoirs of molten silicate beneath volcanoes in subduction zones, where an oceanic plate moves under another plate, thereby increasing seismic activity. Using correlation coefficients and regression analysis based on two previous case studies, this paper explores the relationship between temperature rise due to global warming and earthquake frequency in the Caribbean Basin.
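The correlation-and-regression step described above can be sketched as follows; the annual temperature anomalies and earthquake counts used here are placeholders for illustration only, not data from the two case studies.

```python
# Minimal sketch of the statistical approach: Pearson correlation and a linear
# regression between yearly temperature anomalies and earthquake counts.
import numpy as np
from scipy.stats import linregress, pearsonr

# Placeholder series (ten hypothetical years), not data from the case studies.
temp_anomaly = np.array([0.45, 0.50, 0.48, 0.61, 0.65, 0.54, 0.63, 0.72, 0.70, 0.78])
quake_count  = np.array([ 110,  118,  104,  131,  140,  122,  128,  146,  139,  151])

r, p = pearsonr(temp_anomaly, quake_count)
fit = linregress(temp_anomaly, quake_count)
print(f"Pearson r = {r:.2f} (p = {p:.3f})")
print(f"Regression: counts ~ {fit.slope:.0f} * anomaly + {fit.intercept:.0f}")
```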
Past earthquakes in the Banggai area were not widely reported. A lack of information on the source mechanisms and the tsunamis made it difficult to obtain a complete picture of these disaster events. The earthquake and tsunami on May 4, 2000 is one example. Although very local in nature, the tsunami earthquake that occurred had quite an impact in some locations. However, knowledge of the generating mechanism in relation to the earthquake sources, propagation and affected areas is not sufficient and raises a number of questions.
Using historical global seismic data, we identify the characteristics of the earthquake's source and propagation in the Banggai region. Investigation of the source mechanism's characteristics also revealed that the tsunami was generated by sources other than tectonic earthquakes, such as a submarine landslide. This indication is reinforced by the result of field surveys that show non-uniformity of inundation in the affected areas. This study of the historical earthquake and tsunami impact is critical to determine disaster impact in the future. It serves as an input and reference for the local government's disaster mitigation efforts.
Bangladesh, which covers a major part of the Bengal Basin, is an earthquake prone country due to its location at the junction of two major tectonic plates. The complicated geology of this basin is responsible for several major and minor earthquakes occurring in Bangladesh and its adjacent areas. The consequences of an earthquake and other natural disasters would be devastating for the country, its dense population, and its faulty and unplanned infrastructure. The Department of Environmental Science and Disaster Management of Daffodil International University has the scope to initiate a wide range of academic research covering such geological, environmental and disaster related courses. Seismic data analysis to obtain information on the crust-mantle boundary or deep geological layering in this area could be one important aspect of this research. The department has a strong collaboration with the National Data Centre of Bangladesh under the CTBTO, through which such research can be implemented. Seismic travel time data, radionuclide distributions, and hydroacoustic data from the virtual Data Exploitation Centre (vDEC) could be utilized for tsunami early warning, climate change investigation and related topics through the academic activities of students, who represent the youth of Bangladesh. This will contribute to improving the disaster preparedness systems of the country.
IDC data on the radioactivity concentrations of natural radionuclides in particulate matter collected at station THP65 have been studied over one year. A significant difference in the radioactivity concentrations of Pb-212F and Be-7 between the wet and dry seasons has been found. The impacts of the north-east and south-west monsoons, which occur in the dry and wet seasons respectively, on radionuclide levels are discussed. It is concluded that air masses, including associated particulate matter, originating over the Asian landmass and directed into Thailand by the north-east monsoon in the dry season, and moist air brought into Thailand from the Gulf in the wet season, are the reasons why the radioactivity concentrations of Pb-212F and Be-7 in particulate matter at THP65 are higher during the dry season and lower during the wet season.
The location of the Northern Caucasus Geophysical Observatory of the Institute of Physics of the Earth, Russian Academy of Sciences (IPE RAS) in the immediate vicinity of the magma chamber of the Elbrus volcano makes it possible to obtain unique data on the structure and dynamics of the thermal field in its vicinity. A precision temperature observation system was developed at the IPE RAS some years ago. During this time, we received first results of observations of natural temperature variations with an accuracy of up to one thousandth of a degree. The observed diurnal and semidiurnal harmonics found in microvariations of the underground temperature may be associated with the convective component of heat and mass transfer, which is largely determined by the corresponding changes in the regime of regular fluid migration due to the periodic influence of the combined solar-lunar tide effects on the geophysical medium in the deep underground tunnel. Estimation of the contributions of the conductive and convective components to the heat flow will make it possible to draw conclusions about the dynamics of the fluid-magmatic system of the Elbrus volcanic center and study the mode of its functioning, and can also be used directly for monitoring volcanic hazard.
On January 4, 2018, two earthquakes reported by the population were recorded in the northern region of Guatemala. These were characterized with the National Seismological Network (RSN); however, another thirty-nine earthquakes of smaller magnitude could only be recorded by the auxiliary seismic station APG (AS-037), owing to the high standards that its facility meets for use in the monitoring of possible nuclear explosions.
This paper describes the use of APG (AS-037) for the characterization of these earthquakes (seismic wave arrival times, epicentral distances, magnitudes and possible locations) and their possible association with local geological faults, using techniques for describing a seismic sequence with a single seismic station. These results provided important information for the subsequent proposal to improve the RSN in the region.
Currently, the Instituto Nacional de Sismología, Vulcanologia, Meteorologia e Hidrología (INSIVUMEH) has expanded its network of seismological stations, and APG has been integrated into the permanent monitoring routines, being one of the stations with the highest detectability of local, regional and distant earthquakes. In addition, INSIVUMEH benefits from cooperation with the seismological networks of neighboring countries. Consequently, the seismic catalogue has grown in the number of recorded earthquakes, with better quality of information and improved knowledge of the seismic hazard.
Data collected by the International Monitoring System (IMS) represent four complementary technologies: seismic, hydroacoustic, infrasound and radionuclide. The main objective of acquiring IMS data is to detect and identify nuclear explosions on land, in the oceans, and in the atmosphere. However, the data are also extremely valuable for scientific studies, e.g. to investigate the migration patterns and population density of marine mammals, or to track radiation on a global scale. These applications benefit the main purpose of detecting nuclear explosions by improving monitoring algorithms and our ability to discriminate between explosion signals and what is considered noise from the CTBT perspective. The vast amount of IMS data archived over two decades can be accessed via the virtual Data Exploitation Centre (vDEC). After signing a zero-cost vDEC contract (which contains legal requirements), scientists and researchers from many different disciplines and from around the globe can get access to the IMS data to conduct research and publish new findings. This presentation provides insights related to the vDEC platform and gives a statistical overview of all past and current scientific applications, including several examples of topics addressed with IMS data accessed under vDEC contracts.
The International Monitoring System (IMS), installed and maintained by the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) with the support of States Signatories, is a global system of monitoring stations composed of four complementary technologies: seismic, hydroacoustic, infrasound and radionuclide. One of the IMS radionuclide stations is located in Spitzbergen, a place where, based on many lines of evidence, the signs of climate change are noticeable. Spitzbergen is the largest island of the Norwegian Svalbard archipelago, in the Arctic Ocean. Warming of the atmosphere and oceans has been accompanied by sea level rise and a strong decline in Arctic Sea ice. Among other natural radionuclides, Pb-212 is always observed at IMS stations, in varying quantities, including at the Spitzbergen station RN49. The increase of surface temperature has an influence on the amount of Pb-212 released from the ground surface. This study based on almost twenty years of measurements (2003-2022) by the CTBTO demonstrates that variations of Pb-212 activity concentrations may be connected to climate variations.
In the months following the Fukushima Daiichi nuclear power plant accident, the IMS radionuclide stations observed elevated concentrations of anthropogenic radionuclides throughout the northern hemisphere. These data are available for scientific research through the virtual Data Exploitation Centre (vDEC) after signing a cost-free confidentiality agreement with the CTBTO. Part of these radionuclide concentrations were given to UNSCEAR for inclusion in their 2013 report. These data are highlighted in its scientific Annex on “Levels and effects of radiation exposure due to the nuclear accident after the 2011 great east-Japan earthquake and tsunami” as being unique in terms of the global coverage and the broader range of radionuclides than reported elsewhere, including four radioxenon isotopes. This presentation reviews the use of these data in dozens of peer-reviewed scientific papers. The most frequent application is the reconstruction of the source term for 137Cs, 131I, and 133Xe. Further applications include studies of wet deposition rates, enhancements for atmospheric transport modelling methods, reconstruction of release patterns, assessment of reactor damage, and environmental impact.
Myanmar is exposed to multiple natural disasters, including cyclones, earthquakes and tsunamis. Myanmar has signed tsunami warning agreements with the CTBTO and can access the IMS database and IDC products. These data, together with national natural disaster monitoring means, will be used to identify disaster prone areas of the country. Creating public knowledge about disaster prone areas and the level of vulnerability to lives and property in disaster events is key in disaster safety planning. In the event of a nuclear accident or test in neighboring countries, Myanmar has activities for environmental radiological monitoring. To enhance the quality and coherence of decision making following a nuclear emergency, a working group of Myanmar has participated in JRODOS (Real-time On-line Decision Support System) exercises organized by the EU. Myanmar has proposed locations of MMC, GDRMS and SWDRMS around the country. After the Fukushima accident occurred, the IDC's radiation database was used to inform local people about the distribution of radiation in the atmosphere. Myanmar has also participated in the ConvEx exercises of the IAEA and is a member of the project Establishing a Regional Early Warning Radiation Monitoring Network and Data Exchange Platform in ASEAN, to mitigate hazards and save lives from natural disasters.
With support from the U.S. Department of Energy, Lawrence Livermore National Laboratory and the Incorporated Research Institutions for Seismology are collaborating with seismic monitoring centers in the Caucasus and Central Asia to expand national seismic networks through the installation of permanent broadband seismic stations. The main goal of the project is to improve regional network coverage by making high quality data from new stations openly available to the global scientific community. The multi-year project involves the deployment of more than fifty posthole broadband sensors across six participating countries: Armenia, Azerbaijan, Georgia, Kazakhstan, Kyrgyzstan, and Tajikistan. Additionally, semi-broadband stations will be deployed for local monitoring of volcano and mud-volcano sites in Armenia and Azerbaijan, and standalone urban-type strong motion sensors will be deployed in large cities. To facilitate data exchange and incorporate diverse sources of real-time data into national monitoring solutions, the project also supports the installation of high-capacity servers for the participating monitoring centers. Station deployments began in the summer of 2021 and are planned to be completed in 2023. The project is implemented through the Seismic Targeted Initiative of the International Science and Technology Center in Kazakhstan and the Science and Technology Center in Ukraine.
Detecting ongoing volcanic eruptions and issuing notifications is crucial in supporting the Volcanic Ash Advisory Centres. However, local monitoring systems are missing at many active volcanoes. Long range infrasound monitoring, potentially able to detect volcanic explosive events and provide notifications, might offer useful information. Indeed, many studies have already highlighted the utility of long range infrasound for this aim, but open questions remain concerning its actual efficiency and reliability.
In this study we investigate the potential of the International Monitoring System infrasound network of the CTBTO to remotely detect explosive volcanic eruptions, focusing on the main active volcanic area in the world between 2010 and 2019, where multiple eruptions occurred, with energies spanning from mild explosions to eruptions with a Volcanic Explosivity Index ≥ 4.
We applied a detection algorithm developed at local distances and extended to long range volcano infrasound monitoring. To assess the reliability of the algorithm, which is based on the Infrasound Parameter (IP), we compare the notifications issued with the bulletin reports of the Global Volcanism Program (GVP). Although unresolved ambiguity remains, due to the short spacing among volcanoes with respect to the array and unfavourable propagation conditions, the algorithm is able to detect ongoing activity in quasi real time with high notification reliability.
This is the first study since the signature of the tsunami warning agreement between the CTBTO and Madagascar in 2019. On 2 August 2019 at 12:03:23 UTC, a magnitude 7 earthquake occurred southwest of Sumatra, Indonesia, and generated a local tsunami warning. The goal of this project is to monitor tsunami events using International Monitoring System seismic stations and then to simulate the inundation effect on the east coast of Madagascar with the Community Model Interface for Tsunami (ComMIT) software developed by the NOAA Center for Tsunami Research (NCTR). This event was observed at seismic and hydroacoustic stations as described in "Hydroacoustic data analysis before and after the pandemic plus its contribution to Tsunami warning system in the Indian Ocean". The wave arrival time at the CMAR seismic station was 12:09 UTC, with a signal duration of 21 minutes. The simulated inundation affected some areas with an average ocean wave amplitude of 1 m.
Volcanic ash causes airports to alter or suspend their operations when it is detected around the airport or along flight routes. Bali airport is one of the busiest airports in Indonesia. Unfortunately, it is surrounded by three active volcanoes which have recently erupted and spewed volcanic ash. Even though volcanic ash movements can be predicted, these events still come as a surprise, causing many flight delays or cancellations that lead to enormous economic losses. This is due to a lack of preparedness and mitigation plans for dealing with this threat. To overcome this problem, a volcanic ash risk map was generated by combining flight route data, 30 years of wind data at various elevations, rainfall data for different seasons, and volcanic ash trajectory data from the three surrounding volcanoes, namely Mount Raung, Agung and Rinjani. The results show that during the easterly wind season the volcanic hazard risk is higher, since the two most active volcanoes (Mount Agung and Rinjani) are located to the east of the airport and during this season there is less rainfall to wash the ash away. On this basis, appropriate mitigation plans can be established, as well as airport collaborative decision making.
Every day, waveform data produced by different sources in the earth, the oceans and the atmosphere are detected and recorded by the seismic, hydroacoustic and infrasound (SHI) stations of the International Monitoring System. In this study, the signals of interest are those from volcanic sources, which can be put to wider civil and scientific use, for example helping to save lives in the case of major volcanic activity. Most volcanoes generate precursor signals before an eruption, such as earthquakes, tremors, swarms, weak explosions and landslides, which SHI stations can record. With proper onset time detection, classification and location of volcanic signals, supported by an extensive catalogue of event bulletins, crucial time can be gained in predicting a potential eruptive scenario and advising the authorities to evacuate the population before the intense phase of activity starts. In this work, we show examples of SHI volcanic signals that can be analysed interactively and efficiently with the help of Geotool and DTK PMCC, which are part of the NDC-in-a-Box software package available to Member States free of charge as part of technical assistance through capacity building and training.
Ice avalanches constitute severe natural hazards, threatening human lives and infrastructure, and are expected to increase with ongoing climate change and with population pressure forcing settlements into exposed terrain. In Europe, costly monitoring programs have also highlighted changing glacial hazards. Consequently, monitoring and warning systems, which help mitigate the threat and impact of mass movements, are a key component of hazard management in mountainous regions worldwide.
In this study we show how infrasound arrays are able to detect and locate ice collapses from a glacier front and, under specific conditions, may allow an estimate of the source volume to be provided. The study focuses on small scale (<10,000 m3) ice collapses recorded by local arrays (<2 km away); fluid-dynamics based estimates of ice volumes inferred from infrasound are compared with independent measurements in order to validate the theoretical estimates and to define experimental relations.
These results show how infrasound array observations may successfully provide quantitative information on glacier collapses and ice avalanche volumes, thus opening new perspectives for monitoring avalanching glaciers and providing warning of break-off events.
The main objective of this project is to identify paleotsunamis and climate change events using heavy metal deposits and the strontium to calcium ratio. Over time our islands have been affected by many earthquakes, making it very likely that they have also been affected, historically, by tsunamis. Using the strontium to calcium ratio we will be able to detect which places along our coast have been affected by these kinds of events, as well as periods of drought. The entire coastline has been influenced by tsunamis generated by subduction earthquakes since the formation of the island, leaving the island's shores covered with sediments, organic material and heavy metals; this project studies these sediments and detects these metals. Geochronological dating using the strontium to calcium ratio has been used for this purpose before, with a high level of confidence and low standard deviations. This analysis will also give us the possibility to review our risk analysis in terms of climate events and to generate further research in support of new climate policies by our governments.
Old-fashioned earthquake early warning is entering a new era in the shadow of the Earthquake Preparedness Alert (EPA), which, instead of giving only a few seconds of notice, issues alerts a few days in advance. EPAs still need further work before being used in the public sector, but they are already used in the oil and gas and mining industries in California and Nevada. Since September 2020, the models generated in the Earling project have analysed risk level changes in these two regions. Since then, all earthquakes larger than M5 were detected a couple of days in advance. For example, Petrolia residents felt the shaking of an M6.2 earthquake on 20 December 2021; the earthquake rattled the region three days after the models detected a high risk time window for the region. Based on Federal Emergency Management Agency (FEMA) reports, these two regions experience about 85% of the 75,000 annual earthquakes in the US, so a low false alert ratio in these regions is mandatory for making decisions based on seismic risk level changes. While Earling marked five high risk time windows in 2021, it did not mark any time window as high risk until the end of November 2022, which indicates precise risk detection in terms of avoiding both false positive and false negative alerts.
The CTBT is considered to be a global instrument for the enhancement of international peace and security through the banning of nuclear weapon tests and the constraining of further development of existing nuclear weapons. The Treaty on the Non-Proliferation of Nuclear Weapons (NPT), on the other hand, was established to prevent the further spread of nuclear weapons, to advance the peaceful uses of nuclear science and technology, and to address disarmament by the nuclear weapon States. The two treaties are interconnected in that they both offer a foundation for halting the spread of nuclear weapons and provide the necessary tools for the promotion of global stability. The widespread benefits of the peaceful uses of science are an enormous success of the NPT regime, as are the civil applications of the CTBT within the scope of International Monitoring System data and International Data Centre products. This work looks at how the CTBT and the NPT are interconnected, particularly the CTBT's link to the decision in 1995 to indefinitely extend the NPT. The 2020 NPT Review Conference, which took place in August 2022, provided a benchmark for supporting the entry into force of the CTBT, which is considered a cornerstone of the non-proliferation regime.
Data from the IMS hydroacoustic, infrasound and seismic station networks empower civil and scientific applications in climate knowledge generation and ocean monitoring activities across the world. Meanwhile, the value of science-informed, evidence-based policy making has gained wide recognition for providing robustness, clarity and practicality in driving the global sustainability agenda and the management of our natural resources. With the objective of proposing partnerships that can advance entry into force of the CTBT, this project establishes the relevance of the CTBTO as a stakeholder in the UN Decade of Ocean Science ("Ocean Decade") and how engagement and framework alignment can be envisioned and approached. This is developed through efforts to 1) capture the current range of relevant working programmes solicited by the Ocean Decade, 2) identify pathways to engage and integrate interactions, and 3) assess possibilities for a collective impact on security and sustainable development. The outcome of this project shows how the CTBTO's expertise in capacity building and interagency coordination can be distinguished and leveraged in this context.
The session is open to the public on a first come, first served basis until the room capacity limit is reached.
The future of the CTBT verification system – opportunities for science and technology
- Presentation and discussion on gaps in science and technology
- Challenges - the NDC perspective
During this side-event we would like to discuss the future of the CTBT verification system. The development and implementation have been ongoing for 26 years, and the system is ninety per cent in place. Seen from a scientific and technological perspective, how do we envision the coming decades?
A presentation and panel discussion by YPN members and mentors will address the opportunities in science and technology to improve the operation and functioning of the verification system. This panel of younger experts will also contribute their views on the challenges faced by the system as seen from NDCs around the globe.
Over the last year, a large volume of work was carried out on our high-precision tiltmeter instrumental complex installed in the underground geophysical laboratory of IPE RAS, situated in a deep adit in Andirci Mountain not far from the Elbrus volcano. The main scope of work was devoted both to improving the instrumental park of the laboratory and to carrying out a set of works on the precision spatial orientation of the geophysical measuring equipment on pedestals in the adit. The instrumental complex of the laboratory now consists of the main unique instruments (a two-axial high-precision tiltmeter and a quartz tiltmeter) and electronic 24-bit data acquisition, processing and storage systems. Round-the-clock monitoring of the environmental parameters around the tiltmeters has been installed, including precision measurement of the temperature of the ambient air, the pedestal and the body of the tiltmeter with a relative accuracy of about 0.001 degrees Celsius, as well as measurement of humidity and atmospheric pressure. In addition, a highly sensitive, eye-safe LIDAR system helps to monitor the aerosols emitted by cracks at the installation site of the complex. The pedestals were oriented with an accuracy of ±0°02'.
Calibration of seismometers used in nuclear explosion monitoring systems, such as the International Monitoring System, is important for providing confidence in the measurements of ground motion and the resulting analysis that is performed on the waveform time series data. Six models of seismometers widely used at International Monitoring System stations have been calibrated using high precision vertical and horizontal shake tables that utilize metrologically traceable measurements of instrument motion using a laser vibrometer. The results of these ground motion calibrations are compared to the instruments' nominal response provided by manufacturers and to their electrical calibration results, which are obtained in a similar manner to in-field calibrations. This study focuses on the high frequency (up to 50 Hz) performance of the instruments and offers insights into how the instruments perform.
SNL is managed and operated by NTESS under DOE NNSA contract DE-NA0003525
Distributed acoustic sensing (DAS) is experiencing exponential growth. Some difficulties have been encountered in applying this technology to certain problems of seismology; however, the main obstacle is insufficient metrological assurance. For each virtual sensor (channel), considerable work is required to determine its position, orientation, transfer function, self-noise, etc. Some of the metrological characteristics can be estimated in advance if the cable type, the depth and method of its installation, the soil type and compaction, and local conditions are known. A library of publications on the influence of these parameters on the metrological characteristics is needed. For other characteristics, it is possible to develop simplified evaluation methods that, with sufficient accuracy, will allow the use of DAS. Currently, a great deal of work is being carried out using DAS, comparing it with traditional seismic sensors and studying individual metrological characteristics. Each such study has undoubted value, because it adds a new building block to DAS. In this work, we describe some of the principles and considerations that we used in designing the experiments.
VNIIA is a leading company of the Rosatom State Atomic Energy Corporation in CTBT implementation and currently performs a range of research activities:
- develops scientific and methodological support and hardware and software for CTBT on-site inspection (OSI) activities, and provides a comprehensive assessment and analysis of the efficiency of controls and of the information content of data published by the International Data Centre (IDC);
- participates in the analysis of events which may reveal possible non-compliance with the CTBT on the part of other States Parties, collecting geophysical and radionuclide information based on IDC products;
- develops software to process geophysical information in the National Data Centre;
- improves the data analytics system for use in applied research on monitoring CTBT compliance;
- investigates ways to create an expert system for checking CTBT OSI procedures and for using it in applied research within the framework of Rosatom activities in the field of CTBT compliance monitoring, in order to predict and evaluate OSI results;
- has completed the development of a short period vertical seismometer and three-component broadband seismometers for seismic monitoring systems.
Recent developments in infrasound noise models (Marty et al., 2021) and progress in digitizer and microbarometer design allow the introduction of precise infrasound noise studies for new installations and for station upgrades. IMS/ED/SA has compiled a library of equipment self-noise data and background infrasound noise data of IMS infrasound stations based on percentile calculations. The presented noise study charts show the results of the studies at multiple IMS stations. The outcome suggests a revision of the IMS minimum requirements for infrasound station specifications with respect to microbarometer and system noise.
The spectral characteristics of seismic and infrasound noise were calculated from the waveforms of the NNC RK network stations using the PQLX software. The calculation results were compared with Peterson's seismic noise model and Brown's infrasound noise model. Daily and seasonal noise variations were analysed. Special attention was paid to regularities in the level change at the microseismic maximum. The observed peculiarities are well explained by the location and dynamics of microseism and microbarom sources located in the North Atlantic.
Two oceanic plates converge offshore of the Japanese Islands, with the Philippine Sea plate subducting at the Nankai Trough in southwest Japan. Historically, a mega-thrust earthquake has repeated every 100-150 years along the Nankai Trough, and the last earthquake series occurred in the 1940s. For this reason, real time sea floor observatories for earthquake and tsunami monitoring, the DONETs, were installed in 2010. The DONET is capable of accommodating new sensors by plugging in underwater connectors. Making use of this underwater technology, three borehole observatories, two different types of tiltmeters and one fibre-optic strain meter have previously been connected to the DONET. In 2022, two additional fibre-optic strain meters were installed at the same location as the existing fibre-optic strain meter, making it possible to compare them with each other and reduce ambient noise. In this presentation, we introduce the underwater technologies developed for adding new sensors. These technologies include tools for thin fibre-optic cable extension, plugging in underwater connectors, etc., operated by a remotely operated vehicle (ROV). It has been shown that our in situ measurements can be performed based on the modular design of the sea floor observatory system and supported by advanced ROV operations.
In this research, a new optical sensor is used to measure the suspended mass displacement in a seismometer. In recent decades, scientists have used optical interferometer set-ups to improve the accuracy of displacement measurement. To integrate such optical measurement in a seismometer, all the optical functions, such as the beam splitter, are integrated in an optical substrate. The only remaining macroscopic elements are the mirror fixed on the suspended mass and the lens that focuses the beam on the mirror. The intrinsic noise is below the seismic low noise model over the entire bandwidth of interest. To correct the measurements for optical defects of the lens and/or the mirror, a correction algorithm is used, which allows a coherence of one to be reached between the optical measurement and traditional sensors. The results show the accuracy and reliability of such an optical sensor integrated in a mechanical seismometer.
Our recent research on an integrated optical interferometer has led to the development of a very low noise optical seismometer based on this transducer. The performance of the transducer, combined with appropriate technical choices for its integration, leads to a seismometer that is able to measure subnanometric earth displacements over a very wide bandwidth. Our optical seismometer has an intrinsic noise which is at least 10 dB below the seismic low noise model from 10^-5 Hz up to 10 Hz. More generally, this seismometer benefits from more than 50 years of experience in geophysical sensor design at CEA.
The relative humidity sensor is one of the components of an automatic weather station (AWS). Based on annual quality control, relative humidity sensor data had approximately 7% unavailability in 2020 because of system maintenance. This study proposes the design of a relative humidity virtual sensor based on the competitive sensing concept. It is simulated on three AWS in the northern part of Central Java Province, Indonesia, namely AWS Kandeman, Pemalang and Kajen, from January to March 2021. These AWS are installed adjacently along the northern highway of Java in a triangular constellation. The virtual sensor is built from the fusion of physical sensor data using MLP and LSTM algorithms, with the three previous (delayed) temperature and relative humidity measurements used as inputs. The data are segmented into 70% training and 30% testing data. The virtual relative humidity sensor for AWS Kajen is most accurately composed from a combination of delayed AWS Kandeman and Kajen data using MLP, with an RMSE of 1.38 %RH. The virtual relative humidity sensor for AWS Pemalang is most accurately composed from a combination of delayed AWS Kandeman and Pemalang data using MLP, with an RMSE of 2.04 %RH. The virtual relative humidity sensor for AWS Kandeman is most accurately composed from a combination of delayed data from all three AWS using MLP, with an RMSE of 2.04 %RH.
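A minimal sketch of the kind of set-up described above, assuming hypothetical file and column names and using scikit-learn's MLPRegressor on three-step-delayed inputs with a 70/30 time-ordered split; this is an illustration of the approach, not the authors' code.

```python
# Hedged sketch: virtual relative humidity sensor from delayed neighbour data (MLP).
# File name, column names and lag count are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

df = pd.read_csv("aws_kandeman_kajen.csv")  # hypothetical columns: T_kandeman, RH_kandeman, T_kajen, RH_kajen

# Build three delayed copies (t-1, t-2, t-3) of each predictor.
lags = 3
features = []
for col in ["T_kandeman", "RH_kandeman", "T_kajen", "RH_kajen"]:
    for k in range(1, lags + 1):
        features.append(df[col].shift(k).rename(f"{col}_lag{k}"))
X = pd.concat(features, axis=1).dropna()
y = df["RH_kajen"].loc[X.index]            # target: current RH at AWS Kajen

# 70% training / 30% testing split in time order.
split = int(0.7 * len(X))
model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
model.fit(X.iloc[:split], y.iloc[:split])

rmse = np.sqrt(mean_squared_error(y.iloc[split:], model.predict(X.iloc[split:])))
print(f"virtual RH sensor RMSE: {rmse:.2f} %RH")
```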
Ambient seismic noise (ASN) is defined as the small vibrations recorded throughout the Earth's surface. The generating noise sources are classified into anthropogenic sources (<1 s) and natural sources (>1 s). In this study, we used data from the Romanian Seismic Network (RSN) stations, operated by the National Institute for Earth Physics (NIEP), and analysed the influence of atmospheric parameters on the ASN level. The long-term evolution of seismic noise revealed a drop in the noise level in the 1-2 s period band at most RSN broadband stations in the second half of October 2019. We also observed a good correlation between increases in the noise level and increases in wind speed for the 0.5-0.05 s period band. The atmospheric data used in this paper are provided by NIEP's weather stations collocated with the seismic stations.
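One generic way to quantify the wind correlation reported above is to compute band-limited noise levels (here 0.05-0.5 s period, i.e. 2-20 Hz) and correlate them with co-located wind speed; the sketch below assumes hypothetical input arrays and a 100 Hz sampling rate, and is not the authors' processing chain.

```python
# Hedged sketch: correlate short period (2-20 Hz) seismic noise levels with wind speed.
# `velocity` (ground velocity, 100 Hz sampling) and `wind_speed` (hourly means) are assumed inputs.
import numpy as np
from scipy.signal import welch

fs = 100.0                                # assumed sampling rate of the seismic channel
samples_per_hour = int(fs * 3600)

def hourly_band_rms(velocity, fs, fmin=2.0, fmax=20.0):
    """RMS ground velocity in a frequency band, one value per hour of data."""
    rms = []
    n_hours = len(velocity) // samples_per_hour
    for i in range(n_hours):
        seg = velocity[i * samples_per_hour:(i + 1) * samples_per_hour]
        f, pxx = welch(seg, fs=fs, nperseg=4096)
        band = (f >= fmin) & (f <= fmax)
        rms.append(np.sqrt(np.trapz(pxx[band], f[band])))  # integrate the PSD over the band
    return np.array(rms)

# noise_rms and wind_speed must cover the same hours:
# noise_rms = hourly_band_rms(velocity, fs)
# r = np.corrcoef(noise_rms, wind_speed)[0, 1]
# print(f"correlation between wind speed and 2-20 Hz noise level: r = {r:.2f}")
```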
Decades ago, new opportunities in seismology were opened by the development of broadband seismic sensors with feedback. The three defining characteristics of these instruments were the bandwidth extension to longer periods, a much lower intrinsic noise and a higher dynamic range. However, the goal of further extending their bandwidth to frequencies above 100 Hz has proven elusive because these sensors are plagued by parasitic resonances leading to modes not controllable by the feedback system. Here we present a new low noise seismic borehole sensor with a truly VBB flat response over five frequency decades, from 2.7 mHz (360 s) to 270 Hz. The instrument has no mechanical resonances below 400 Hz. We achieved the bandwidth extension to high frequencies with improvements to the mechanical design, i.e. the arrangement of the pivots and the geometry of the spring. The design is realized in a borehole arrangement, in which three sensors are stacked at 90-degree angles to each other. Including a single-jaw hole lock as a clamping mechanism, the complete stack has a diameter of 89 mm, is 625 mm long and weighs about 24.5 kg. We show test results from three co-located complete borehole sensors with identical frequency responses.
The traceable calibration of seismometers is research work within the European research project InfraAUV, which is part of the EMPIR programme. In this project, novel in-laboratory and on-site calibration procedures for seismometers are developed. The in-laboratory calibrations are carried out using electrodynamic shakers to excite sinusoidal vibrations. These excitations are measured by the seismometer under test and by a reference laser interferometer. The calibration process itself is already well established, is used with accelerometers and is standardized in ISO 16063-11. However, several peculiarities of seismometers require consideration: 1) Frequency range and duration: the low frequencies (here, as low as 10 mHz) require a long measuring time; to reduce the time consuming and costly measurements, excitation methods with multiple frequencies have been developed and applied to seismometers. 2) Tilting: the high sensitivity of seismometers makes horizontal calibrations prone to deviations caused by tilt; tilting adds components of the gravitational acceleration to the velocity signal generated by the exciter, and if the tilting changes proportionally to the excitation, it has the same frequency and phase as the excitation. 3) Electromagnetic disturbance: some seismometers are sensitive to magnetic fields, and electrodynamic shakers produce inhomogeneous static and dynamic magnetic fields, both of which can negatively affect the measurement result.
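As a hedged illustration of how amplitude and phase at each excitation frequency can be recovered from the seismometer output and the reference interferometer signal (in the spirit of the sine-approximation approach of ISO 16063-11), the sketch below performs a least-squares sine fit at a known frequency; variable names and data loading are assumptions, not the project's actual tooling.

```python
# Hedged sketch: three-parameter sine fit (known frequency) to extract amplitude and phase,
# applied to both the reference (laser interferometer) and the seismometer output.
import numpy as np

def sine_fit(t, x, f):
    """Least-squares fit x(t) ~ A*sin(2*pi*f*t) + B*cos(2*pi*f*t) + C; returns amplitude and phase."""
    w = 2.0 * np.pi * f
    M = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
    a, b, _ = np.linalg.lstsq(M, x, rcond=None)[0]
    return np.hypot(a, b), np.arctan2(b, a)

# t, ref (reference velocity from the interferometer) and out (seismometer voltage) are assumed arrays.
# amp_ref, ph_ref = sine_fit(t, ref, f_excitation)
# amp_out, ph_out = sine_fit(t, out, f_excitation)
# sensitivity = amp_out / amp_ref            # V per (m/s) at f_excitation
# phase_shift = ph_out - ph_ref              # radians
```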
Reliable and comparable measurements of physical quantities require traceability to the international system of units (SI). Sound pressure is traditionally quantified using measurement microphones as transfer standards, for which the established primary calibration methods are currently limited to frequencies of 2 Hz and higher. These frequencies do not fully cover the range of interest for the International Monitoring System. For this reason, multiple primary calibration methods for airborne infrasound based on different physical principles are currently in development. The method presented in this talk utilizes the vertical gradient of the ambient air pressure as stimulus. A microphone under test is subjected to an alternating pressure by periodically changing its altitude. This principle has been realized in a calibration setup colloquially called the ‘microphone carousel’. In this setup, a rotating disk periodically changes the height of a microphone by about ±0.30 m. This subjects the microphone to a sinusoidal alternating pressure which is calculable in amplitude and frequency. With this method, measurement microphones can be calibrated in a frequency range from 0.1 Hz to 5 Hz with a planned extension to 10 Hz. In this presentation, the capabilities and limitations of the microphone carousel for the calibration of measurement microphones are discussed.
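As a back-of-the-envelope check of the stimulus amplitude, assuming an air density of about 1.2 kg/m3 (a value not stated in the abstract), the hydrostatic relation gives:

```latex
\Delta p(t) \approx \rho\, g\, \Delta z(t) = \rho\, g\, h \sin(2\pi f t), \qquad
\hat{p} \approx 1.2~\mathrm{kg\,m^{-3}} \times 9.81~\mathrm{m\,s^{-2}} \times 0.30~\mathrm{m} \approx 3.5~\mathrm{Pa},
```

i.e. a height modulation of ±0.30 m produces a sinusoidal pressure of a few pascals whose amplitude and frequency are calculable from first principles.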
Hyperion Technology Group, Inc., along with the National Center for Physical Acoustics (NCPA) at the University of Mississippi, USA, has been developing an integrated calibration system for its line of infrasound sensors. This technology will allow self-calibration of the infrasound sensor in the field using existing installed equipment. While currently verified as an add-on for existing sensors, work is progressing to integrate the system into the sensor hardware for a fully self-contained deployment. The calibrator allows the sensor to function nominally without significant change in response. When activated by an external signal from existing digitizers, the system will produce signals between 0.01 and 10 Hz at amplitudes greater than 30 Pa. We report on the theory, performance, and roadmap for implementation.
This poster presents the recent development and preliminary results of a compact laser refractometer based on a dual Fabry-Perot scheme at 1550 nm for dynamic infrasound pressure measurements. Measuring the beat frequency between two lasers locked to two Fabry-Perot cavities makes it possible to follow the variation of the refractive index of the air and thus to estimate the pressure inside the measuring cavity. Combined with a dynamic infrasound pressure generator developed at CEA, the generated pressure variation, controlled and repeatable, is directly imprinted on the measured beat frequency and compared to infrasound sensors. The main objective of this work is to demonstrate the capabilities of a compact laser refractometer whose simple and unique design has been adapted from static absolute pressure measurements to dynamic infrasound pressure measurements. The resulting sensor has demonstrated its ability to measure dynamic pressure over the entire infrasound frequency band with excellent performance. Further studies on the specific shape of the frequency response are underway to evaluate the discrepancy between the measurements and the model, and to assess its uncertainty budget.
Over the past five years, ASIR has adapted silicon audio (SiA) interferometer based sensors in our borehole broadband seismometer, model ASIR ABB, for making seismic observations in ~100 to 1000 m deep slim boreholes. The measured interferometric shifts are used to rapidly and accurately control a force-feedback circuit. The current ~60 mm outer diameter triaxial SiA sonde has been installed in drill holes with inner diameters as small as 76 mm and tilts of up to 15 degrees. These sensors have a 3 dB frequency-response bandwidth of 120 s to 1300 Hz, a clip level of +0.5 g, and a dynamic range of 172 dB. During a rapid, long tilt event, the sensor settles with a 60 s exponential decay. At sites ranging from a plate boundary to an underground mine and a subsidence site, these sensors have returned complete seismograms for nearby events ranging in size from M < -1.5 to M > 4.5. The SiA sonde's ~1-10 s seismic event spectrum is well above background noise. At the plate boundary, the borehole SiA improved event detection over the local surface network by as much as one unit in magnitude.
The ASIR ABBT broadband seismometer (with thermal probe) is successfully used for microseismic monitoring of an underground nuclear repository in Europe.
The Joint Task Force for Science Monitoring And Reliable Telecommunications (SMART) Subsea Cables is working to integrate environmental sensors (temperature, pressure, seismic acceleration) into submarine telecommunications cables. This will support climate and ocean observation, sea level monitoring, observations of Earth structure, tsunami and earthquake early warning, and disaster risk reduction. We present an overview of the initiative and a description of ongoing projects, including: the InSea wet demonstration project off Sicily; the CAM ring system connecting the Portuguese mainland, Azores and Madeira (funded); progress towards installation of a cable connecting Vanuatu and New Caledonia; cable plans between islands of Indonesia; a planned cable from New Zealand to the Chatham Islands; a plan for a cable connecting Antarctica to New Zealand; and a project to connect Europe and Japan via the Northwest Passage (Far North Fiber). These SMART systems are the initial steps towards global implementation and will be influential in the final standards and policies that evolve. In addition to the diverse scientific and societal benefits, the telecommunications industry's mission of societal connectivity will also benefit, because environmental awareness improves both individual cable system integrity and the resilience of the overall global communications network.
Short period seismometers were the workhorse of the World-Wide Standardized Seismograph Network built in the 1960s. Because they have very low noise in the band of interest for array seismology, they continue to be used in arrays throughout the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). In an earlier study, we used single frequency calibrations to validate a model that predicted the temperature sensitivity of the sensors used at the Yellowknife seismic array (YKA) to be -0.16%/°C. Given that seasonal temperature swings can exceed 50°C in a place like Yellowknife, the effect on sensor gain can be significant. Here we perform broadband calibrations to validate a model of the temperature dependence of the transfer function of the short period sensor. The model predicts the temperature dependence of the damping to be -0.14%/°C. We show how the temperature sensitivity of the damping results in a predictable seasonal dependence of the instrument gain and phase near 1 Hz. We recommend changes to the nominal response of the sensors in the YKA array, as well as routine broadband calibration of seismometers to detect similarly subtle effects which might impede the performance of the IMS.
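For illustration, the seasonal gain variation implied by the quoted temperature sensitivity is simply:

```latex
\frac{\Delta G}{G} \approx -0.16\,\%/^{\circ}\mathrm{C} \times 50\,^{\circ}\mathrm{C} = -8\,\%,
```

so, if the nominal response is left uncorrected, a sensor characterized in one season could be biased by roughly 8% in amplitude half a year later, which is why routine broadband calibration is recommended.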
The International Monitoring System (IMS) draft operational manuals for waveform stations require that IMS stations be calibrated regularly. Since 2012, the Provisional Technical Secretariat (PTS) has relied mostly on electrical calibration to meet that requirement. However, electrical calibration has inherent challenges (no traceability, integration and sustainment issues, high operating costs, etc.). A part of the geophysical community, including station operators, has started performing regular calibrations by comparison against a co-located reference. This method allows a more systematic and centralized approach to calibration. Over the past few years, it has been increasingly used at IMS stations, particularly infrasound ones. In this context, the PTS is developing tools to support this alternative approach. We present CalxPy, a web application developed at the PTS for the calibration of geophysical systems by comparison. With CalxPy, one can calculate, store, and display the response of a system for a given period, or track the evolution of the response against time or environmental variables. CalxPy allows the reporting of calibration results in IMS2.0 format. CalxPy supports the initial calibration and on-site yearly calibration processes, as well as data quality control. CalxPy can be deployed in the IDC pipeline and in NDC-in-a-box.
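The core computation in calibration by comparison is estimating the relative frequency response of the system under test against the co-located reference. The sketch below is a generic cross-spectral estimate of that response (not CalxPy itself, whose internals are not described in the abstract), with coherence used as a quality check.

```python
# Hedged sketch: relative response of a system under test vs a co-located reference,
# H(f) = Pxy(f) / Pxx(f), with magnitude-squared coherence as a quality check.
import numpy as np
from scipy.signal import csd, welch, coherence

def relative_response(ref, test, fs, nperseg=8192):
    f, pxx = welch(ref, fs=fs, nperseg=nperseg)     # auto-spectrum of the reference
    _, pxy = csd(ref, test, fs=fs, nperseg=nperseg)  # cross-spectrum reference -> test
    _, coh = coherence(ref, test, fs=fs, nperseg=nperseg)
    h = pxy / pxx
    return f, np.abs(h), np.unwrap(np.angle(h)), coh

# ref and test are assumed co-located, time-aligned records sampled at fs.
# f, gain, phase, coh = relative_response(ref, test, fs=20.0)
# keep = coh > 0.98   # retain only frequencies where the two records are coherent
```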
Version 3 of Geotool (also known as GeotoolQt) provides a powerful, easy to use interface with data import facilities for International Monitoring System (IMS) and non-IMS stations. IMS-related information, including parameters of events and arrivals as well as waveform data, can be imported directly from VDMS, while non-IMS information can be retrieved directly from FDSN web services, including but not limited to IRIS and GFZ. We demonstrate these new features based on a use case where a Reviewed Event Bulletin (REB) event and its related data are imported from VDMS. Additional waveform data from non-IMS stations are retrieved from GFZ. The REB event parameters (location, magnitude, gap, etc.) are then refined with the additional arrivals at the non-IMS stations, resulting in an improved event location and a reduced error ellipse.
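Outside Geotool, the same kind of non-IMS waveform retrieval can be reproduced with ObsPy's FDSN client; the time window, network, station and channel codes below are placeholders, not values from the use case above.

```python
# Hedged sketch: fetching non-IMS waveforms from an FDSN web service (here GFZ) with ObsPy.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("GFZ")                          # "IRIS" and other FDSN data centres work the same way
t0 = UTCDateTime("2019-01-01T00:00:00")         # placeholder origin time
st = client.get_waveforms(network="GE", station="APE", location="*",
                          channel="BHZ", starttime=t0, endtime=t0 + 600)
print(st)                                       # Stream of retrieved traces, ready for picking
```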
The Flexpart Atmospheric Transport Model (ATM) is traditionally driven by ECMWF and GFS meteorological model inputs. Flexpart-WRF is a variant of the standard model that accepts a wide range of Weather Research and Forecasting model (WRF) generated meteorological inputs to support very high resolution simulations over customized domains. The chain of activities needed to produce custom meteorology files from WRF and make them available to Flexpart-WRF for a successful simulation is complex and prone to failure for a number of reasons, and the work described here is aimed at packaging all of that complexity into an easy-to-use system. Building on the experience gained from an exploratory prototype system built several years ago, this Enhanced High Resolution Atmospheric Transport Model (EHRATM) system is being developed in a Python-driven environment to support simulations ranging from relatively simple and straightforward to complex simulations with special requirements. Adopting the philosophy of some other well-known Python packages, our goal is to "make easy things easy and hard things possible." This work is ongoing, and the presentation will provide a detailed overview of the project and its current status.
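A minimal sketch of the kind of Python-driven chaining such a system could wrap; the executable name, file names and template placeholder here are assumptions for illustration, not the actual EHRATM interface.

```python
# Hedged sketch: chain WRF output discovery and a Flexpart-WRF run from Python.
# Paths, executable name and input-file layout are illustrative assumptions.
import subprocess
from pathlib import Path

def run_flexpart_wrf(wrf_dir: Path, run_dir: Path, flexwrf_exe: str = "flexwrf.x") -> None:
    wrfout_files = sorted(wrf_dir.glob("wrfout_d01_*"))
    if not wrfout_files:
        raise FileNotFoundError(f"no WRF output found in {wrf_dir}")

    # Point a (hypothetical) Flexpart-WRF input template at the available meteorology files.
    template = (run_dir / "flexwrf.input.template").read_text()
    input_text = template.replace("@WRF_FILES@", "\n".join(str(p) for p in wrfout_files))
    (run_dir / "flexwrf.input").write_text(input_text)

    # Launch the simulation and fail loudly if any step of the chain breaks.
    subprocess.run([flexwrf_exe, "flexwrf.input"], cwd=run_dir, check=True)

# run_flexpart_wrf(Path("wrf/run"), Path("flexpart/run"))
```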
The International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) is designed to detect nuclear explosions with a minimum yield of one kiloton of TNT equivalent worldwide. The IMS uses four verification technologies - seismic, infrasound, hydroacoustic and radionuclide - in a synergistic manner to enable the detection, location, and identification of potential nuclear explosions. Each of these technologies has advantages and limitations, and overcoming these limitations is crucial for the verification regime and for the civil and scientific applications of IMS data and International Data Centre (IDC) products. This poster will demonstrate the use of multi-technology fusion between the four technologies and other national technical means, such as satellite imagery, to overcome gaps and limitations, with examples of events in Tunisia (quarries and mines), where infrasound and seismic data are used to distinguish sources in areas with low seismic network coverage. Another example uses infrasound and satellite imagery during Stromboli eruptions, where other forms of synergy will be explained. The poster will highlight how classic synergy and other forms of synergy between technologies can fill gaps and improve results by utilizing the strengths of each technology while overcoming its limitations.
The trigger for the 2019 National Data Centre Preparedness Exercise (NPE) scenario (a fictitious nuclear explosion) was based on a real ML 3.7 shallow tectonic seismic event within an earthquake swarm in southern Germany, near the city of Constance. The event occurred at 23:17 UTC on 29 July 2019 (e.g. see the International Seismological Centre bulletin) and is not found in the International Data Centre (IDC) standard products (Reviewed Event Bulletin and Standard Event List 3) or the Late Event Bulletin, even though eight International Monitoring System (IMS) seismic stations are within 1500 km of the event and the IDC threshold monitoring detection capability for this period suggests it should have been detected. However, this event was clearly observed at IMS stations in Switzerland (DAVOX), Germany (GERES), the Czech Republic (VRAC) and the UK (EKA), as well as at numerous non-IMS stations across Europe. An event was formed by the NET-VISA algorithm and listed in the IDC NET-VISA database ("vSEL3"). We re-analyse the NPE 2019 trigger event (and selected others in this swarm of earthquakes) using both IMS and non-IMS stations and demonstrate the potentially important role of auxiliary IMS seismic stations and local seismic networks as part of expert technical analysis.
The International Monitoring System (IMS), the International Data Centre (IDC) and on-site inspection (OSI) are the technical elements of the CTBT verification regime (CTBT Article IV); they are designed to detect any nuclear explosion anywhere (underground, under water or in the atmosphere) and to investigate any findings associated with its occurrence. The role of the IDC is to operate the IMS; to receive, collect, analyse, report on and archive the data; and to apply, on a routine basis, automatic and manual data processing in order to produce and archive standard IDC products on behalf of all States Parties (these are provided to all at no cost), without prejudice to final judgments with regard to the nature of any event, which remain the responsibility of the States Parties. This poster gives a portrait of the build-up of the IDC achieved by the Provisional Technical Secretariat since the approval by States Signatories of the restructuring of the IMS and IDC Divisions.
The use of synthetic aperture radar (SAR) data is a powerful and well established method for remote sensing that enables high resolution measurement of important geophysical parameters such as surface topography and deformation or subsidence of the surface as a result of various processes (earthquakes, tectonic activity, movements of glaciers, mining activity, etc.). The present study illustrates the InSAR technique applied to recent M6+ earthquakes in the region of southeast Europe, including Greece and Turkey. Data from both Sentinel-1A and -1B are used. The two satellites share the same orbit plane and carry C band imaging radars operating in four exclusive imaging modes with different resolutions (down to 5 m). As a result, interferograms and deformation maps are built using the SNAP and ArcGIS software. For each deformation map, a profile plot of the displacement is depicted. The obtained results can be used in the evaluation of earthquake deformation, supplemented with geological and geophysical information.
Acknowledgements: This study was conducted within the project "Analysis of seismic sources using InSAR satellite data for the Balkan Peninsula region", funded by the National Scientific Program "Young Scientists and Postdoc-2" (2022) of the Ministry of Education and Science of the Republic of Bulgaria.
Compared to many other treaties and international conventions, one of the unique, in-built characteristics of the Comprehensive Nuclear-Test-Ban Treaty is its Protocol. Considering the sensitivity required to address the verification regime, having limitations and even restrictions on matters pertaining to national security is understandable. Yet listing the tools as part of the Protocol has a negative impact, particularly on on-site inspection (OSI) functions, as the accommodation of novel technologies is discouraged. National Technical Means (NTM) might bridge the gap between advanced technologies and the stalled tools listed in the Protocol. NTM could serve well in the planning stage of an OSI, as the waveform analysis of the verification regime (International Monitoring System) could be integrated with NTM inputs. However, once the inspection is initiated, the inspected State Party (ISP) could limit the flow of NTM information and, most importantly, the inspection team has to rely on the outdated tools provided in the Protocol for the inspection. To maintain the balance between achieving OSI objectives and securing national interests, the States Parties should identify an agreeable solution for upgrading OSI tools on a par with technological advancement.
Despite remote waveform analysis and nuclear signatures, an on-site inspection (OSI) remains the final verification measure for ambiguous events indicative of potential CTBT violations, and it is designed to produce conclusive evidence. Since the inspected State Party, as the potential violator, is accustomed to the procedures and tools available to the OSI, the success of an inspection depends on intuition and out-of-the-box thinking by the inspection team. In most cases the scenarios that an inspection team has to address cannot be foreseen; hence, critical thinking and prompt action produce fruitful inspection results. As the "unexpected" is the norm in such inspections, the synergy of the inspectors' observations and knowledge is of paramount importance in fulfilling the OSI task.
The COVID-19 situation made remote training on OSI tools and procedures mandatory, driving the inspectorate towards a semi-automatic mode that provided limited opportunities for team building and little room for diverse viewpoints. Previous Integrated Field Exercises (IFEs) reveal the importance of a well-coordinated, closely bonded team, with space for constructive personal initiative feeding into the ever evolving progress of an in-field inspection. The OSI regime should implement a process to assess the quality of remote training compared with in-person team building and devise a cost-effective mechanism to fill the gaps accordingly.
In practical scenarios, an on-site inspection (OSI) could be conducted in a remote location with radiation hazards. Both OSI training courses and Integrated Field Exercises should take such conditions fully into consideration, so as to build up the inspection team's operational capacity for real cases. From the perspective of inspection team functionality and inspection team health and safety, and in addition to suitable radiation protection for inspection team members, a radiation hazard prevention solution should cover not only real-time radiation monitoring of inspection team members and checks of their radiation levels at the base of operations (BOO) when they return from the inspection area, but also checks of the radiation levels of OSI equipment. This work sets out a systematic solution for radiation monitoring of both the inspection team members and OSI equipment. The equipment includes radiation protection suits and wristband dosimeters for inspection team members, hand and foot radiation checking equipment, whole body radiation checking equipment at the BOO, and radiation checking equipment for inspection equipment and support vehicles.
The International Committee on Global Navigation Satellite Systems (ICG), an international multilateral platform under the United Nations Office for Outer Space Affairs (UNOOSA), promotes voluntary cooperation on matters of mutual interest related to civil satellite based positioning, navigation, timing, and value added services. At the same time, the On-Site Inspection (OSI) Division of the CTBTO, as the neighbour of UNOOSA, is developing its essential Global Navigation Satellite System (GNSS) positioning and navigation equipment for on-site inspection. GNSS positioning and navigation equipment remains a category of equipment whose application to OSI is a pending issue in Working Group B and the policy making organs of the Comprehensive Nuclear-Test-Ban Treaty Organization. This work proposes a four-mode GNSS positioning and navigation solution for OSI, which would involve interagency efforts and all four major United Nations recognized global satellite navigation systems: GPS, GLONASS, Galileo and BeiDou. The solution is based on the ICG principles of compatibility, interoperability and transparency among all GNSS systems. It would not only provide a more reliable and practical solution for in-field OSI positioning and navigation, but would also keep the balance between technical sufficiency and practical feasibility.
The conduct of an on-site inspection (OSI) in terrain that is anything but flat has been considered an additional challenge on top of the intrinsically demanding characteristics of such a mission. An effort was made in the past by the Provisional Technical Secretariat (PTS) to identify the difficulties that are particular to the conduct of an OSI in a mountainous environment and to develop appropriate solutions, but always as part of theoretical exercises. The Field Test of geophysical techniques for the detection of deep observables conducted in the Austrian Ybbstal Alps in September 2022 represented the first OSI field activity ever conducted in a purely alpine environment. The Field Test served to understand the current capabilities of the PTS to solve the technical and operational issues faced in such an arduous setting. This paper lists the challenges encountered during the planning and execution of the Field Test, describes the actions taken to mitigate them, and identifies lessons to be learned for the successful conduct of future OSI activities in mountainous areas.
To meet the requirement to certify that on-site inspection (OSI) deployed equipment has been calibrated, maintained and protected, as mandated by paragraphs 38 and 39 of Part II of the Protocol to the CTBT, the Provisional Technical Secretariat (PTS) has introduced a number of supporting tools. This paper highlights initiatives to protect equipment and those used to support and record the maintenance and calibration of equipment. The system for maintaining and managing OSI equipment and software, EIMO, is now used on a daily basis, facilitating work at the CTBTO TeST Centre, where it is used as the central database for all OSI deployable equipment. The custom browser based application has been further expanded with the creation of new instances: a training instance for capacity building and another for the testing and development of new functionalities. Among the new features reported in this paper is the introduction of RFID tags and checks, which represent an important milestone required for the protection of equipment. To further protect deployable equipment, storage containers now have individual locks with keys stored in a two-factor authentication key cabinet, with access assigned on a key by key basis. Similar access control is also being considered for use during OSI deployment.
The behaviour of the inspected State Party (ISP) is one of the elements that has not been adequately analysed in the context of an on-site inspection (OSI), even though it has a major impact on the inspection. Generally, three distinct scenarios lead to an inspection request: an ambiguous natural event associated with a partial nuclear emission, a nuclear accident, and a violation of the Treaty through a nuclear test. In these three instances, the ISP's reaction and behaviour would be very different. Close cooperation is expected in the first case, as both the ISP and the inspection team are driven towards the common objective of assessing the cause of the incident. Partial concealment is expected in the second scenario, driven in particular by "national pride". In the third situation, should it occur once the Treaty enters into force, the behaviour of the ISP would be very different, with a minimal degree of cooperation; in such a situation, the ISP would seek to disturb the inspection while adhering to its minimum Treaty obligations. Inspection team engagement with the ISP should therefore be trained and practised for these distinct yet common scenarios to ensure an effective inspection.
An on-site inspection (OSI) is a complex forensic investigation that requires a systematic approach that considers the operational, technical and time constraints specified in the Treaty. The information-led inspection team functionality (ITF) and field team functionality (FTF) are comprehensive and systematic concepts that serve as essential guidance for the inspection team in conducting its missions during an OSI. This poster visualizes the ITF/FTF concept as three operational cycles: the logic cycle, the decision cycle and the operation cycle. It illustrates how the concepts run and guide the inspection team's daily inspection activities.
The airborne simulator, used to develop and test on-site inspection (OSI) airborne equipment configurations as well as to train surrogate inspectors, is being upgraded to provide more realistic training opportunities. The simulator, a converted Mi-2 airframe, now features adjustable window panels and is being supplemented with a projection system that will provide real world views of the terrain. The projection system, once fully implemented, will comprise a surface that extends in an arc from the nose of the airframe to the rearmost window of the fuselage, onto which real world views are projected. The projection and flight parameters will be controlled by the session tutor via a computer. Trainers will also be able to preload flight lines with information including speed, banking angle during turns, etc., and to alter flight parameters such as speed, pitch, roll and yaw in real time. The project, due for completion in 2023, will provide an enhanced training experience with realistic views of the terrain at flying heights of 100 to 1500 metres above the ground.
As part of the GIMO software platform, developed to facilitate the implementation of on-site inspection (OSI) search logic, specific applications have been developed to meet the requirements of the on-site field laboratory. Broadly, these provide a framework and tools to meet the requirements of environmental sample chain of custody and of sample management in the laboratory, including sample measurement and analysis. The architecture used as a basis for field laboratory application development, deployment and operation is summarized. Data security considerations are highlighted, with particular emphasis placed on the development of a 'kiosk' application for the laboratory chain of custody tablet, which restricts access to the relevant chain of custody application only. The workflow, together with the tools developed to facilitate the receipt of an environmental sample in the laboratory, its storage, measurement and analysis, is presented. Interactions between the GIMO field laboratory application, measurement systems and the analysis software ONIAB are highlighted. This paper also addresses the means by which reports generated for a sample are moved to the 'receiving area', reviewed, classified, associated with the sample ID and then transferred to the relevant 'receiving area' based on classification status, and then on to the 'working area'.
According to Iraqi Radiation Law No. 99 of 1980, the Radiation Protection Center (RPC) is the regulatory body in Iraq responsible for monitoring radiation workers. This is done by providing them with personal dosimetry (TLD, ED), monitoring the Iraqi environment (soil, water, air) and carrying out gamma radiation monitoring. For their health and safety, we prepared and designed a wallet card to keep the radiation doses of an on-site inspection team within the ALARA principle and to reduce potential doses from radiation and radioactive materials. The wallet card covers zone, radiation field, contamination and control, with values for each (i.e. background, safe, low, medium and very high). All values are in accordance with IAEA guidance. This wallet card is very important for the inspection team in the inspection area, because a very high zone (>10 mSv/hr) requires pre-work planning, ALARA controls, a health and safety team in the inspection area, team leader approval, a written work plan and Director General authorization. The wallet card is also supplied by the RPC to inspectors in radiological emergencies, as the RPC is a member of the National Emergency Committee in Iraq together with other ministries, such as MOST, Health and Defense.
Conducting an on-site inspection (OSI) requires a large variety of resources: human, financial, technical, etc. The preparation of human resources is a challenge due to the disciplinary diversity of the equipment and procedures used. The design of a training programme for future surrogate inspectors is an important task, because the human factor largely determines the effectiveness and success of an OSI. The goal of this work was to assess the effectiveness of the OSI Training Programme through a comparative analysis of the approaches applied and their interrelationships. The comparative evaluation of online and on-site training methods was based on six criteria, including flexibility, practical skills training, theoretical knowledge and training capacity. The weaknesses and strengths of each training method were identified. A new training concept, including optimization of the methods used, was proposed, resulting on average in a 10-15% improvement in the effectiveness of the entire training programme. New ways of online learning and its integration with on-site courses were presented. The Kirkpatrick model and the results of the first in-field operation support refresher course were used to conduct a preliminary evaluation of the new OSI training programme, which demonstrated the concept's effectiveness and cost efficiency.
The mantle transition zone is delimited by two seismic discontinuities, at 410 km and 660 km depth. These are imaged under the northwestern corner of South America using the receiver function technique and up to 30 years of records from the national seismological network of Colombia. Significant and spatially systematic variations in the discontinuity depths were observed. The mean depths of the mantle transition zone limits are 412±5.3 km and 671±5.9 km, with a mean thickness of 258±6.5 km, a value similar to the global average. The low correlation between the discontinuities is caused by the significant depth variation, and the thicker mantle transition zone in some areas is due to the subduction of the Nazca and Caribbean plates under the South American plate. On the other hand, the thinning of the mantle transition zone beneath the Nazca plate is possibly due to the presence of the Malpelo Ridge. These observations, together with seismic tomography investigations, could confirm the interaction of the Nazca and Caribbean plates within the upper mantle and the hydration of the mantle transition zone.
Various seismic precursors are known to precede small to large earthquakes. These pre-earthquake signatures, which may persist for several months or years, can be employed for earthquake forecasting. In the present investigation, precursory seismicity patterns are discussed and used to identify precursory swarms in order to forecast the location of future earthquakes in the South Central Tibet (SCT) region of the Himalaya. Seismicity data compiled from various catalogues for the period 1963-2006 with mb ≥ 4.1 have been used to understand the occurrence of three medium size earthquake sequences: 1996 (mb 5.9), 1998 (mb 5.8) and 2004-2005 (mb 6.2, 6.3). The analysis indicates that these earthquakes were preceded by well-defined patterns of precursory swarms and that seismicity varies in episodic low-high-low phases. We found two anomalous seismicity patterns with similar spatial and temporal distributions, separated by about 15 months, during January 2002-February 2003 and June-August 2004 in the same area, but without a mainshock until 2007. Based on analysis of the occurrence mechanism of the above three events and of the spatio-temporal and focal depth patterns during 2002-2003 and 2004, a localized area (29.6°-30.1°N, 87.7°-88.1°E) could be a potential zone for an impending medium size earthquake (M ≥ 6.0) in the depth range 25 ± 15 km.
Madagascar has experienced no volcanic activity in historical times. Volcanic rocks dated to the Cretaceous period are found almost entirely along the coastal zones, while Cenozoic volcanism occurs in the central part and in the north. Seismically, the central part is the most active; smoke has even been observed on the inner slope of an ancient crater. The Laboratory of Seismology and Infrasound at the IOGA uses the national seismic station network to monitor earthquakes and tsunamis. The CTBTO station OPO has an important place in monitoring earthquakes in the area: many seismic crises are recorded from the central volcanic areas, and this station is well placed to provide better detection. To study the volcanism of the central part of Madagascar, we consider in situ observations, the earthquake history and a 3-D tomography of the lithosphere. Even though Madagascar has had no volcanic activity for millions of years, the lithosphere is unstable, and the possibility of reactivation of the recent Cenozoic volcanoes is supported by the tomography results.
Seismic hazard assessments (SHA) have not been conducted in large parts of Sub-Saharan Africa (SSA) due to incomplete earthquake catalogues, sparse seismic networks and other limitations, raising concerns about the information needed for planning and disaster risk management. The aim of this study is to bridge this research gap using modern techniques for a region-wide SHA. An updated catalogue from local networks, CTBTO National Data Centres, the International Seismological Centre and publications, spanning 1615-2022 with threshold and maximum moment magnitudes (Mw) of 4.0 and 6.8, formed the dataset. The catalogue, which was declustered and harmonized to Mw, was used with available geological data to delineate area source zones and to compute earthquake recurrence parameters. Four ground motion prediction equations for regions tectonically similar to SSA were implemented in the calculation using a logic tree formalism, with all equations weighted equally. Considering a 1-1000 year period, the computed Gutenberg-Richter b-values, activity rates and regional maximum possible magnitudes ranged from 0.69 to 1.0, 1.6 to 2.1, and 5.2 to 7.2, respectively. Peak ground accelerations ranged from 0.02 g to 0.2 g for a 10% chance of exceedance in 50 years, and seismic hazard maps for 0.1 s, 0.2 s, 0.3 s, 0.5 s and 0.15 s periods were produced. The results are expected to make a significant contribution to planning in this vast region.
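The Gutenberg-Richter b-values quoted above are typically derived from the magnitude-frequency distribution of the declustered catalogue; a common estimator is the Aki-Utsu maximum-likelihood formula. The Python sketch below is illustrative only (it is not the study's code; the completeness magnitude, bin width and synthetic catalogue are assumptions) and shows how such a b-value and its first-order uncertainty can be computed.

import numpy as np

def gr_b_value(mags, m_c, dm=0.1):
    """Aki-Utsu maximum-likelihood Gutenberg-Richter b-value, with the usual
    half-bin correction for magnitudes reported in bins of width dm."""
    m = np.asarray(mags, float)
    m = m[m >= m_c]                              # complete part of the catalogue
    b = np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))
    return b, b / np.sqrt(len(m))                # value and first-order error

# Synthetic, continuous magnitudes with b = 0.9 and Mc = 4.0 (dm=0 since unbinned)
rng = np.random.default_rng(0)
synthetic = 4.0 + rng.exponential(scale=np.log10(np.e) / 0.9, size=2000)
print(gr_b_value(synthetic, m_c=4.0, dm=0.0))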
Contrary to earlier beliefs that it is aseismic, Nigeria witnessed numerous earthquakes between 2018 and 2022. The 5-7 September 2018 events, with moment magnitudes 2.5-3.7 and intensity II-V, occurred in Mpape, Abuja, about three kilometres from Nigeria's presidential villa and from critical facilities such as dams and nuclear research reactors. This study adopts an integrated investigation using seismological, geophysical, geological and space-related techniques to ascertain the causes and implications of the seismic activity. Data were acquired from field surveys, local networks and international agencies. A probabilistic seismic hazard assessment was also performed for Nigeria and its surroundings. The findings showed that the causes of the recent seismicity are tectonic, as evidence of a fault was established at Mpape. The b-value, activity rate and regional maximum possible magnitude were 0.69±0.07, 1.684±0.462 and 6.7±0.34, respectively. Annual activity rates for earthquakes of magnitude 2.0-3.0 are high, while the likelihoods of an earthquake of magnitude 5.0-6.0 occurring within one year, 50 years, 100 years and 1000 years were 0.7%, 75.81%, 99.70% and 100%, respectively. The return period of a magnitude 6.0 earthquake is 143 years. The PGAs for a 10% chance of exceedance in 50 years range from 0.01 g to 0.08 g. The results are useful for planning and as baseline parameters for establishing seismic building codes in Nigeria.
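Return periods and exceedance probabilities of the kind quoted above are linked, under the usual assumption of a homogeneous Poisson process, by P = 1 - exp(-t/T), where T is the return period and t the exposure time. The short Python sketch below illustrates this relation with assumed values; it is not the hazard code used in the study and does not reproduce the catalogue-derived percentages, which refer to a magnitude range rather than a single return period.

import math

def poisson_exceedance(return_period_yr, exposure_yr):
    """Probability of at least one occurrence within the exposure time,
    assuming events follow a homogeneous Poisson process."""
    return 1.0 - math.exp(-exposure_yr / return_period_yr)

# Illustrative values only (not the catalogue-derived numbers in the abstract):
for t in (1, 50, 100, 1000):
    p = poisson_exceedance(return_period_yr=143.0, exposure_yr=t)
    print(f"{t:5d} yr exposure -> P(>=1 event) = {100 * p:.1f}%")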
The Tehri dam is an earth- and rock-fill dam built across the Bhagirathi River, situated in the 700 km long seismic gap between the 1905 Kangra earthquake in the west and the 1934 Bihar-Nepal earthquake in the east of the Himalaya. The purpose of the dam is to provide irrigation and generate hydroelectricity for nearby areas. It is operated and maintained by the Tehri Hydro Power Corporation and monitored by the Department of Earthquake Engineering, IIT Roorkee. The hypocentre parameters of three local events (1.3 ≤ Mw ≤ 3.2) that occurred around the Tehri dam region were estimated using the HYPOCENTER program in the SEISAN software, and the code EQK_SRC_PARA was used to estimate earthquake source parameters, in which a source model is fitted to the displacement and acceleration spectra. The seismic moments (M0), Brune stress drops and source radii range from 2.2×10^18 to 6.6×10^20, 1 to 63 bars and 65.1 m to 667.4 m, respectively. The maximum stress drop of 63 bars is observed at a source radius of 146.7 m. The radiated seismic energy varies from 1.1×10^14 to 3.9×10^16 and follows the relationship with seismic moment Es = 3×10^-5 M0^1.013. The study indicates that the seismic moment has an increasing trend with increasing source radius.
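For readers unfamiliar with the Brune source model referenced above, the stress drop follows from the seismic moment and source radius as Δσ = 7M0/(16r³), with the radius commonly derived from the corner frequency as r = 2.34β/(2πfc). The Python sketch below is illustrative only: the moment, corner frequency and shear wave speed are assumed values in SI units, not results of the study.

import math

def brune_stress_drop(m0_nm, radius_m):
    """Brune (1970) stress drop (Pa) from seismic moment (N*m) and source
    radius (m): delta_sigma = 7*M0 / (16*r**3)."""
    return 7.0 * m0_nm / (16.0 * radius_m ** 3)

def brune_radius(fc_hz, beta_ms=3500.0):
    """Source radius (m) from corner frequency: r = 2.34*beta/(2*pi*fc)."""
    return 2.34 * beta_ms / (2.0 * math.pi * fc_hz)

# Illustrative example (assumed values, not taken from the study):
r = brune_radius(fc_hz=8.0)                          # ~160 m
dsig_pa = brune_stress_drop(m0_nm=5.0e13, radius_m=r)
print(f"radius ~ {r:.0f} m, stress drop ~ {dsig_pa / 1e5:.1f} bar")  # 1 bar = 1e5 Pa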
Nyepi, the Balinese Day of Silence, is a rare occasion worldwide, existing only in Bali, during which all outdoor human activity stops for a day. This study used Nyepi to measure its impact on ambient noise in Denpasar, the capital of Bali province. We used broadband and short-period seismometers operating 24 hours a day to measure the difference before, during and after Nyepi. Processing of the signal data with the Horizontal-to-Vertical Spectral Ratio (HVSR) method shows an increase in the dominant frequency during Nyepi, although it fluctuates every hour. Overall, the dominant frequency over the three days lies in the range 1.60-1.70 Hz. The amplification factor on the Nyepi day is at its lowest value compared with the days before and after Nyepi, and the seismic vulnerability index during Nyepi is likewise at its lowest compared with the average before and after Nyepi. The seismic vulnerability index during Nyepi is 23.998, which is 3.906 lower than the average of the days before and after Nyepi. This suggests that a decrease in human activity leads to a decrease in the seismic vulnerability index.
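The seismic vulnerability index discussed above is commonly computed in the Nakamura sense as Kg = A0²/f0, where A0 and f0 are the HVSR peak amplitude and its frequency. The sketch below is a simplified illustration, not the processing chain used in the study; the window length, lack of smoothing and analysis band are assumptions.

import numpy as np

def hvsr(ns, ew, ud, fs, nfft=4096):
    """Crude HVSR estimate for one window: geometric mean of the horizontal
    amplitude spectra divided by the vertical spectrum (no smoothing)."""
    freqs = np.fft.rfftfreq(nfft, d=1.0 / fs)
    taper = np.hanning(nfft)
    spec = lambda x: np.abs(np.fft.rfft(np.asarray(x[:nfft], float) * taper))
    h = np.sqrt(spec(ns) * spec(ew))
    return freqs[1:], h[1:] / spec(ud)[1:]       # drop the zero-frequency bin

def vulnerability_index(freqs, ratio, fmin=0.5, fmax=20.0):
    """Nakamura-style Kg = A0**2 / f0 from the HVSR peak within a band."""
    band = (freqs >= fmin) & (freqs <= fmax)
    i = np.argmax(ratio[band])
    f0, a0 = freqs[band][i], ratio[band][i]
    return f0, a0, a0 ** 2 / f0

In practice the ratio would be averaged over many windows per day before comparing the days before, during and after Nyepi.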
Seismic events recorded by the Northwest Mexico Seismic Network are initially located by an automatic system and then manually refined by an analyst. However, this procedure is far from ideal because the considerable dispersion observed in the earthquake cluster does not allow precise source locations. To obtain higher resolution hypocentral locations, and thus better constrain the tectonics of the zone, the double-difference algorithm (HypoDD) will be applied to a seismic sequence near the town of Valle de la Trinidad, northern Baja California. This sequence began on 17 August 2020 with an earthquake of local magnitude ML = 5.1; the catalogue used for this study extends to June 2022 and comprises a total of 888 seismic events. The seismicity is still active today, spanning magnitudes 1 ≤ ML ≤ 5.1. Seismic source parameters of the sequence will be calculated, with particular attention to the seismic moment (Mo), to examine the behaviour of the Mo-ML relationship that several authors have reported as non-linear for southern California and northern Baja California, specifically in the magnitude range 4 ≤ M ≤ 6.8.
The Jordan Seismological Observatory (JSO) participated in a workshop in Nepal on improved seismic event location using the regional seismic travel time (RSTT) method. The JSO contributed a list of selected seismic events to be analysed and tested.
After selecting the best matching event from the list and ensuring it was compatible with the RSTT module specifications, the resulting RSTT event location was not within the expected range. This was because of the large azimuthal gap due to the lack of station coverage from all directions.
The JSO plans to address this issue by reducing the azimuthal gap through the addition of stations from Lebanon. Additionally, a study will be undertaken to add new seismic stations where coverage is lacking and to incorporate data from the International Data Centre or from other seismological centres.
The transverse Vlora-Elbasan-Dibra (VED) fault zone, with a northeast strike, dislocates the structure of the Albanides across their entire width. This transverse fault is the most active zone and has generated earthquakes along its entire length. This study covers the period from 2004 to 2022, chosen because it coincides with the increase in installations of digital broadband stations. The seismicity that occurred along the VED was well detected by the Albanian Seismological Network and the Hellenic Unified Seismic Network, and moderate earthquakes were recorded by almost all seismic stations in the region. The data used in this study are broadband waveforms retrieved from online FDSN services belonging to different networks, analysed using the Pyrocko package. This seismic zone presents interesting characteristics, being activated in different parts: from the Dumre diapir, where faulting is complex but dominated by dextral strike-slip mechanisms; from Elbasani to the Dibra region, where it behaves as a normal fault with a dextral strike-slip component; and from Dibra to Mavrovo in North Macedonia, where it acts as a normal fault striking NE.
E-poster session with display of each e-poster on an assigned touchscreen
E-poster session with display of each e-poster on an assigned touchscreen
E-poster session with display of each e-poster on an assigned touchscreen
We investigated the seismic attenuation characteristics of the Lower St. Lawrence seismic zone. This zone is located ~400 km downstream from Quebec City, between the Quebec North Shore and the Lower St. Lawrence. Coda Q was determined using 847 earthquakes (2.0 ≤ M ≤ 5.1) recorded at ten stations of the Canadian National Seismic Network (CNSN) in Quebec from 1985 to 2022. We find that the lowest overall average Q0 (Q at 1 Hz) values occur at the three stations (GSQ, ICQ and SMQ) within 100 km of a moderate mN 5.1 earthquake in 1999 (Q0 of 81, 88 and 80, respectively). We determined temporal variations in attenuation following the 1999 earthquake: the overall average Q0 decreased from 87 before the mainshock to 77 afterwards at GSQ, from 92 to 85 at ICQ and from 88 to 82 at SMQ. These results are in agreement with global studies that show a decrease in Q0 following a major earthquake, likely the result of increased fracturing and fluids in the epicentral region. An average over all the data yields the relationship Qc = 86 f^1.07 for the frequency band of 2 to 16 Hz for the entire region.
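The frequency dependence quoted above, Qc = 86 f^1.07, corresponds to fitting the power law Qc(f) = Q0 f^n to coda Q estimates obtained in individual frequency bands. A minimal Python sketch of such a fit (illustrative, with synthetic inputs; not the processing chain used in the study) is:

import numpy as np

def fit_coda_q(freqs_hz, qc_values):
    """Fit the power law Qc(f) = Q0 * f**n in log-log space by least squares."""
    logf = np.log10(np.asarray(freqs_hz, float))
    logq = np.log10(np.asarray(qc_values, float))
    n, log_q0 = np.polyfit(logf, logq, 1)        # slope n, intercept log10(Q0)
    return 10 ** log_q0, n

# Illustrative check against the regional relation quoted above, Qc = 86 f^1.07
f = np.array([2.0, 4.0, 8.0, 16.0])
qc = 86.0 * f ** 1.07
print(fit_coda_q(f, qc))   # recovers (86.0, 1.07)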
Location algorithms have relied on one-dimensional (1-D) velocity models for fast seismic event locations. The fast computational speed of these models made them the preferred type of velocity model for operational needs. Three-dimensional (3-D) seismic velocity models are becoming readily available and usually provide more accurate event locations than 1-D models. The computational requirements of 3-D models, however, tend to make their operational use prohibitive. Comparing location accuracy across 3-D seismic velocity models also tends to be problematic, as each model is determined using a different ray-tracing algorithm, and attempting to use a different algorithm than the one used to develop a model usually results in poor travel time prediction. We have previously demonstrated and validated the ability to quickly create 3-D travel time correction surfaces using an open-source framework (PCalc+GeoTess, www.sandia.gov/salsa3d, www.sandia.gov/geotess) that stores spatially varying data, including 3-D travel time data. This framework overcomes the ray-tracing algorithm hurdle because the lookup tables can be generated using the preferred ray-tracing algorithm. We have created first-P 3-D travel time correction surfaces for several publicly available 3-D models (e.g. RSTT, SALSA3D, G3D, DETOX-P2). We demonstrate the use of these correction surfaces to compare models fairly and consistently for seismic location accuracy via a set of validation events and International Monitoring System stations.
Unknown to most Egyptians, earthquakes pose the greatest damage potential of all natural hazards. Large scale events are fortunately quite rare; however, when they strike, they can cause far reaching and very costly damage, potentially leading to hundreds or even thousands of fatalities. So far, earthquakes cannot be prevented or even reliably predicted, but thanks to extensive research much is now known about how often and how intensely the earth could shake at a given location.
Probabilistic seismic hazard analysis (PSHA), at the national level, enables societies to make well informed decisions on earthquake safety. On a technical level, a PSHA defines, for building engineers, the kind of ground motions that can be expected for an earthquake and couples them to the response of the local soil and the building characteristics. The design response spectrum, considering the different soil conditions, has been estimated for five selected sites along the northwestern coast of Egypt. Seismic hazard maps and design response spectra are presented as selected hazard output for the five cities (Alexandria, Alamein, Dabaa, Marsa Matrouh and Negelah).
Jordan's seismic activity is attributed to the Dead Sea Fault, an active transform fault extending from the Red Sea to the Taurus/Zagros Mountains. In cooperation with the Comprehensive Nuclear-Test-Ban Treaty Organization, we used available local data, IMS data and arrival times of seismic events with magnitudes greater than 3.5 to validate location accuracy. The selected events are local and regional and must satisfy the criteria of Bondár et al. (2004) to be designated as ground truth events.
The Caucasus region lacked a comprehensive catalog, despite its role in Arabia-Eurasia convergence between the Black and Caspian Seas. The Lawrence Livermore National Laboratory and the Institute of Earth Sciences (IES) at Ilia State University generated a new, comprehensive seismic catalog for the period from 1951 to 2019 for the Caucasus region by combining data in the IES bulletin with bulletins of the Republic Seismic Survey Center of Azerbaijan, monitoring centers in Turkey and Armenia, and the ISC.
We present ~20,000 newly relocated events in this bulletin. We relocated each event using the single-event location algorithm iLoc and regional seismic travel time predictions and identified GT events. We relocated the entire seismicity of the Caucasus region with the multiple-event location algorithm Bayesloc, using the iLoc results as initial locations and the GT events as constraints.
We show that each relocation step leads to significant improvements, as indicated by the tightening of event clusters. The improved view of the seismicity reveals a narrow band of crustal events along the southern flank of the Greater Caucasus that we interpret as a megathrust, and confirms both a region of deep seismicity beneath the northeastern Caucasus and a possible area of slab detachment in the central part of the range.
The objective of this study is to characterize the recent stress distribution on the Main Marmara Fault (MMF) through quasi-static modelling of inter-seismic strain accumulation, taking into account past earthquakes and heterogeneous interseismic coupling. Since the MMF is prone to producing large earthquakes, it is crucial to infer the state of stress on the fault. The obtained pre-stress distribution will serve as a basis for simulating dynamic earthquake ruptures via the derived stress changes and for calculating a realistic map of peak ground velocity. Given the lack of observed data for future events, International Monitoring System seismic stations may be used to verify future ground motion simulations for destructive earthquakes. To calculate the accumulated stress distribution on the locked part of the MMF, interpolated GPS velocities and slip rates along the unlocked portions of the fault interface are used as boundary conditions. Previous studies of interseismic coupling and of seismicity, including repeating earthquakes, are also considered. A three-dimensional FEM mesh for the region is built with a mesh size of ~500 m. Heterogeneous stress distributions are thus obtained through detailed use of recent geodetic investigations. Because the magnitude of stress cannot be measured within the Earth's crust, such heterogeneous stress change calculations are crucial.
The San Miguel volcano is considered one of the most active volcanoes in El Salvador due to its multiple eruptions; however, its structural properties are not fully understood. Four broadband seismometers were deployed by the Ministry of Environment of El Salvador from February 2014 to April 2014. We analysed ambient noise data using the spatial autocorrelation (SPAC) method and seismic interferometry technique. The SPAC method enabled us to calculate the phase velocity of the surface waves from 0.2 to 1.0 Hz. We derived the SPAC coefficients for each sensor-to-sensor pair (1.5–5.5 km). We directly converted the derived SPAC coefficients to Rayleigh wave phase velocities between 0.2 and 0.4 Hz and inferred phase velocities above 0.4 Hz using the zero-crossing frequencies. We also calculated Rayleigh wave group velocities with seismic interferometry, for each sensor-to-sensor pair. The combined use of the two methods offered ways to gain information about the subsurface seismic velocity structure from the same dataset. Considering the fundamental mode phase and group velocities, the resultant dispersion curve was obtained in a frequency band of 0.2-1.3 Hz. Our results made it possible to perform a joint inversion of phase and group velocities to obtain the S wave velocity structure of the volcano.
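The zero-crossing step mentioned above exploits the fact that the azimuthally averaged SPAC coefficient for a station pair at separation r follows J0(2πfr/c(f)), so each observed zero crossing maps to a phase velocity through the corresponding zero of the Bessel function J0. The short Python sketch below uses assumed zero-crossing frequencies and an assumed station separation, not values obtained at San Miguel.

import numpy as np
from scipy.special import jn_zeros

def phase_velocity_from_zero_crossings(zero_freqs_hz, r_km):
    """Rayleigh phase velocities (m/s) from the zero crossings of the SPAC
    coefficient rho(f) = J0(2*pi*f*r/c): at the m-th zero-crossing frequency
    f_m, c(f_m) = 2*pi*f_m*r / z_m, with z_m the m-th zero of J0."""
    r_m = 1000.0 * r_km
    z = jn_zeros(0, len(zero_freqs_hz))
    return 2.0 * np.pi * np.asarray(zero_freqs_hz, float) * r_m / z

# Assumed zero-crossing frequencies for a 1.5 km station pair (illustrative only)
print(phase_velocity_from_zero_crossings([0.5, 1.2], r_km=1.5))   # ~2 km/s each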
In 2018, a series of large damaging earthquakes occurred close together in Lombok, Indonesia, causing many casualties and extensive property damage. We therefore constructed source models composed of asperities for the 2018 Lombok earthquake sequence, estimated from strong motion data with the empirical Green's function method. We simulated each target event using a smaller event in the surrounding area, and determined the source model parameters by comparing synthesized and observed broadband ground motions. The best fitting size and rupture starting point of the strong motion generation area (SMGA) were obtained through a grid search. We found that the pulse waveforms extend radially toward the bottom right-hand direction due to the forward directivity effect. Furthermore, the foreshock, the mainshock and the largest aftershock appear to be related: they may have triggered one another and share similar source characteristics and rupture directions.
Seismic arrays were developed in the 1960s mainly to enhance the signal-to-noise ratio. Since then, many arrays have been deployed around the world with varying geometry and aperture, depending on the purpose of the study, and many array analysis techniques have been developed, such as beamforming and F-K analysis, to achieve the desired results from the data. The Hoqain seismic array in Oman is a local small-aperture array deployed in March 2015, with nine three-component stations and a total aperture of around 2 km, set in the complex geological setting of ophiolite and sedimentary sequences in northern Oman. A Python script is used to calculate beam power by scanning the data every second around the frequency of interest. This study involves searching for the slowness and back azimuth values that give the optimal beam from the nine stations. An STA/LTA detector is applied for automatic picking on the continuous data stream. The study is able to detect some minor earthquakes around the Oman Mountains that our local network is not dense enough to detect. Some of these events can be correlated with known rock mining activity, while others could be of tectonic origin.
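For context, the beam power calculation mentioned above amounts to delay-and-sum stacking over a grid of horizontal slowness vectors, with an STA/LTA detector run on the resulting beams or single traces. The Python sketch below is a simplified illustration with assumed window lengths, sign conventions and array geometry handling; it is not the script used in the study.

import numpy as np

def delay_and_sum_beam(traces, coords, slowness, fs):
    """Shift-and-stack beam for a small-aperture array.
    traces: (n_sta, n_samp); coords: (n_sta, 2) east/north offsets in km;
    slowness: (sx, sy) horizontal slowness in s/km; fs: sampling rate in Hz.
    np.roll gives a circular shift, which is adequate for a sketch."""
    beam = np.zeros(traces.shape[1])
    for trace, (x, y) in zip(traces, coords):
        delay = int(round((x * slowness[0] + y * slowness[1]) * fs))
        beam += np.roll(trace, -delay)
    return beam / traces.shape[0]

def sta_lta(x, fs, sta_win=1.0, lta_win=30.0):
    """Classic STA/LTA characteristic function on the squared trace."""
    e = np.asarray(x, float) ** 2
    sta = np.convolve(e, np.ones(int(sta_win * fs)) / (sta_win * fs), mode="same")
    lta = np.convolve(e, np.ones(int(lta_win * fs)) / (lta_win * fs), mode="same")
    return sta / np.maximum(lta, 1e-20)

# Beam power over a slowness grid then picks the best slowness and back azimuth:
# power(sx, sy) = np.sum(delay_and_sum_beam(traces, coords, (sx, sy), fs) ** 2)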
El Salvador is located in northern Central America, along the Pacific Ocean margin. San Salvador lies in a region with a high rate of seismic activity, as it is part of the El Salvador Fault Zone (ESFZ). Currently, there is no detailed information on the stress and deformation regime in San Salvador. The present project is therefore oriented toward the analysis of the local seismotectonic characteristics, which are intended to be essential inputs for the analysis of seismic hazard and risk in the capital. It is proposed to produce stress and deformation maps based on the inversion of focal mechanisms and on data from the Global Navigation Satellite System (GNSS), to carry out a statistical analysis of the seismicity recorded from 1984 to 2021, and to correlate macroseismic data with the stress regime. With the inversion of the focal mechanisms, we seek to obtain the principal stress axes in the area and to calculate the shape factor, which is a measure of the relative magnitude of the predominant stresses. Finally, the classification and inversion of focal mechanisms in the study area will allow a better understanding of the seismic source, whether volcanic, tectonic or anthropogenic.
Seismic site effects are a dominant factor in seismic hazard analysis. We present an experimental study of microtremor data to investigate the dynamic characteristics of soil and structures in the Chiang Mai basin, northern Thailand. The Chiang Mai basin is underlain by terrace and alluvial sediments. Horizontal-to-vertical spectral ratio (HVSR) analyses of ambient noise data at 101 sites were processed, quality controlled and then interpreted in terms of the amplification factor and fundamental resonance frequency. The results indicate low resonance frequencies, between 0.15 and 0.4 Hz, in the middle of the basin and indented towards its west, in proximity to the basin depocentre. The western edge of the basin shows distinctly low frequencies before the highland, indicating that this edge has a steep slope. The amplification factor ranges from three to five in the middle of the basin. We also evaluate the shear wave velocity (Vs30) using an HVSR inversion technique: most of the basin area is classified as site class D soil (stiff soil), associated with the alluvial sediments, while class C soil (very dense soil), conforming to the Quaternary sediment area, is located on the eastern edge of the basin.
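The link between the low resonance frequencies and the basin depocentre follows from the quarter-wavelength approximation H ≈ Vs/(4f0), and the site classes quoted correspond to the usual Vs30 boundaries. The short sketch below illustrates both relations with assumed placeholder values (average sediment velocity and Vs30 are not results of the study).

def sediment_thickness(f0_hz, vs_ms):
    """Quarter-wavelength estimate of sediment thickness: H ~ Vs / (4 * f0)."""
    return vs_ms / (4.0 * f0_hz)

def vs30_site_class(vs30_ms):
    """Vs30-based site class using the usual NEHRP boundaries (m/s)."""
    if vs30_ms > 1500.0: return "A (hard rock)"
    if vs30_ms > 760.0:  return "B (rock)"
    if vs30_ms > 360.0:  return "C (very dense soil / soft rock)"
    if vs30_ms > 180.0:  return "D (stiff soil)"
    return "E (soft soil)"

# Example with assumed values: f0 = 0.2 Hz over sediments with average Vs ~ 600 m/s
print(sediment_thickness(0.2, 600.0), vs30_site_class(250.0))   # ~750 m, class D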
This research studies earthquake strong motion. Since distance is a key factor affecting peak ground acceleration (PGA), PGA measurements from earthquake events were collected, selected and analysed to derive ground motion prediction equations (GMPEs) for Thailand, in order to estimate the seismic hazard and the affected area during a future earthquake. The equations can be used to improve the accuracy of seismic hazard maps of Thailand. The data set comprises 465 values from seismic stations for 28 local earthquakes of magnitude 3.0 to 6.4, at distances of 10 km to 600 km from the epicentre. To maximize the coefficient of determination (R2) for peak ground acceleration, the GMPEs were divided into three equations according to earthquake magnitude: less than 4.0, between 4.0 and 4.9, and 5.0 and above. The resulting R2 values are 86.63%, 86.17% and 57.96%, respectively. Comparison with six GMPEs from other countries shows that Jain's equation conforms to the results of this research for all three equations. Therefore, these equations and Jain's equation can be used as alternative GMPEs.
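A GMPE of the kind described above is typically obtained by regressing the logarithm of PGA against magnitude and distance. The minimal least-squares sketch below uses a generic functional form (an assumption for illustration, not the equations derived in this research) and also returns the coefficient of determination.

import numpy as np

def fit_gmpe(mag, dist_km, pga_g):
    """Least-squares fit of a simple attenuation form
    ln(PGA) = c1 + c2*M + c3*ln(R); returns the coefficients and R**2."""
    mag = np.asarray(mag, float)
    y = np.log(np.asarray(pga_g, float))
    A = np.column_stack([np.ones_like(mag), mag, np.log(np.asarray(dist_km, float))])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coeffs
    return coeffs, 1.0 - resid.var() / y.var()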
Variations in strain/stress and fluid content can change seismic velocities in the subsurface. Monitoring velocity changes, e.g. using ambient seismic noise, may thus constrain these variations as well as the material elastic properties and their non-linear behaviour. In our study we investigate variations of seismic velocity on a short time scale. We use coda wave interferometry to inspect continuous data from the GERES array (CTBTO network) in southern Germany. This yields relative seismic velocity changes (dv/v) that show temporal variations on the order of 10^-4. Spectra of the velocity time series contain strong daily and sub-daily components, indicating that the daily and sub-daily changes in seismic velocity are primarily caused by the coupling of atmospheric processes and the solid Earth. We also note the influence of temperature changes on the daily variations, but as a second-order effect. The explanatory model focuses on depth variations of the groundwater table, linking atmospheric pressure (loading and unloading the Earth's surface) to variations in seismic velocity. Our results highlight an important environmental influence on seismic velocity that needs to be considered before seismic velocity variations can be used to inspect fluid and stress variations in situ.
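The dv/v estimates mentioned above are commonly obtained with the stretching method: the current coda is resampled on a stretched time axis, and the stretch factor that maximizes the correlation with a reference trace approximates the relative velocity change. A minimal Python sketch (illustrative; the grid range and interpolation choices are assumptions, and this is not necessarily the exact implementation used in the study) is:

import numpy as np

def stretching_dvv(ref, cur, fs, eps_grid=np.linspace(-5e-3, 5e-3, 201)):
    """Stretching-method dv/v: resample the current coda on a stretched time
    axis and take the stretch factor that maximizes correlation with the
    reference; under a homogeneous velocity change this factor approximates dv/v."""
    t = np.arange(len(ref)) / fs
    best_eps, best_cc = 0.0, -np.inf
    for eps in eps_grid:
        stretched = np.interp(t * (1.0 - eps), t, cur)   # cur evaluated at t*(1-eps)
        cc = np.corrcoef(ref, stretched)[0, 1]
        if cc > best_cc:
            best_eps, best_cc = eps, cc
    return best_eps, best_cc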
We present a new 3-D shear velocity model of the crust and uppermost mantle structure beneath the Caribbean region, from the surface down to 150 km depth. The model was derived from a joint inversion of group and phase velocity dispersion data obtained from ambient noise and earthquake records. The group and phase dispersion curves estimated from ambient noise were calculated from cross-correlations of up to four years of continuous data. Perturbations in group and phase surface wave velocities, at a resolution of 1x1 degrees, delineate the relevant geotectonic units of the Caribbean plate. Plate boundaries, ocean basins, rises, rifts and microplates are well defined by shear wave velocity contrasts. The 3-D shear wave velocity inversion along profiles shows the thickening of the crust from the ocean to the continental margins. We present a new Moho interface map, with depths undulating between 11 km and 17 km beneath most of the sea and 25 km to 45 km below the continental areas. Low velocity zones were found in the uppermost mantle, indicating a highly laterally heterogeneous area.
This study aims to rapidly estimate earthquake magnitude using vertical component data from broadband seismographs, for the improvement of tsunami warnings. Empirical relationships with moment magnitude were obtained for displacement and for integrated displacement. Seismic records were selected before estimating the amplitude relationships, by checking the number of data points to reject records with gaps and the maximum amplitude to eliminate spikes. We compared the magnitudes obtained from the displacement formula (MD) and the integrated displacement formula (MID) and found that MID yielded better estimates than MD. MID also produced appropriate estimates for earthquakes with strike-slip focal mechanisms; for deep earthquakes, on the other hand, MD yielded better estimates. In the case of the 2010 Mentawai tsunami earthquake, MID produced an underestimate, although the estimate was obtained no more than three minutes after the earthquake origin time.
Ghana is positioned far from any active plate boundary, on the southeastern part of the West African Craton. Elmina, Cape Coast, Saltpond and Winneba are the four main towns located along the coast of the Central region, one of Ghana's sixteen regions. Most regions are seismically stable, with the exception of Greater Accra, Eastern and part of the Volta region, which are notable earthquake hotspots. The Central region has been subjected to damaging earthquakes since 1615, when an earthquake was felt in Elmina along the coast near Cape Coast. Tremors have recently been felt by residents of Cape Coast, specifically on the University of Cape Coast campus. Although there were no recordings at the Ghana Geological Survey to support this, verbal communication and the observation of some of their structures point to seismic activity in the area. Moreover, a compilation of data from the International Data Centre of the Comprehensive Nuclear-Test-Ban Treaty Organization reveals recorded seismic occurrences in the Gulf of Guinea along the coast of Ghana, with the intensity felt on land. The measured local magnitudes (ML) range from 2.0 to 4.1, and the body wave magnitudes (mb) range from 2.0 to 4.6.
Seismic moment tensor (MT) inversions have become an important method for characterizing source type (e.g. earthquake, explosion, collapse), seismic moment and depth. However, these methods commonly use average plane-layered one-dimensional (1-D) Green's functions (GFs). For areas where path-specific structure is complex, 1-D GFs cannot fit the observed waveforms, and source parameters from MT inversions are poorly resolved. These problems are compounded at shorter periods and/or for longer paths where structural effects accumulate. We are developing improved three-dimensional (3-D) models on regional and continental scales using adjoint waveform tomography. These models provide improved simulations of regional distance waveforms compared with 1-D models, particularly at far-regional distances (greater than about 1000 km). MT inversions using 3-D GFs show greater variance reductions (improved waveform fit) and smaller phase delays between observed and simulated waveforms. We have developed models for the western United States, where we have good data coverage for the tomography and ample source types to evaluate MT inversion performance. Results for other regions will also be presented at the conference. We are particularly attuned to MT inversions in tectonically complex parts of the world with sparse regional observations, such as stations of the International Monitoring System.
Aswan is a very important city in southern Egypt, with several cultural heritage sites and critical infrastructure, including the High Dam. The High Dam is located near one of the most seismotectonically active zones, the Kalabsha fault zone, which experienced the strongest instrumental earthquake in Aswan in November 1981, with Mw = 5.8. The current study aims to investigate the seismotectonic setting of Aswan by (1) updating the fault plane solution catalogue through the construction of focal mechanism solutions (FMS) for earthquakes that occurred in the vicinity of Aswan from 2012 to 2022 and (2) performing a stress tensor inversion using the FMS. Aswan is also known to have suffered two historical earthquakes along the Kalabsha fault. In this study, the Coulomb stress change was estimated for these earthquakes to determine whether they occurred along the Kalabsha fault and to assess their relation to the 1981 earthquake.
Java Island is an area of high earthquake activity, driven in part by the collision of the Indo-Australian Plate with the Eurasian Plate, which creates a subduction zone along Java. Besides the subduction zone, Java also has earthquake sources associated with active faults. This study aims to image tectonic patterns based on the structure of wave velocity models. The data used are P wave travel times from 9238 earthquakes between January 2009 and December 2020, recorded by a network of 128 seismic sensors. The initial velocity model used in this study is the one-dimensional (1-D) global velocity model AK135, and the simultaneous inversion was performed with the SIMULPS12 code. The results of the tomographic inversion show several tectonic patterns. In addition, earthquakes associated with volcanic activity and shallow earthquakes associated with fault lines were detected. The horizontal tomograms successfully depict the distribution of volcanic structures beneath the volcanic arc in the study area, while the vertical tomograms depict a slab that becomes sharper at depths of less than 250 km and a partial melting structure beneath the volcanoes.
Sabah is the most seismically active state in Malaysia, having recorded a higher number of moderate seismic events over the past decades. The seismicity map of Sabah shows two zones of distinctive seismicity: Ranau, in the Kota Kinabalu area, and Lahad Datu, in the southeast of Sabah. The International Monitoring System (IMS) network set up by the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has successfully detected seismic events that occurred in Sabah over the past decades. This paper aims to quantify the recurrence periods and probabilities of occurrence of earthquakes in Sabah using IMS seismic data. The Type-I extreme value distribution was applied to the maximum magnitude data, enabling quantification of recurrence periods and probabilities of occurrence for any given earthquake magnitude. The findings could be used to further assess the impact of seismic events and to assist relevant entities in disaster management planning in Sabah.
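The Type-I extreme value (Gumbel) analysis referred to above fits the distribution of annual maximum magnitudes and converts it into return periods and occurrence probabilities. The following Python sketch is illustrative only (it uses scipy's Gumbel fit and an assumed 50 year exposure, and is not the authors' implementation).

import numpy as np
from scipy import stats

def gumbel_recurrence(annual_max_mags, m, exposure_yr=50):
    """Fit a Type-I extreme value (Gumbel) distribution to annual maximum
    magnitudes; return the return period of magnitude m and the probability
    of at least one occurrence within the exposure time."""
    loc, scale = stats.gumbel_r.fit(np.asarray(annual_max_mags, float))
    f = stats.gumbel_r.cdf(m, loc=loc, scale=scale)   # P(annual maximum <= m)
    return 1.0 / (1.0 - f), 1.0 - f ** exposure_yr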
Geophysical data have been gathered and assessed to achieve reliable earthquake relocation. These data serve as input to the Regional Seismic Travel Time (RSTT) global model workflow, in order to achieve accurate RSTT results in Venezuela and the Venezuelan part of the Caribbean Sea. The data sets are extensive, including compilations of earthquakes, mine blasts and seismic shots from various seismic projects carried out over recent decades, which would significantly improve the RSTT model and the use of International Monitoring System (IMS) data for relocation in the region. Most of these data sets have previously served for geological characterization and basin analysis within the oil exploration industry, geohazard assessment and local velocity model building for a better understanding of seismic risk. Proper data integration, building on previous integration efforts, would allow consistent Moho depths to be obtained that best describe the lateral variations and heterogeneities within the Venezuelan crust, which would in turn benefit seismic phase picking for the National Data Centre and for IMS data.
The purpose of this study is to improve the hazard model and the accuracy of the Romanian earthquake catalogue (ROMPLUS) by determining a one-dimensional (1-D) velocity model in a complex collisional setting at the bend of the Southern Carpathians. To achieve this aim, we investigate 314 low to moderate sized local crustal events (Mw 1.0-3.4) that occurred from 2019 to 2021. The waveforms were recorded by the permanent stations of the Romanian Seismic Network (RSN) at distances of up to 200 km. The P and S wave travel times determined in the selected distance range were inverted using the VELEST algorithm and the IASP91 reference velocity model. The resulting high precision 1-D velocity model and the station corrections were then used to relocate the seismic events of the ROMPLUS catalogue within the selected area. Our results show a significant decrease in RMS location errors and, in agreement with previous findings, a good correlation with the local geological structure. The relocated events therefore place key constraints on natural seismicity patterns, representing a first step towards more detailed seismotectonic analyses.
Romania's participation in the global system for the verification of nuclear tests by seismological means is carried out within the National Institute for Earth Physics, which hosts the National Data Centre of Romania (ROM-NDC) and operates and maintains the auxiliary seismic station Muntele Rosu (AS081, MLR) as part of the International Monitoring System (IMS). The main purposes of this study are, first, to compare the reviewed event bulletins (REBs) produced by ROM-NDC with those produced by the International Data Centre (IDC) and, second, to highlight the use of the MLR station in IDC products for local and regional earthquakes as well as in the ISC bulletins on a global scale. To achieve these goals, we analyse seismic events with magnitude ML above 3.5 that occurred in 2021 on Romania's territory and in surrounding areas, applying statistical methods and processing and comparing the data recorded by the Romanian Seismic Network and the IMS network for these events using the Antelope and Geotool software packages. The comparison between the IDC and ROM-NDC bulletins shows that a significant percentage (approximately 50%) of the events with magnitude ML above 3.5 is missing from the IDC REB solutions.
Macroseismic intensities are a key input supporting seismic hypocentre location; moreover, this kind of information samples the Modified Mercalli Intensity (MMI) felt in each city, which can vary along the earthquake ray path. In our country, MMI data have been collected since the foundation of the OSC-NDC; however, over the last 20 years this task was discontinued owing to the lack of a collection protocol. To resolve this issue, since 2018 we have redesigned the intensity form for the public to fill in, producing visually appealing questions with check box options. When the COVID-19 pandemic forced institutions to migrate to digital and teleworking procedures, we implemented a digital form distributed through well tested information technologies such as WhatsApp and Google Forms. The MMI computation output goes to a database from which an expert can draw intensity curves and evaluate the effects across a city. The latest results show that the new digital form works efficiently and improves the intensity maps of the region that felt the earthquake.
This research reviews seismic studies from historical times up to the present day to understand the behaviour of earthquake occurrence in Azerbaijan and the interrelation between geodynamics and seismicity. Azerbaijan is situated within the central part of the Mediterranean active belt, with seismicity governed by the intensive geodynamic interaction of the Eurasian and Arabian lithospheric plates. Azerbaijan is characterized by high seismicity, with a great number of historical strong earthquakes of M ≥ 6. Developments in seismological instrumentation and power systems have allowed the number of seismic stations in the country to increase since the 2000s, enabling high quality as well as cost effective seismological observations. Great efforts have been directed towards developing probabilistic and deterministic seismic hazard assessment standards in Azerbaijan, plotting analytical models, graphs and maps, and understanding their implications for seismic hazard and risk assessments. The rising number of strong and damaging earthquakes worldwide, the growth of data and knowledge, and new engineering demands and community safety needs have motivated research into more effective approaches incorporating new multidisciplinary concepts and integrated tools. Exploring recent innovative developments in computational techniques and in the numerical modelling of classical and new seismology problems remains a challenge ahead for Azerbaijan.
We report on advances in capabilities toward detailed three-dimensional (3-D) density evaluation of shallow underground features. We are currently developing software to use a 3-D Geological Framework Model (GFM) to predict gravitational anomalies from underground structures such as faults, cavities or tunnels that might be detected with surface and/or subsurface gravity measurements. This software is being built within a Python operating environment and is leveraging parallelization capabilities of Python to accelerate the computations for large models (billions of elements) parameterized at high resolution. The models ingested by the algorithm are built upon a grid of right rectangular prisms with associated densities. The gravitational contribution of each prism is estimated using the method of Nagy (1966) for each measurement position in a survey. Because the gravitational field is a linear sum of the independent prisms for each station, the problem is well-suited to parallelization. The goal of our development is to build an inverse capability which would allow for iterative model adaptation based upon measured gravity in an area of interest. We present the status of this tool and its application to gravity survey locations at the Nevada National Security Site. LA-UR-22-31036.
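For reference, the Nagy (1966) building block mentioned above is a closed-form expression for the vertical gravity of a right rectangular prism, evaluated by summing an analytic kernel over the eight prism corners with alternating signs; because the field is linear in the prisms, contributions are simply summed over the model, which is what makes the problem parallelizable. The Python sketch below is a simplified, serial illustration of that kernel; it is not the software described above, the axis convention (z positive downward) is an assumption, and the overall sign should be checked against a known benchmark before use.

import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def prism_gz(x_lim, y_lim, z_lim, density, obs=(0.0, 0.0, 0.0)):
    """Vertical gravity (m/s^2) of a right rectangular prism at an observation
    point, from the corner summation of the Nagy (1966) closed-form kernel.
    Limits are (min, max) along each axis in metres; z is taken positive
    downward, and the overall sign should be verified against a benchmark."""
    ox, oy, oz = obs
    xs = (x_lim[0] - ox, x_lim[1] - ox)
    ys = (y_lim[0] - oy, y_lim[1] - oy)
    zs = (z_lim[0] - oz, z_lim[1] - oz)
    gz = 0.0
    for i, x in enumerate(xs, start=1):
        for j, y in enumerate(ys, start=1):
            for k, z in enumerate(zs, start=1):
                r = np.sqrt(x * x + y * y + z * z)
                kern = (x * np.log(y + r) + y * np.log(x + r)
                        - z * np.arctan2(x * y, z * r))
                gz += (-1.0) ** (i + j + k) * kern
    return G * density * gz

# Example (assumed geometry): a 10 m cube with density contrast -2000 kg/m^3
# (an air-filled cavity) buried 20-30 m below a station at the origin.
print(prism_gz((-5.0, 5.0), (-5.0, 5.0), (20.0, 30.0), -2000.0))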
A probabilistic seismic hazard assessment (PSHA) for Madagascar is carried out after determining three main parameters: the mean seismic activity rate (λ), the Gutenberg-Richter b-value and the maximum magnitude mmax. An earthquake catalogue was compiled by combining two sets of bulletins, from the MACOMO and National Data Centre databases. Duplicate events were removed and the catalogue was homogenized to the moment magnitude (Mw) scale before being declustered. A seismotectonic model for Madagascar developed in previous studies was used for the delineation of seismic source zones; a total of seven areal source zones were introduced in this study. Each zone is characterized in terms of its recurrence parameters and maximum magnitude using the homogenized catalogue. Seismic hazard calculations were performed on a grid spacing of 0.5° x 0.5° throughout the country, and a logic tree formalism was implemented to account for uncertainties in the input parameters. Hazard values for 10% and 2% probabilities of exceedance in 50 years are estimated, together with spectral accelerations for periods of 1.0 s and 3.0 s. Relatively high hazard values were observed within the central regions of Madagascar, in comparison with results from previous studies.
This study addresses a controversial question: is the local earthquake activity in Kuwait tectonic or anthropogenic in origin? Kuwait is an oil-producing country in which oil is extracted in many places, and earthquakes occur concurrently with oil extraction. Seismic activity inside Kuwait has been monitored continuously and with high accuracy since the establishment of the Kuwait National Seismic Network (KNSN) in 1997, through an ambitious plan to study micro-earthquake activity in Kuwait. The KNSN has recorded more than 1000 micro and minor local earthquakes. Two areas were identified in Kuwait in which the earthquakes are concentrated; the recorded earthquakes in these areas are characterized by small magnitudes and shallow focal depths. In the current work, several modern geophysical techniques, including waveform inversion, double-couple and compensated linear vector dipole signatures, stress drop values and stress patterns, were used to distinguish between types of seismicity of different origins. The results clearly show that the earthquakes occurring in Kuwait involve both tectonic and anthropogenic components: the local earthquakes occur in places with networks of active faults and are triggered by oil extraction in those places.
Crustal rocks containing cracks and other internal flaws are characterized by nonlinear elasticity, a phenomenon that can be observed in laboratory experiments under applied stress. The stress sensitivity of the elastic moduli is associated with the opening and closing of cracks under applied stress. In this study we emulated rock laboratory protocols by measuring elastic wave speed using empirical Green's functions (the probe) at different points of the Earth tidal strain cycle (the pump) in a natural pump-probe experiment, to infer the orientation of SHmax in the crust independently of borehole measurements and earthquake mechanisms. We validated the approach using a large data set in the Northern Alpine Foreland region, where the SHmax orientation is well known, and then applied it to the Central Eastern Alps to understand the contemporary stress pattern. We confirm that the method can be applied to large scale seismic arrays. For the validation area it resolves a NNW-SSE to N-S directed SHmax, which agrees with conventional methods and recent crustal stress models. Furthermore, our results show a rotation of the SHmax orientation from NW-SE in the southwest of the Central Eastern Alps to N-S in the east. The approach can also be applied to nuclear test sites, provided that continuous seismic stations exist in the region.
The recent strong densification of seismic networks and the increase in data availability from different types of sensors and networks (low-cost sensors, Raspberry Shakes, temporary deployments with hundreds of nodes, etc.) challenge our ability to automatically and accurately detect, locate and characterize seismic events by combining these large, composite data sets. These steps are nevertheless key to monitoring natural and anthropogenic seismic activity, an important goal of the CTBTO community. Here we present a protocol/workflow that takes into account the latest advances in deep learning methods for seismology, aiming to automatically build seismic bulletins and discriminate and label events, for national needs as well as for scientific research. We validate this protocol/workflow and prove its robustness on different data sets, from the local scale of a dense seismic network (200 nodes) to the regional scale of a temporary broadband network complemented by Raspberry Shakes (Pyrenees and Bolivian Altiplano). We show the gains achieved in both computation time and hypocentral location accuracy. The increase in usable phase picks due to the joint use of permanent, temporary and low-cost sensors, together with automatic deep learning based approaches, provides more exhaustive and accurate seismic bulletins, leading to a better characterization of the regional seismicity and allowing the construction of a reliable database of natural and anthropogenic seismic events.
The primary seismic station PS14-ROSC is located near the town of Rosal in the central part of the Eastern Cordillera, close to the Colombian National Seismological Network headquarters. The signal-to-noise ratio of the seismic data indicates low data quality. Using power spectral densities (PSD), we analysed changes in the seismic background noise levels at the station. We integrated microtremor measurements with geological observations to characterize the local site effects of the station and the surrounding area, in order to find an optimal place to relocate the bunker and improve the quality of the data. We conducted miniature microtremor array measurements to estimate the S wave velocity structure beneath the studied area. The horizontal-to-vertical (H/V) spectral ratio allows us to estimate the fundamental period of the ground. We calculated the phase velocity dispersion curves of Rayleigh waves using the SPAC, CCA and NC-CCA methods, and inferred the shear wave velocity (Vs) profiles using an inversion technique.
Seismic data collected by the Costa Rican OVSICORI-UNA seismic network are used to study the spatiotemporal seismicity and tectonics related to the Mw 6.7 earthquake of 21 July 2021, 21:15:07, located in the Panama Fracture Zone, 114 km south of the Burica Peninsula. In this zone the relative motion of the Cocos and Nazca plates accommodates stress mainly through right-lateral strike-slip motion along the Panama Fracture System, as supported by the relocated foreshock and aftershock distributions and by the calculated moment tensor nodal plane of the main event (strike=186°, dip=88°, rake=-152°). The rupture started south of the epicentre and propagated north toward the mainland, extending over 45 km in length and 30 km in width and reaching a maximum slip of 0.6 m.
The conventional picture of the seismicity in Finland, and in the rest of Fennoscandia, has been that earthquakes are mostly individual or doublet events without foreshocks or aftershocks. This notion of apparent singularity is based on automatic detection and classification tools and a manual analysis workflow designed for regional events visible at multiple stations over a relatively sparse seismic network. On the other hand, in the Wiborg rapakivi batholith, located in southeast Finland, earthquakes have been known to exhibit a more swarm-like behaviour, where a single larger event (ML 1.0 to 3.0) can be surrounded by tens, if not hundreds, of smaller events above ML 0.0. In this study, we apply a cross-correlation based event detection suite, developed for a dense array based seismic network monitoring an enhanced geothermal system in the Helsinki capital region, to various Fennoscandian earthquakes. Using a template created from a selected earthquake, the suite is able to identify smaller seismic events with a location and source mechanism similar to the template event. In areas with dense station coverage, we can detect events down to ~ML -1.0. This provides us with an improved image of the nature of microseismicity related to the Fennoscandian earthquakes.
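The cross-correlation detection suite described above relies on scanning continuous data with normalized cross-correlation against a template waveform and declaring detections above a threshold. The single-channel Python sketch below shows only the core operation; the real suite operates on many channels, stacks network correlations and uses different thresholding, so this is an illustration rather than the actual implementation.

import numpy as np

def template_scan(continuous, template, threshold=0.7):
    """Scan continuous data with a waveform template using normalized
    cross-correlation; returns sample indices where the correlation exceeds
    the threshold, plus the full correlation trace."""
    x = np.asarray(continuous, float)
    t = np.asarray(template, float)
    n = len(t)
    t = (t - t.mean()) / (t.std() * n)           # zero-mean, scaled template
    cc = np.correlate(x, t, mode="valid")        # un-normalized correlation
    # running standard deviation of the data in each template-length window
    c1 = np.cumsum(np.insert(x, 0, 0.0))
    c2 = np.cumsum(np.insert(x ** 2, 0, 0.0))
    mean = (c1[n:] - c1[:-n]) / n
    std = np.sqrt(np.maximum((c2[n:] - c2[:-n]) / n - mean ** 2, 1e-20))
    cc_norm = cc / std
    return np.flatnonzero(cc_norm > threshold), cc_norm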
Data collected from the IRIS Reviewed Bulletin for East Africa are used to generate empirical regression equations using orthogonal least squares regression (OLSR). The best-fit line yields the regression equation y = 0.517x + 2.1787, with R2 = 0.6208, for 972 events with a cut-off magnitude of 2.5. Although the scatter is relatively high, the results compare well with published regression equations; the slight variations may be due to the IDC body wave magnitude calculation method. The equation bridges the data gap where no other magnitude type is provided and allows the seismic catalogue to be extended by 22 events not reported by any other network. Further, it helps increase our understanding of the seismicity of the region and contributes to more representative hazard mapping.
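Orthogonal least squares regression, as used above, minimizes perpendicular distances to the fitted line rather than vertical residuals; one standard way to compute it is from the principal eigenvector of the data covariance. The Python sketch below is generic (it assumes comparable error scales on both magnitude types and is not the authors' code) and returns the slope and intercept of such a fit.

import numpy as np

def orthogonal_regression(x, y):
    """Orthogonal (total) least-squares line y = a*x + b: the slope follows
    the principal eigenvector of the covariance of the centred data."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    cov = np.cov(x, y)
    evals, evecs = np.linalg.eigh(cov)
    v = evecs[:, np.argmax(evals)]     # direction of largest variance
    a = v[1] / v[0]
    return a, y.mean() - a * x.mean()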
Seismic tomograms of the uppermost subcontinental lithosphere (USCL) beneath southern Africa are key to improving knowledge of the correlation between the surface geology and velocity anomalies in the region. The regional distribution of seismic wavespeed anomalies (SWAs) provides a means to delineate known structural features and to find new regions of SWAs within the model space. Delineated SWAs enable a better understanding of the relationship between SWAs and the density and thermal variations of the USCL. The present study investigates the anatomy of the USCL beneath southern Africa by tomographic inversion of absolute P wave arrival times from local, regional and mining-induced earthquakes recorded by 82 broadband stations of the 1997–1999 Southern Africa Seismic Experiment (SASE) and three seismic stations of the International Monitoring System (IMS) located in the study area. The P wave geotomograms were determined through the application of a hybrid iterative tomographic inversion method, in which travel times and ray paths are calculated rapidly and accurately using a 3-D ray tracer and the linearized iterative inversion utilizes the conjugate gradient-type LSQR algorithm. The reliability of the SWAs was assessed through checkerboard resolution tests. The geotomograms determined in the present study indicate that the P wave speed structure of the USCL is heterogeneous across southern Africa.
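The linearized iterative inversion mentioned above reduces, at each iteration, to solving a large sparse system d = Gm relating travel-time residuals to slowness perturbations, for which the LSQR algorithm is well suited. The toy Python sketch below uses a synthetic kernel and data and an assumed damping value; it is not the hybrid code used in the study.

import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import lsqr

# Linearized tomography reduces to d = G m, with G the sparse kernel of ray
# segment lengths per cell, m the slowness perturbations and d the residuals.
rng = np.random.default_rng(1)
G = sparse_random(500, 200, density=0.05, random_state=1)  # synthetic kernel
m_true = rng.normal(scale=0.01, size=200)                  # synthetic anomalies
d = G @ m_true + rng.normal(scale=1e-3, size=500)          # noisy residuals
m_est = lsqr(G, d, damp=0.1)[0]                            # damped LSQR solution
print(np.corrcoef(m_true, m_est)[0, 1])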
E-poster session with display of each e-poster on an assigned touchscreen
E-poster session with display of each e-poster on an assigned touchscreen
E-poster session with display of each e-poster on an assigned touchscreen
E-poster session with display of each e-poster on an assigned touchscreen
After the CTBT opened for signature (24 September 1996), the PTS began working on the establishment of a global verification regime to monitor compliance with the Treaty (17 March 1997). The verification regime includes both non-technical and technical elements (defined in Article IV of the Treaty); while the former refer to diplomatic actions and confidence building measures among States Parties, the latter involve the construction of a complex system built on science, strategy and technology, divided into three large groups (teams) gradually establishing and operating, in provisional mode, the IMS, the IDC and the OSI. This presentation is a historical collection of highlights of the development of the IDC to date.
Keywords: PTS (Provisional Technical Secretariat), IMS (International Monitoring System), IDC (International Data Centre), OSI (On-Site Inspection).
In 2001, the first data from an IMS infrasound station arrived at the IDC. The IDC infrasound processing system was then at an early stage, and a multi-year development effort was initiated, leading in 2010 to the completion of the first operational infrasound automatic processing and interactive analysis systems. In thirteen years, the IDC has produced over 45,000 infrasound events reviewed by expert analysts.
In an effort to continue advancing methods, improving its automatic system and providing software packages to authorized users, the IDC focused on redesigning the station processing and interactive review systems. This led to the development of DTK-(G)PMCC, promoted to the IDC operational environment on 1 July 2022. The software package is available to authorized users in NDC-in-a-Box (NiaB).
As a complement, an infrasound model was developed for the automatic waveform network processing software NET-VISA, with an emphasis on optimizing the network detection threshold by identifying ways to refine the signal characterization methodology and the association criteria.
These major endeavours enhance the IDC processing system, reduce the workload of analysts and continuously increase the quality of IDC bulletins. Progress will be illustrated with notable infrasound events recorded since the promotion of the new operational software.
Calibration activities at infrasound stations of the International Monitoring System (IMS) are a requirement of the Operational Manual for Infrasound Monitoring and the International Exchange of Infrasound Data. Until 2011, significant technical and scientific challenges prevented full compliance with these requirements, including the absence of available techniques to characterize responses under operational conditions. To address these challenges, the PTS coordinates the development of a detailed infrasound technology roadmap and works together with the scientific, metrology and sensor manufacturing communities. This collaboration led to the deployment of calibration capabilities at IMS infrasound stations, starting with a first station in 2015. This presentation illustrates the scheduled calibration process as currently defined at the PTS for infrasound stations, past progress and current challenges. As of 2022, calibration capabilities have been successfully rolled out at 34% of the certified IMS infrasound stations (18 stations). A dedicated software package (CalxPy) was also designed to support the calibration process in a centralized manner in Vienna for all stations with calibration capability. Calibration results have been reported to Member States since 2019.
In 1993, a shallow earthquake sequence occurred in Rock Valley, Nevada National Security Site. The largest event, M3.7, was followed by eleven M>2 events ranging in depth from 1-3 km. All events were well constrained due to the deployment of stations early in the sequence. Comparison of these shallow events to nearby historic nuclear tests identified gaps in our ability to discriminate these event types. To answer the question about the physics of shallow earthquakes, the Rock Valley Direct Comparison Experiment was conceived. We will conduct two chemical explosions at similar hypocenters to the earthquake sequence to understand the discrimination features between these types of events. We have systematically relocated the 1993 events, varying velocity models and codes, but using a common pick data set to choose the experiment borehole. Three additional boreholes are planned and will be instrumented to obtain microseismicity data, sample fault properties, and record the chemical explosions. We are installing a dense seismic network, including re-occupying stations that recorded the 1993 earthquakes. We have developed a 3-D geologic framework model for visualization, and for modeling and simulation efforts. We expect this unprecedented data set will address seismic waveform differences between earthquake and explosion sources.
The Source Physics Experiments (SPE) are a series of controlled chemical explosions at the Nevada National Security Site designed to gather observations, validate physics-based numerical models and understand the genesis of shear waves, in order to improve nuclear discrimination and monitoring capabilities. Executed between 2011 and 2016, Phase I of SPE encompassed six collocated chemical explosions in hard granite with different yields at different depths. Phase II included four chemical explosions executed in 2018 and 2019 in soft dry alluvium. Phase III includes two planned chemical explosions in a predominantly dolomite geology, collocated with a shallow 1993 earthquake. LLNL has developed a comprehensive numerical framework to simulate, from source to receivers, the waves generated in the non-linear explosion source region out to linear-elastic seismoacoustic distances. We present the analysis of the data collected in SPE Phases I and II, summarize how modelling predictions compared with observed data and draw lessons learned. We share insight into the main mechanisms generating shear motions in granite, alluvium and dolomite. Moreover, we have developed schemes for propagating the uncertainty of the geological characterization and geophysical parameters, and we present the impact of those uncertainties on the design of the Phase III tests, on predictions of the near-field responses of the planned tests and on enhancing source discrimination.
On 18 May 2020, a seismic event with mb 4.7 and MS 3.9 (International Data Centre) occurred in Kiruna, northern Sweden, at an active iron mine (Luossavaara-Kiirunavaara AB, LKAB). This event was widely observed at broadband seismic stations throughout the Nordic countries and as far away as Iceland, Greenland, the UK and continental Europe. Both the operating mining company and various seismological agencies have reported the event as induced or anthropogenic. However, screening procedures at the International Data Centre reported an MS/mb score of -2.03, and as a result the event could not be screened out as non-nuclear in origin. Using waveform data from all possible International Monitoring System stations and openly available seismic stations, we estimate moment tensors with uncertainties for this and other events using the methodology of Alvizuri et al. (2018), which has proven successful in characterizing and screening various other events, including the six nuclear tests in the Democratic People's Republic of Korea and a collapse event eight minutes after the 2017 nuclear test at the same location. The results for the nuclear tests reveal mechanisms with positive isotropic parameters, while collapse events as observed at Kiruna and in the Democratic People's Republic of Korea reveal negative isotropic parameters, which suggests that future screening criteria would benefit from using moment tensors and source types.
Long term infrasound event bulletins are useful for identifying repeating sources from a common location, quantifying source characteristics and studying the time-varying nature of the atmosphere. We produce a regional infrasound bulletin for the Korean Peninsula region for the period from 1999 to 2021. We use data from six infrasound arrays in South Korea, cooperatively operated by Southern Methodist University and the Korea Institute of Geoscience and Mineral Resources, and from two International Monitoring System infrasound stations in Russia and Japan. The detection procedure uses an adaptive F-detector whose arrival times and back azimuths are input into the Bayesian Infrasonic Source Location procedure. The bulletin consists of 34 218 events spanning 23 years and yields locations indicative of repeated events from many different source types, including shallow-depth earthquakes, explosions from limestone mines and quarry operations. Most events occur during working hours and days, suggesting a human cause. We quantified the false association rate using a perturbed detection time simulation and reviewed the events with many associated arrays. Ray tracing using the Ground-to-Space atmospheric model generally predicts the infrasound arrivals when strong stratospheric winds exist, while local weather data can help explain the propagation path effects at some arrays.
The São Jorge seismovolcanic crisis in 2022 provided an opportunity to deploy a portable infrasound array (SJ1) on the island, in a collaboration between the University of the Azores (UAc) and the University of Florence (UniFI). This four-element array became operational on 2 April 2022, first with a diamond geometry and, after 3 May 2022, with a centred triangular design. SJ1, together with the International Monitoring System infrasound station IS42, located on Graciosa Island at ~41 km distance, formed a temporary monitoring network to assist the monitoring activities related to the volcanic unrest on São Jorge Island. The two arrays use different equipment, mainly in sensor type: SJ1 is composed of four differential pressure transducers with the signal digitized at 100 sps, while IS42 consists of eight absolute microbarometers arranged as a three-element small-aperture triangle within a five-element pentagonal aperture, with the signal digitized at 20 sps. We present two examples of detections: (1) seismoacoustic signals associated with a low magnitude earthquake at São Jorge, and (2) a meteor offshore north of São Miguel Island, which allow us to validate the overall detection capability and demonstrate the value of this type of solution for monitoring local seismovolcanic activity.
Atmospheric concentrations of radionuclides measured at International Monitoring System (IMS) stations can vary by an order of magnitude or more between neighboring stations and from one collection to the next due to changes in weather, emission rates of background sources and other local factors. Large spatiotemporal changes make it difficult to categorize elevated collections as anomalies of interest or something benign. Simple univariate quantile scores are easy to calculate from time series of individual radionuclides at single IMS stations, but they do not account for correlated anomalies across different collections or different isotopes within a collection. In this presentation, we demonstrate the advantages of using machine learning algorithms routinely used to detect fraudulent activity in the financial industry for identifying radionuclide anomalies. Unsupervised machine learning anomaly detection algorithms, such as isolation forests and local outlier factors, can provide quantitative scores that consider a multitude of information. These algorithms run quickly, are easy to train, and, with relatively little effort, could be automated to process streaming IMS data. We also demonstrate supervised machine learning approaches, such as Bayesian ridge regression, to improve radionuclide anomaly detection through the incorporation of simulated signals from one or more atmospheric transport models.
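As an illustration of the kind of multivariate scoring described above, the following minimal sketch applies scikit-learn's isolation forest and local outlier factor to a synthetic matrix of activity concentrations; the isotope choice, data values and model settings are assumptions for demonstration only, not the authors' implementation.

```python
# Minimal sketch of multivariate anomaly scoring for radionuclide collections.
# Rows are samples (one per collection), columns are activity concentrations of
# several isotopes; all values here are synthetic.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(42)
background = rng.lognormal(mean=0.0, sigma=0.5, size=(500, 3))   # e.g. three isotopes (hypothetical)
anomalies = background[:5] * np.array([1.0, 1.0, 25.0])          # correlated spike in one isotope
X = np.vstack([background, anomalies])

# Isolation forest: lower score_samples() means more anomalous, so negate it
iso = IsolationForest(n_estimators=200, contamination="auto", random_state=0).fit(X)
iso_scores = -iso.score_samples(X)

# Local outlier factor scores the training set itself via negative_outlier_factor_
lof = LocalOutlierFactor(n_neighbors=20)
lof.fit(X)
lof_scores = -lof.negative_outlier_factor_

print("Top 5 most anomalous collections (isolation forest):", np.argsort(iso_scores)[-5:])
```

In a scheme of this kind, the highest-scoring collections would be forwarded for analyst review alongside the simple univariate quantile scores mentioned above.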
The Forensic Radionuclide Event Analysis and Reconstruction (FREAR) tool performs source reconstruction using a Bayesian inference framework. Using a statistical model of the radionuclide detection system and atmospheric transport models, FREAR infers the emission source characteristics that best fit the supplied observations. This powerful approach provides assessments of source location, release magnitude and release period, compares observed and posterior-predicted measurements, and ultimately reduces the area of consideration of potential sources.
To test the suitability of FREAR for verification applications, blind synthetic trials were constructed at two different environmental scales, using WRF-HYSPLIT. These two scenarios were then reconstructed by FREAR at three different organizations using different dispersion and meteorology model input (NCEP-HYSPLIT, ECMWF-FLEXPART, CMC-MLDP). The results of the two trials will be discussed, showing that FREAR represents a dramatic improvement in radionuclide event assessment, providing a valuable tool to National Data Centres. The future development of the FREAR tool will be shared.
The International Monitoring System operates a network of radionuclide technology monitoring stations. Historically, each measurement has been analysed separately to verify treaty compliance. Combining sample measurements, including detects and non-detects, yields greater monitoring capability, referred to as network power. A key stepping stone to obtaining network power is to objectively select a group of related sample measurements that are associated with a release event. Such collections of measurements can be assembled by an analyst, or perhaps they can be selected by algorithm. The authors explore, using a year of atmospheric transport calculations and realistic sensor sensitivities, the potential for a computed radionuclide association tool.
Inverse modelling allows the source parameters of an atmospheric release of radioactivity to be determined by combining atmospheric transport modelling and airborne radionuclide measurements in a statistically coherent manner. A series of inverse modelling experiments will be conducted using real-world Xe-133 observations from the International Monitoring System and the Forensic Radionuclide Event Analysis and Reconstruction tool to determine the (known) source parameters of the former medical isotope production facility CNL. The purpose of these experiments is to establish a baseline for source reconstruction from which new settings, new data and/or new methods can be easily tested. Testing can include, but is not limited to: (i) different source parameterizations, (ii) different optimization formalisms, (iii) applying background-corrected Xe-133 observations and (iv) different atmospheric transport model input. In this study, the inverse modelling baseline will be presented together with a series of tests. A set of metrics for evaluating inverse modelling will be explored, which will allow us to compare the performance of experiments with the baseline.
In the context of CTBT verification, radioxenon ratios can be used to discriminate between nuclear weapons domain signatures and nuclear facilities as plausible sources of observed atmospheric radioactivity. This discrimination is accomplished by setting a screening flag threshold beyond which a radioxenon sample can be considered possibly relevant from a nuclear explosion monitoring perspective. The goal of this study was to evaluate the currently implemented method for setting screening flag thresholds and to explore potential enhancements. First, a data set was analysed comprising approximately 35 000 samples of International Monitoring System (IMS) radioxenon data. Using activity concentrations and reported uncertainties, the ratios Xe-135/Xe-133, Xe-133m/Xe-133, Xe-133m/Xe-131m and Xe-133/Xe-131m were quantified for each IMS sample using two different methods for handling uncertainty and three different confidence intervals. The sets of obtained ratios were then used to determine new thresholds based on three different false positive rates. To test the performance of each screening flag, signatures of a hypothetical explosion were combined with observations from IMS samples and the resulting radioxenon ratios were compared to the newly established thresholds. The detection rate for each of the 18 screening flag threshold calculation methods was quantified to enable the selection of the most suitable method.
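The following short sketch illustrates, under stated assumptions, how a single isotopic ratio with propagated uncertainty might be compared against a screening flag threshold; the activity concentrations, coverage factor and threshold value are hypothetical and do not reflect the thresholds evaluated in the study.

```python
# Illustrative sketch: compute a radioxenon isotopic ratio with first-order
# uncertainty propagation and test it against a screening threshold.
# The threshold and coverage factor used here are arbitrary choices.
import math

def ratio_with_uncertainty(a, da, b, db):
    """Ratio a/b with 1-sigma uncertainty assuming uncorrelated relative errors."""
    r = a / b
    dr = r * math.sqrt((da / a) ** 2 + (db / b) ** 2)
    return r, dr

# Hypothetical activity concentrations (mBq/m3) and 1-sigma uncertainties
xe135, d135 = 0.8, 0.2
xe133, d133 = 2.5, 0.3

r, dr = ratio_with_uncertainty(xe135, d135, xe133, d133)
k = 1.645          # one-sided 95% coverage factor (illustrative choice)
threshold = 0.5    # hypothetical screening flag threshold

flagged = (r - k * dr) > threshold   # flag only if the ratio exceeds the threshold at the chosen confidence
print(f"Xe-135/Xe-133 = {r:.2f} +/- {dr:.2f}, flagged: {flagged}")
```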
Sensor data from radionuclide systems within the International Monitoring System (IMS) provide critical information about the operating status of a given station. Many of the algorithms currently in place monitor large, immediate sensor deviations and provide alerts. Small sensor changes, especially over extended periods of time, are more challenging to detect. Another challenge is quickly and automatically identifying the root cause of failures so that strategic maintenance can be performed. Pacific Northwest National Laboratory (PNNL), in collaboration with General Dynamics, has been developing an architecture for monitoring IMS radionuclide systems that is capable of testing new algorithms. One of the new techniques being investigated is classifying failures based on the sensor data available. The goal is to use knowledge from experts together with new data science techniques (such as artificial intelligence and machine learning) to determine the failure mode using groups of sensors as a signature. This presentation will discuss the current state of the health monitoring architecture and the latest developments in the failure mode sensor signature algorithm.
This presentation covers the modernization of technology and work processes at NORSAR during the last decade, and how technology, state of health (SOH) monitoring and efficient work processes can contribute to quality, data availability and increased performance. In the last few years NORSAR has established a modernized SOH system based on Nagios, PostgreSQL and Grafana to monitor and log the uptime and SOH of our installations and processes. An increased focus on risk mitigation and remote control capabilities has, together with the improved SOH tools, resulted in more efficient operations and higher uptime. Through risk analysis NORSAR has identified some of the challenges of the future. The presentation will give an overview of NORSAR's experiences in lifecycle management and SOH monitoring systems for efficient operation and sustainment of large International Monitoring System array stations. It will also look at the financial aspects and challenges of long term sustainment of large scale infrastructure and potential high impact risk factors in the operation and sustainment of such stations.
The auxiliary seismic network of the International Monitoring System (IMS) comprises 120 seismic stations. It is designed to support the primary seismic network by increasing detection accuracy for earthquake monitoring and nuclear test detection. Some auxiliary stations have been designated as temporary substitute primary stations. Auxiliary seismic stations are deployed in remote areas around the globe; many are hard to access, often in harsh climatic conditions, and occasionally face security issues, and some are approaching or have exceeded 15 years of service since certification. Thus equipment replacement must be considered, and failure, obsolescence and replacement must be addressed with greater frequency. The Provisional Technical Secretariat (PTS) provides remote technical support for troubleshooting, training of station operators, and implementation of authentication and data transport, in accordance with the terms of the Treaty (Article IV). However, support is limited if the resulting action requires financial expenditure from the PTS. The consequence of this limitation is amplified if the host country is not capable of meeting the maintenance costs, resulting in station outages that are sometimes prolonged. We discuss the performance of the auxiliary station network and recommend short and long term solutions to improve maintenance practices and achieve better results.
The CTBT-IMS global sensor network comprises three waveform technologies: seismic, hydroacoustic and infrasound, designed to monitor the world continuously for nuclear explosions underground, in the ocean and in the atmosphere. Waveform signals at CTBT-IMS stations can present complex arrival characteristics caused by 3D features along their long-range propagation paths. This panel will debate the challenges in 3D modelling of long-range sound propagation in the ocean and the atmosphere. The panelists will discuss recent advances in 3D modelling of seismic-to-acoustic and acoustic-to-seismic energy conversion at the ocean, land and atmosphere interfaces.
We consider spatio-temporal variations of the S wave attenuation field structure in the region of the Semipalatinsk Test Site (STS). We studied variations of the amplitude ratio of Lg and Pg waves (parameter Lg/Pg) using seismograms of underground nuclear explosions (UNEs), as well as of calibration and quarry explosions, obtained by stations TLG and MKAR. We revealed essential temporal variations of the parameter Lg/Pg for the UNEs at the three main sites of the STS. At the end of the 1980s, mean Lg/Pg values were considerably lower for the Balapan site in comparison with the two other sites. We also studied characteristics of S coda envelopes using recordings of chemical explosions obtained by close stations on the STS territory. The data obtained show that minimum quality factor values for temporary stations in the STS region (~40-55 at a frequency of 1.25 Hz) are considerably lower than in the seismically active North Tien Shan region (~60-80). It is supposed that the spatio-temporal variations of the attenuation field structure in the STS region are connected with deep-seated fluid migration in the Earth's crust and uppermost mantle, driven by the long term influence of intensive UNEs on the geological medium.
Historical geophysical data recorded during the peak of nuclear testing are rare and limited. Efforts have been made to preserve and digitize these data, but quality control is minimal owing to decades of lost history and retiring personnel. Furthermore, recent research into the subsequent collapse of explosion cavities is hampered by the lack of continuous records with which to investigate historical collapses.
The Livermore National Network (LNN) was a 4-station seismic network in California, Nevada and Utah that recorded nuclear and tectonic events starting in the 1960s. Here, we present previously unreleased data from LNN containing over 100 recorded nuclear tests as well as over 50 collapses associated with a nuclear test. We will discuss the challenges and mitigation efforts we undertook to preserve and correct any errors in the digitization, waveform rotation and metadata.
In recent decades, tripartite micro-arrays (i.e. three-element arrays) have become a major tool for the passive seismic phase of an on-site inspection, mainly due to their superior back azimuth estimation. However, the back azimuth is estimated under the far field approximation, while tripartite arrays are used to monitor micro-seismicity and aftershocks in the vicinity of the array, that is, in a region where the far field assumption might not hold. In this work, we determine the effect of breaking the far field assumption by analysing the plane wave errors, i.e. the errors in the back azimuth and slowness computations caused by the plane wave assumption. Computational formulas for estimating the absolute errors due to the plane wave assumption were developed. A case study utilizing the subarrays of the IMS station MMAI demonstrates that plane wave errors are not only a theoretical issue; taking them into account can improve the results of field measurements.
The Comprehensive Nuclear-Test-Ban Treaty language on an Inspected State Party’s right to portions of samples taken during an on-site inspection (OSI) is very similar to sampling language in the Chemical Weapons Convention (CWC), reflecting an overlap amongst negotiators involved. There are small differences in the treaties’ wordings, however, and in the case of the CTBT these differences have made it difficult for State Signatories to reach consensus on how to implement portioning of samples during an OSI. Meanwhile the CWC has been in force for 25 years, during which time the OPCW has developed procedures for inspectors to portion samples, though not all aspects of inspections or portioning have been fully exercised. Likewise, IAEA inspectors take samples during inspections and, while the concept of portioning may not be as explicit, procedures allow for the ISP to retain their own relevant samples. We examine how the ISP right to sample portions is practiced at the OPCW and IAEA and draw insights from their experiences to compare with the approaches considered at Preparatory Commission meetings.
The United States supports the Comprehensive Nuclear-Test-Ban Treaty (CTBT) and is committed to work to achieve its entry into force, recognizing the significant challenges that lie ahead in reaching this goal. Consistent with the goals of the CTBT, the United States continues to observe a moratorium on nuclear explosive testing and calls on all states possessing nuclear weapons to declare or maintain such a moratorium. The United States has no plans to conduct a nuclear explosive test.
This invited talk will provide examples of transparency related to activities at U.S. nuclear security enterprise sites such as the Nevada National Security Site (NNSS) and the national laboratories. The United States shares significant information about its plans and operations and is open with the international community about Stockpile Stewardship activities such as subcritical experiments and the recent fusion breakthrough as well as nonproliferation-related field experiments at NNSS.
Examples of U.S. support for the CTBT will also be discussed, including the recent acceptance of the next generation Xenon International noble gas analysis system for use in the International Monitoring System (IMS), IMS component testing, U.S. funding for the maintenance, operation and improvement of its IMS stations, and extensive support for the International Data Centre Re-engineering project.
Radio timing services failed for the first nuclear test (TRINITY, 16 July 1945), but were available to determine the origin time of an earlier 108 ton TNT explosion conducted nearby on 7 May 1945. We have scanned and digitized the vertical-component analog seismograms recorded at Tucson Observatory (TUO, at a distance of 437 km) for both events. These regional signals include Pn and Pg, and presumably Sn and Lg. We applied cross-correlation methods of analysis to the regional seismic window, finding that the faint signals of 7 May provide a satisfactory cross-correlation peak when compared with the 16 July signals. Our best estimate of TRINITY's origin time is 11:29:24.5 (GMT), good to a few tenths of a second. This result is significantly different from official reports. We give this specific result in the context of making three general points: (1) analog records are necessary to document the wide range of features of nuclear explosion seismograms; (2) modern cross-correlation methods of analysing seismograms can be effective and simple to use in application to analog recordings of complicated weak regional seismic signals; and (3) there is merit in developing a complete list of basic parameters of historic nuclear test explosions.
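To illustrate the cross-correlation idea on digitized records, the sketch below estimates the relative time shift between two synthetic traces; the sampling rate, waveform shapes and noise level are invented for demonstration and do not represent the TUO recordings.

```python
# Minimal sketch of estimating a relative time shift between two digitized
# regional records by cross-correlation; both traces here are synthetic.
import numpy as np
from scipy.signal import correlate, correlation_lags

fs = 20.0                                  # samples per second (assumed)
t = np.arange(0, 120, 1 / fs)
rng = np.random.default_rng(1)

template = np.exp(-((t - 40) / 3) ** 2) * np.sin(2 * np.pi * 1.0 * t)       # strong-event proxy
faint = np.roll(template, int(7.5 * fs)) * 0.2 + 0.05 * rng.normal(size=t.size)  # weak, shifted proxy

cc = correlate(faint, template, mode="full")
lags = correlation_lags(faint.size, template.size, mode="full")
best_lag = lags[np.argmax(cc)] / fs
print(f"Estimated relative shift: {best_lag:.2f} s")  # an origin-time offset follows from this shift
```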
Photogrammetry, a technique to create high resolution orthoimagery and digital elevation models from a collection of photos, has evolved rapidly since the 1990s. Traditional photogrammetric techniques assist in a wide variety of fields, including facilitating high resolution change detection for underground explosion monitoring and verification. Typically, this method involves the use of standard electro-optical imagery using visible band data. By repurposing and enhancing existing photogrammetric techniques and applying them to thermal infrared data collected before and after a conventional explosive experiment, we developed a novel method for processing thermal images for signature detection. This thermal photogrammetry method provides a new way to detect post-explosion anomalies and artifacts, quantify the extent of site changes, and characterize fragmentation materials and their distribution from explosions. Additionally, this technique can be applied to data collected from ground based or airborne platforms, providing a novel tool to characterize larger sites. We present these new methods for thermal data processing, discuss the results of a field campaign, highlight the successes of the technique, and define next steps for realizing data collection efficiencies and advancing data processing.
In the current age of technological innovation, detecting illicit underground testing of nuclear weapons remains a challenging task. Remote sensing technology such as unmanned aerial vehicles (UAVs) is providing solutions to some unresolved problems in our society and environment. The UAV is a new and emerging technology with a direct impact on the economy and society. It has a wide range of applications in many sectors, such as agriculture, insurance, energy and utilities, infrastructure, mining, and media and entertainment. The technology is capable of capturing fine-scale information in minute detail and with high accuracy. UAVs can collect remote sensing data unhindered in adverse weather conditions, even under cloud cover. UAVs can thus be used as a surveillance technology in places where there is a threat of nuclear armament. The deployment of the technology also requires retaining values such as accuracy, accountability and efficiency. In the innovation literature, the responsible innovation approach enables us to look into the question of accountability. Responsible innovation thus becomes significant as a theoretical framework. The study addresses the following research questions: How far can a technology like the UAV fulfil the objectives of the CTBT? How can the framework of responsible innovation help in studying the issue of illicit nuclear testing?
Since 1875, the Metre Convention has established the principles for all nations to act in common accord in matters relating to units of measurement. In 1960 the system became known as the International System of Units (abbreviated SI from the French "Système International d'Unités"). Today, the SI is the widely accepted basis for the international mutual recognition of measurement results and provides confidence in measurement data traceable to this system.
This seemingly simple system of mutual recognition of measurement data across international borders and trading economies is based on the "CIPM-MRA", an international recognition arrangement under the CIPM, which came into force in 1999. Its implementation involves national metrology institutes and regional metrology organisations in a permanent infrastructure that ensures the international equivalence of measurements.
In the establishment and operation of such a complex network as the IMS, the CTBTO can realise significant benefits by linking to the SI and the associated metrology processes and rules that underpin this system. Recognising the potential benefits of linking to the SI, the CTBTO contacted the CIPM in 2017 and initiated a formal liaison in 2021.
This talk will present a history of the unification of the metric system under the Metre Convention, the development of a global network to enable CIPM-MRA, and some common technical highlights related to CIPM and CTBTO.
Any system that intends to provide reliable and trustworthy information demands quality processes that are commensurate with the complexity of the system and criticality of the data. The IMS is highly demanding in both respects. Therefore, the quality system infrastructure being developed and implemented for the IMS seismo-acoustic operations aims at realising these goals through unprejudiced operational procedures and objective data analysis.
This panel will consider the implications for the evolving quality system, building on the latest international developments in low-frequency sound and vibration metrology, such as the European Infra-AUV project.
Potential impacts and benefits include improvements in the credibility of data through traceability to the International System of Units (SI) and knowledge of the associated uncertainty, as well as enhanced operational transparency and impartiality enabled by conformity assessment. The panel will consider the practicalities of implementing these innovations in IMS station operations, alongside evolving best practices across seismo-acoustic technologies, within the wider quality system developments.
Collectively, the panel members provide leading expertise in metrology and the provision of specialist measurement services, and in the vital operational considerations for the effective adoption of new developments. The panel will also address questions and issues that the broad spectrum of stakeholders may wish to raise.
The Redmond Salt Mine Monitoring Experiment in Utah was designed to record seismoacoustic data at distances of less than 50 km for algorithm testing and development. During the experiment, from October 2017 to July 2019, six broadband seismic stations were operating at a time: three had fixed locations for the duration, while the other three were moved to different locations every one and a half to two and a half months. Redmond Salt Mine operations consist of nighttime underground blasting several times per week. The mine is located within a belt of active seismicity, allowing for easy comparison of natural and anthropogenic sources. Using the recorded dataset, we built 1373 events with local magnitudes (Ml) ranging from below -2.4 to 3.3. For 75 blasts from the Redmond Salt Mine (RMEs) and 206 tectonic earthquakes (EQs), both Ml and coda duration magnitude (Mc) are well constrained. To separate the population of RMEs from the group of EQs, we experimented with several discriminants, including the difference Ml-Mc, Rg/Sg spectral amplitude ratios, low frequency to high frequency Sg ratios, and Pg/Sg amplitude ratios, as well as combinations of two or more of these discriminants. The effectiveness of these discriminants at classifying the events is discussed.
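As a hedged illustration of combining two such discriminants, the sketch below trains a simple linear discriminant (a generic choice, not necessarily the combination method used in the study) on synthetic Ml-Mc and log amplitude ratio features; the feature distributions are invented and are not the Redmond data.

```python
# Sketch: combining two event discriminants (e.g. Ml - Mc and a log Pg/Sg
# amplitude ratio) with a simple linear classifier; feature values are synthetic.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Synthetic feature clouds: mine blasts (label 1) vs earthquakes (label 0)
eq = np.column_stack([rng.normal(0.0, 0.3, 206), rng.normal(-0.2, 0.2, 206)])
rme = np.column_stack([rng.normal(0.6, 0.3, 75), rng.normal(0.3, 0.2, 75)])
X = np.vstack([eq, rme])
y = np.concatenate([np.zeros(len(eq)), np.ones(len(rme))])

clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, X, y, cv=5)   # cross-validated classification accuracy
print(f"Mean CV accuracy: {scores.mean():.2f}")
```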
An attempt is made to assess the quality of the Reviewed Event Bulletin (REB) by comparing it to the Bulletin of the International Seismological Centre (ISC Bulletin). For the comparison, ISC Bulletin events downloaded from the ISC web page for the month of October 2020 were used, together with the corresponding IDC REB events as listed in the ISC Bulletin. During this period, a total of 3431 and 2691 events were considered for the ISC and the IDC, respectively. The comparison was performed using the information given for common events in the header lines of the ISC Bulletin, from which matched events could be extracted. The results showed that a total of 2315 events could be matched between the two bulletins. The percentage of matched events with a location difference (D) < 1° is about 95.3%, while the percentage of events with D ≥ 5° is about 0.3%. There were two events with magnitude (mb) ≥ 4.0 and D ≥ 5°. The percentage of matched events with intersecting error ellipses is 57.0%. In general, the results obtained in this study show a slight difference compared to the results of a similar study made earlier.
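A minimal sketch of the matching step, assuming a simple origin time and great-circle distance criterion, is shown below; the thresholds and the example events are illustrative, not the criteria actually used for the ISC/REB comparison.

```python
# Sketch of matching events between two bulletins by origin time and
# great-circle separation; thresholds and sample events are illustrative only.
import math

def gc_distance_deg(lat1, lon1, lat2, lon2):
    """Great-circle separation in degrees (spherical Earth)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    cosd = math.sin(p1) * math.sin(p2) + math.cos(p1) * math.cos(p2) * math.cos(dlon)
    return math.degrees(math.acos(max(-1.0, min(1.0, cosd))))

def match(events_a, events_b, max_dt=30.0, max_deg=1.0):
    """Pair events whose origin times differ by < max_dt s and epicentres by < max_deg."""
    pairs = []
    for ta, lata, lona in events_a:
        for tb, latb, lonb in events_b:
            if abs(ta - tb) < max_dt and gc_distance_deg(lata, lona, latb, lonb) < max_deg:
                pairs.append(((ta, lata, lona), (tb, latb, lonb)))
                break
    return pairs

# (origin time in s, latitude, longitude) -- purely illustrative entries
isc = [(100.0, 38.2, 142.1), (900.0, -20.5, -70.1)]
reb = [(101.5, 38.3, 142.0), (5000.0, 10.0, 10.0)]
print(f"{len(match(isc, reb))} of {len(isc)} ISC events matched")
```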
Seismometer arrays form the core of the International Monitoring System (IMS) waveform network, and enhancing signal detector performance should lead to improvements in the performance of all subsequent parts of the International Data Centre (IDC) waveform processing pipeline. Recently, a test data set was released by the IDC composed of signal detections made by an implementation of the generalized F detector (Selby 2008, 2011, 2013) at multiple IMS seismometer arrays. In this presentation we will cover the following related topics: i) the performance of the generalized F detector on this test data set in comparison with the existing signal detection algorithm used at the IDC; ii) whether and how to apply f-k analysis to detections made by the generalized F detector; and iii) approaches to the physical characterization of signals observed at IMS arrays to enhance the performance of the generalized F or other detectors.
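As background for topic ii), the following sketch shows a basic time-domain delay-and-sum slowness grid search, a simple analogue of f-k analysis, applied to a synthetic plane wave across an invented five-element array; it is not the IDC implementation, and the geometry, slowness grid and noise level are assumptions.

```python
# Minimal delay-and-sum sketch of slowness/back-azimuth estimation for an
# array detection; geometry and data are synthetic.
import numpy as np

rng = np.random.default_rng(3)
fs = 40.0
t = np.arange(0, 10, 1 / fs)
coords = np.array([[0, 0], [1.5, 0.2], [0.3, 1.8], [-1.2, 0.9], [-0.8, -1.4]])  # km (E, N)

# Synthetic plane wave: 8 s/deg slowness arriving from back azimuth 60 deg
s_true = 8.0 / 111.19                                    # s/km
baz = np.radians(60.0)
s_vec = -s_true * np.array([np.sin(baz), np.cos(baz)])   # slowness vector along propagation direction

def wave(tt):
    return np.exp(-((tt - 5.0) / 0.3) ** 2) * np.sin(2 * np.pi * 2.0 * tt)

data = np.array([wave(t - coords[i] @ s_vec) for i in range(len(coords))])
data += 0.1 * rng.normal(size=data.shape)

# Grid search over the slowness vector, maximizing stacked beam power
best = (0.0, 0.0, -np.inf)
for sx in np.linspace(-0.12, 0.12, 61):
    for sy in np.linspace(-0.12, 0.12, 61):
        shifts = (coords @ np.array([sx, sy]) * fs).astype(int)
        beam = np.mean([np.roll(data[i], -shifts[i]) for i in range(len(coords))], axis=0)
        power = np.sum(beam ** 2)
        if power > best[2]:
            best = (sx, sy, power)

est_baz = np.degrees(np.arctan2(-best[0], -best[1])) % 360
print(f"Estimated back azimuth: {est_baz:.1f} deg")
```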
The International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization continues to develop the advanced automatic and interactive software NET-VISA, which uses state of the art machine learning and artificial intelligence techniques to provide a next generation automatic seismic event detector based on Bayesian inference. The automatic seismic event bulletins it creates, called Standard Event Lists (SEL), are the first IDC products that would indicate the presence of a suspicious explosive incident; thus, the performance of the automatic event detector is key to building the capacity of the verification regime. In the present study, we discuss the results of testing the latest version of NET-VISA, which includes several newly developed features, such as a full pipeline configuration that represents the operational environment and the incorporation of event screening criteria. The performance of NET-VISA as assessed through review by a human analyst is discussed.
Seismic waves are elastic waves emitted by, for example, an earthquake or an underground explosion. Accurate estimation of the back azimuth, i.e. the direction of arrival (DOA), of a seismic signal at the detecting station is required for the accurate localization of seismic events. This is especially important in the case of seismic arrays, which comprise a substantial portion of International Monitoring System primary stations. In this research, we develop a new low-complexity DOA estimation method, the iterative periodic Fisher's scoring. In our simulations, we use synthetic data generated assuming the configuration and noise characteristics of the MMAI seismic array. We compare the performance of the proposed method, the maximum likelihood estimator and the conventional Fisher scoring, as well as the cyclic Cramér-Rao lower bound. Simulation results show that the proposed periodic Fisher's scoring estimator has a lower mean cyclic error and lower computational complexity than the classical method, and that it is more stable around the edges of the range [-pi, pi]. Moreover, under a misspecified model, where the noise is assumed to be white (i.e. uncorrelated between the sensors) while it is in fact colored (i.e. correlated), our method significantly outperforms the existing methods.
Accurate timing is both an essential requirement and a perennial problem for seismology. With the adoption of satellite based global positioning systems for the timestamping of data in the 1990s, this problem appeared to have been solved. I have applied multichannel cross-correlation (VanDecar & Crosson, 1990) to recordings of teleseisms across seismic arrays, to estimate the relative timing error between array elements. At the Yellowknife seismological array, timing errors are observed at some array elements, up to a maximum of 0.4 s, while others exhibit no apparent timing errors. The relative timing errors between 2014 and 2020 are quantized, with a base value of approximately 0.12 s. The time period can be divided into two eras; within each era the timing error only increases. The cause of the timing errors is unknown, but the quantization is inconsistent with what is expected for a free running clock, and no timing errors have been observed since a digitizer firmware upgrade on 8 April 2020. A model is developed for the absolute timing errors, one which can be applied to historical data. It is recommended that timing error checks be done at other International Monitoring System arrays.
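A minimal sketch of the relative-delay estimation idea (after VanDecar & Crosson, 1990) is given below: pairwise lags measured by cross-correlation are inverted in a least-squares sense for per-element delays under a zero-mean constraint. The traces, sampling rate and clock errors are synthetic and do not represent Yellowknife data.

```python
# Sketch of multichannel relative-delay estimation: measure pairwise lags by
# cross-correlation, then solve a least-squares system for per-element delays
# with a zero-mean constraint. All data are synthetic.
import numpy as np
from scipy.signal import correlate, correlation_lags

rng = np.random.default_rng(7)
fs, n_sta = 40.0, 4
true_delays = np.array([0.0, 0.12, 0.0, 0.24])          # quantized clock errors, s (hypothetical)
t = np.arange(0, 30, 1 / fs)
src = np.exp(-((t - 15) / 0.5) ** 2) * np.sin(2 * np.pi * 1.5 * t)
traces = [np.interp(t - d, t, src) + 0.05 * rng.normal(size=t.size) for d in true_delays]

# Pairwise lag measurements t_ij ~ d_i - d_j
rows, obs = [], []
for i in range(n_sta):
    for j in range(i + 1, n_sta):
        cc = correlate(traces[i], traces[j], mode="full")
        lag = correlation_lags(t.size, t.size, mode="full")[np.argmax(cc)] / fs
        row = np.zeros(n_sta)
        row[i], row[j] = 1.0, -1.0
        rows.append(row)
        obs.append(lag)

# Zero-mean constraint removes the absolute-time ambiguity
rows.append(np.ones(n_sta))
obs.append(0.0)
d_est, *_ = np.linalg.lstsq(np.array(rows), np.array(obs), rcond=None)
print("Estimated relative delays (s):", np.round(d_est, 3))
```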
Underground nuclear explosions induce a strong flow of air through the surrounding fractured porous media, which carries radionuclides or chemical species. While radioxenon and 37Ar represent tracers scrutinized by the International Monitoring System and On-Site Inspection, respectively, a larger variety of tracers is emitted at the ground surface: radioxenon and heat generated by the explosion, radon and carbon dioxide naturally occurring underground and in the soil cover, respectively. The objective is to determine the conditions under which these additional tracers would help to better discriminate the origin of air masses and reduce uncertainty regarding the origin of radioxenon in cases of high background signals.
This objective can be achieved with a recently developed code that solves flow, tracer transport and thermal effects through a fractured porous medium on the Darcy scale (Pazdniakou et al., Pure Appl. Geophys., https://doi.org/10.1007/s00024-022-03038-4, 2022).
The influence of fractures and of their volumetric density is shown to be crucial for the evolution and distribution of the tracers at the surface. Natural atmospheric fluctuations play an important role in the instantaneous tracer releases. A soil cover smooths out the ground distribution. Finally, all of these exhalations appear to be sufficiently large to be measurable.
Molten Salt Reactors (MSRs) are a Generation IV nuclear reactor design currently under development and testing in various countries around the world. The molten fuel provides an opportunity for continuous processing of gaseous fission products, which may impact the International Monitoring System (IMS). Simulations were performed for four MSR designs to predict the production of IMS-relevant radionuclides during batch and continuous reprocessing schemes. Radioxenon and radioiodine signatures were drawn from these simulations and compared to current reactor designs (BWR, PWR, RBMK). For the case of continuous reprocessing of the fuel salt, the radioxenon and radioiodine signatures were found to be indistinguishable from those of a nuclear explosion.
We will review data from Phase II testing of Xenon International at RN33. Xenon International is a new generation radioxenon monitoring system developed by PNNL with a short sampling time of 6 hours. Phase II testing of Xenon International was conducted from July 2021 to April 2022 at International Monitoring System (IMS) radionuclide monitoring station RN33 on Mount Schauinsland, Germany. Activity concentrations of spiked and selected environmental samples were verified by reanalysis in either one of the IMS laboratories or the BfS noble gas laboratory in Freiburg. The activity concentrations measured by Xenon International are consistent with data from the current operational IMS system SPALAX at RN33, with sensitivities of Xenon International up to one order of magnitude higher for Xe-131m, Xe-133m and Xe-135. We will investigate multiple isotope detections and unusual single detections and explore the benefits of a 6 hour time resolution taking into account new ATM backwards calculation with hourly resolution.
The National Data Center Preparedness Exercise 2019 (NPE-2019) provides an opportunity to evaluate the ability of a National Data Center (NDC) to use available International Monitoring System (IMS) data, techniques and tools to verify compliance with the Comprehensive Nuclear-Test-Ban Treaty. In NPE-2019, there are unusual detections of radioactive particulates (Cs-134, Cs-137, La-140 and Ba-140) and noble gases (Xe-133, Xe-133m and Xe-135) at some IMS stations. This NPE can be solved by fusing the seismic and infrasound solutions with forward atmospheric transport modelling. However, the current work illustrates the ability to use adjoint atmospheric transport model outputs, namely source-receptor sensitivity (SRS) fields from radionuclide IMS stations, together with the corresponding concentration values of those multiple detections, to confine the source region and to estimate the source term.
This presentation will highlight background measurements of Ar-37 samples conducted in Knoxville, Tennessee, USA to better understand the sources of atmospheric concentrations of Ar-37 using the Argon-37 Field System. The Argon-37 Field System, designed and built by PNNL, has now processed and measured several hundred Ar-37 samples from both soil gas and the atmosphere. The system was designed to process whole air samples from soil gas, the atmosphere and the output of radioxenon systems in order to detect Ar-37 in an above ground portable system. During this campaign, samples were collected on location, near Xenon International, and sent back to PNNL for processing and measurement, as opposed to sending the system to a location and operating it remotely. Correlations with radioxenon measurements from the same air mass were performed and will be discussed.
The need for a common understanding and adoption of good practices, and for measurable ways to ensure safety and security in the development and deployment of different technologies, is one of the raisons d'être of standardization and certification. This presentation aims to create awareness about standardization and some areas of relevance for the CTBT, while encouraging dialogue with regard to standardization and certification gaps and needs.
The purpose of the panel on “Introduction of the synergies between professional societies and the CTBTO” is to introduce the audience to the professional societies that are represented on the podium and to inform about their CTBT-related activities.
There is no discussion expected but Q&A will be permitted.
CTBTO is hiring professionals in STEM fields – Learn about CTBTO’s Recruitment Process
Come and find out more about how to join the CTBTO! Visit us at the HR booth and speak with the CTBTO Human Resources team and learn more about CTBTO careers. You can also join us in our HR presentation where we will walk you through our recruitment process. CTBTO is hiring top talent across a wide variety of scientific and technical fields in seismic, radionuclide, hydroacoustic and infrasound technologies. There will also be a presentation focusing on students and young professionals interested in pursuing a career in these fields. We look forward to meeting you soon!
7-8 video rooms in parallel for topics: P1.2 (1 or 2 video rooms), P1.3, P1.4, P3.1, P3.4, P4.5, P5.2
We apply a current state-of-the-art machine learning based denoising algorithm to the seismological and hydroacoustic waveform records of selected Democratic People's Republic of Korea nuclear tests. We use the DeepDenoiser algorithm to reduce the noise present in the waveform records of the larger Democratic People's Republic of Korea nuclear tests. The denoising of waveform records using machine learning has obvious advantages for phase picking and signal detection, but the question is whether the currently available techniques can be used beyond that. We investigate the impact the denoising has on source mechanism inferences by comparing the seismic moment tensor inversion results of original and denoised data. Because of the good signal-to-noise ratio, and as the source type is well known, we can in this case establish whether the denoised waveforms can be used for further source analysis. We find that care needs to be taken when using the modified waveform data, but we also find promising results hinting at possible further use of the technique for standard analyses in the future. We further investigate whether the application of the chosen denoising algorithm allows for better resolution of the seismic moment tensor of the smaller Democratic People's Republic of Korea nuclear tests.
When monitoring for potential underground nuclear tests, distinguishing shallow earthquakes from explosive sources can often be achieved using the ratio of the body-wave magnitude to the surface-wave magnitude (mb:Ms), as explosive sources often produce less energetic surface wave excitations than earthquakes with the same mb. Current methods for surface wave detection at the International Data Centre (IDC) rely on a dispersion test. A global group-speed model is used to predict a time window based on origins in the IDC Reviewed Event Bulletin (REB). The waveforms in the predicted time window are filtered into eight frequency bands; if the time of maximum energy in at least six of these bands sits within a specified error of the expected dispersion curves, a surface wave is detected. The current version of this algorithm (Maxpmf) was implemented into provisional operations at the IDC in 2010. We have designed interactive software to review the IDC automatic surface wave detection algorithm. We investigate mis-associated surface wave arrivals in the REB caused by the processing limit of surface wave detection at 100 degrees distance. Examples of surface waves originating from greater than 100 degrees distance and being associated with a closer event are presented.
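The sketch below illustrates the dispersion-test logic on a synthetic trace: bandpass the waveform into several bands, pick the time of maximum envelope energy per band, and count the bands consistent with predicted group arrivals. The band edges, tolerance and predicted arrival times are assumptions for demonstration, not the IDC (Maxpmf) settings.

```python
# Sketch of a dispersion test: filter into narrow bands, find the time of
# maximum envelope energy per band, and count bands falling inside a tolerance
# of the predicted group-arrival times. All values are synthetic.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 4.0                                       # long-period sampling rate, Hz (assumed)
t = np.arange(0, 1800, 1 / fs)
rng = np.random.default_rng(11)
trace = rng.normal(scale=0.1, size=t.size)

# Crude dispersed wavetrain: longer periods arrive earlier (hypothetical arrivals)
bands = [(0.02, 0.03), (0.03, 0.04), (0.04, 0.05), (0.05, 0.06)]
predicted = {b: 900 + 60 * i for i, b in enumerate(bands)}   # predicted group arrivals, s
for b, ta in predicted.items():
    f0 = 0.5 * (b[0] + b[1])
    trace += np.exp(-((t - ta) / 60) ** 2) * np.sin(2 * np.pi * f0 * t)

tolerance, hits = 100.0, 0
for b in bands:
    bb, aa = butter(2, [b[0], b[1]], btype="bandpass", fs=fs)
    env = np.abs(hilbert(filtfilt(bb, aa, trace)))
    t_max = t[np.argmax(env)]
    hits += abs(t_max - predicted[b]) < tolerance

print(f"{hits} of {len(bands)} bands consistent with the dispersion curve")
```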
For the purpose of monitoring compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), the International Monitoring System (IMS) includes 80 sites with particulate radionuclide samplers, 40 of which also have a noble gas sensor system. The coincidence of radioiodine and radioxenon observations at the co-located systems may offer an opportunity for event screening. This study characterizes typical radioxenon to radioiodine ratios of releases from nuclear power plants (NPPs) and compares these ratios with the signatures that may indicate a nuclear explosion. The study presented here builds on the previous publication on the radioxenon emission inventory from NPPs for the calendar year 2014. The radioiodine emissions of the same reactors will be retrieved, and the distribution of the ratios established for the atomic masses 131, 133 and 135, as well as for the most frequently observed isotopes of these two elements, namely Xe-133/I-131. The purpose of this presentation is to investigate whether these radioxenon to radioiodine ratios can be used for screening methods and to enhance understanding of the impact of known sources on the IMS background observations.
Xenon and iodine isotopes are, among other CTBT-relevant radionuclides, major indicators of nuclear explosions. They are therefore globally monitored by the International Monitoring System, using different technologies, to verify compliance with the CTBT. Xenon isotopes are intermediate decay products of radioiodine. If originating from the same source, the radioxenon to radioiodine ratio of isotopes from the same mass chain may be suitable for additional screening of events. This study investigates the radioxenon to radioiodine isotopic activity concentration ratios as they occur in samples of co-located noble gas and particulate systems that significantly overlap in sampling time. These IMS observations are compared to ratios of the same isotopes for different sources that may be observed: simulations of nuclear explosions and nuclear facilities (NPPs and medical isotope production facilities, MIPFs), as well as empirical data from published reports both for historic nuclear tests and for releases from nuclear facilities. Based on this comparison, conclusions can be drawn on the usefulness of radioxenon to radioiodine ratios for event screening in CTBT monitoring.
The US National Data Centre (NDC) observed five relatively large seismic events in the Chhattisgarh Province of India from September 2018 through October 2022. These events are of interest as they exhibit some explosive characteristics, and their respective yields are approximately equivalent to a 1 kiloton underground nuclear explosion. The events occurred near a known coal mine that uses retreat style mining, which has been known to cause catastrophic collapses at other mining areas. The US NDC conducted Interferometric Synthetic Aperture Radar (InSAR) analysis in concert with waveform correlation and cluster analysis to show that these events can likely be attributed to randomly occurring collapses at the mine. Further, the radius of the InSAR observed surface deformation was compared to the theoretical cavity/chimney radius of an equivalent size underground nuclear test. This comparison excludes the possibility of the events being nuclear related.
Impulsive, low frequency electromagnetic signals were observed during historic United States underground nuclear tests. The source of these signals is uncertain, though a prime candidate is the so-called "magnetic bubble". This mechanism creates a magnetic signature when the hot plasma shell from the explosion expands in the Earth's magnetic field. At low frequencies, this signal can diffuse to the surface where it can be detected. We have conducted experiments with an underground synthetic source at the Nevada National Security Site to emulate possible magnetic bubble signals and to investigate propagation through the intervening overburden. The synthetic source has a peak magnetic dipole moment of 100 kA-m2. The signals from the source can be detected at ranges of several hundred meters through the saturated tuff overburden using commercially available induction magnetometers and capacitive electric field sensors. The principal noise sources in our experiments have been anthropogenic noise (utility 60 Hz and harmonics) and impulsive atmospheric noise.
Detection of radionuclides released from a nuclear explosion is an essential task performed by the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) as mandated under the Comprehensive Nuclear-Test-Ban Treaty. Identifying possible source regions for relevant radionuclide observations, and identifying potential stations for measuring releases from known source locations, is done by atmospheric transport modelling (ATM). The CTBTO is currently investigating the potential benefits of using high resolution ATM (HRATM). Past announced underground nuclear tests at the Punggye-ri Nuclear Test Site of the Democratic People's Republic of Korea are used as case studies to gauge CTBTO's capability to identify sites of the International Monitoring System (IMS) that might detect a hypothetical release. These events are also used to assess the capability to locate Punggye-ri as the possible source location. The current study evaluates the performance of CTBTO's HRATM approach compared to previous results. Variations in the spatial resolution of meteorological input data (0.5° to 0.01°), in the meteorological models (European Centre for Medium-Range Weather Forecasts (ECMWF) and Weather Research and Forecasting Model (WRF)), in the ATM models (Flexpart and Flexpart-WRF) and in the physical parameterizations demonstrate the sensitivity to configuration. Evaluating the potential increase in accuracy using metrics from previous ATM challenges shows what enhancements can be achieved with HRATM and which configuration works best.
The framework developed and presented here provides a new and transportable method to determine the yields of seismically recorded underground nuclear explosions. The key advantage of this method over other methods which estimate absolute explosive yields is that this method does not require any a priori calibration and can be immediately applied to any region of interest. This method uses the source information obtained from the spectral ratios of envelopes of measured seismic coda waves to simultaneously invert for the source parameters describing a set of seismic events including both explosions and earthquakes. The only requirement of this method is that the source region must contain several (>3) seismic sources which have been well recorded (SNR > 2) by a set of shared stations. For regions with only a few events and/or extremely band-limited observations, for which the depths of burial are unknown, an earthquake with a known magnitude may be required to obtain accurate and robust yield estimates. We apply this method to the six declared Democratic People’s Republic of Korea nuclear tests and report new independent absolute yield and depth of burial estimates which are commensurate with previously determined source parameters.
The International Data Center (IDC) at the Comprehensive Nuclear-Test-Ban Treaty Organization routinely analyses data from radionuclide measurement stations of the International Monitoring System (IMS) in support of its mandate. The Standard Screened Radionuclide Event Bulletin (SSREB) is an IDC product generated for each radionuclide sample having measured concentrations of CTBT-relevant radionuclides above abnormal thresholds. In addition, States Parties can request an expert technical analysis (ETA), in which IDC experts perform an in-depth analysis of IMS data, with the possibility of including additional data, to produce the State Requested Methods Report (SRMR). To facilitate radionuclide ETA, automatic procedures have been developed that assist experts in sample selection and event association based on the consistency of isotopic ratio evolution, utilizing measurements from multiple radionuclide samples. The implementation of automatic procedures enables experts to gain an improved understanding of release events and assists them in producing the SRMR more quickly. Furthermore, automatic processing can identify sample sets that can be used to validate new methodologies. In this presentation we describe the algorithms developed for the automatic procedures and show preliminary results. Building on these developments, we plan to improve the automatic processing algorithms to contribute to other IDC products such as the SSREB.
Four technologies are used for compliance verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT): hydroacoustic, infrasound, seismic and radionuclide monitoring. Forty radionuclide stations will have radioxenon monitoring capability. In recent years, much research has been done to discriminate between radioxenon derived from an explosion and radioxenon from anthropogenic sources. Explosion data are rare, since few nuclear tests have been conducted and little radioxenon was released from underground. Civil facilities also emit large amounts of radioxenon, and stations near those facilities have a high radioxenon background. It is difficult to discriminate between background events and background-plus-explosion events. In this work, radioxenon monitoring data sets were modelled based on International Monitoring System xenon monitoring data from the past decade and on the xenon ratios emitted by a nuclear explosion. Dozens of features, including the four radioxenon isotope concentrations, xenon detection flags and xenon ratios, were analysed. We used the CatBoost algorithm to establish a two-class classification model.
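A minimal sketch of such a two-class CatBoost model on synthetic radioxenon features is shown below; the feature construction, class balance and enhancement factors are assumptions for illustration and do not represent the modelled IMS data set.

```python
# Minimal sketch of a two-class CatBoost model on synthetic radioxenon features
# (four isotope concentrations plus two ratios); not the authors' feature set.
import numpy as np
from catboost import CatBoostClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 2000
background = rng.lognormal(0.0, 0.6, size=(n, 4))                 # Xe-131m, Xe-133, Xe-133m, Xe-135 (synthetic)
signal = background.copy()
signal[:, 2:] *= rng.uniform(2.0, 6.0, size=(n, 2))               # explosion-like enhancement (hypothetical)

def features(x):
    # Append two illustrative ratios: Xe-133m/Xe-133 and Xe-135/Xe-133
    return np.column_stack([x, x[:, 2] / x[:, 1], x[:, 3] / x[:, 1]])

X = np.vstack([features(background), features(signal)])
y = np.concatenate([np.zeros(n), np.ones(n)])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

model = CatBoostClassifier(iterations=300, depth=6, learning_rate=0.1, verbose=0)
model.fit(X_tr, y_tr)
print(f"Hold-out accuracy: {model.score(X_te, y_te):.3f}")
```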
Data from the International Monitoring System (IMS) stations of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) are processed automatically, associated by Global Association, and then interactively analysed and reviewed, resulting in the International Data Centre (IDC) bulletins. The Network Processing Vertically Integrated Seismic Analysis (NET-VISA) is a Bayesian seismic monitoring system designed to process IMS data in order to reduce the number of missed and false events in the automatic processing. NET-VISA has been implemented as an additional event scanner in operations since January 2018. In this study we assess the effect of the NET-VISA automatic scanner on the number of events in the IDC bulletins, the Late Event Bulletin (LEB) and the Reviewed Event Bulletin (REB). In particular, the impact of the NET-VISA scanner on the number of scanned events during interactive analysis is assessed. We use three distinct time periods to evaluate the NET-VISA performance as well as the effect of other possible factors, such as global seismicity and network performance. The results show a 4.6% increase in the number of LEB events after including the NET-VISA scanner in operations, with an average of seven events per day, and a notable increase of 17.90% in the number of scanned events.
We determined root mean square (RMS) amplitudes of Lg waves from the six known underground nuclear tests conducted by the Democratic People’s Republic of Korea. We analysed waveform data from a dozen seismographic stations situated in the Republic of Korea, the People's Republic of China and the Russian Federation operated by the International Monitoring System, Global Seismic Network, and the Korea Meteorological Administration. The RMS Lg amplitude measurements indicate consistency between the stations situated in the continental crust. The measured RMS Lg amplitude at pairs of stations shows stability with an interstation standard deviation of as low as 0.03 magnitude units for the six explosions. The RMS Lg amplitude measurements on vertical records are well correlated to the teleseismic body wave magnitude mb of the six explosions with a very small standard deviation ranging from 0.03 to 0.08 magnitude units. The stability of RMS Lg amplitudes suggests that it can be used to estimate the yield of the Democratic People’s Republic of Korea's nuclear tests.
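For illustration, the sketch below computes a noise-corrected RMS amplitude in an assumed Lg group-velocity window on a synthetic trace; the window limits, distance, sampling rate and trace are hypothetical and are not the measurement parameters used in the study.

```python
# Sketch of a noise-corrected RMS Lg amplitude measurement in a group-velocity
# window; all numbers are illustrative only.
import numpy as np

def rms_lg(trace, fs, t0, distance_km, vmin=3.0, vmax=3.6, noise_window=(0.0, 60.0)):
    """RMS amplitude in the Lg group-velocity window, corrected for pre-event noise power."""
    t = np.arange(trace.size) / fs
    lg = trace[(t >= t0 + distance_km / vmax) & (t <= t0 + distance_km / vmin)]
    noise = trace[(t >= noise_window[0]) & (t <= noise_window[1])]
    return np.sqrt(max(np.mean(lg ** 2) - np.mean(noise ** 2), 0.0))

# Synthetic example: 400 km epicentral distance, origin at t0 = 70 s into the record
fs, t0, dist = 20.0, 70.0, 400.0
t = np.arange(0, 400, 1 / fs)
rng = np.random.default_rng(9)
trace = rng.normal(scale=1.0, size=t.size)
t_lg = t0 + dist / 3.3
trace += 5.0 * np.exp(-((t - t_lg) / 10) ** 2) * np.sin(2 * np.pi * 1.0 * t)

print(f"RMS Lg amplitude: {rms_lg(trace, fs, t0, dist):.2f}")
```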
Isotopic ratios of radioxenon can be used to verify the occurrence of an underground nuclear explosion (UNE). Simplified analytical models and closed-form solutions using the Bateman equations simulate an idealized radioactive decay/ingrowth chain in a closed and well mixed system. The partitioning of the radionuclide inventory between the gas phase and the rock melt created by the detonation, and the gas transport from the cavity to the host rock or ground surface, are not addressed properly. Either subsurface transport or prompt release, whichever is principally responsible for the gas signatures, is inconsistent with the simple closed-system assumption. In this study, a realistic model of post-detonation cavity processes was developed. A closed-form solution representing time dependent source term activities is extended by considering the cavity partitioning process, slow seepage and/or prompt release of gases from the cavity, and is applied to realistic systems, capturing how these processes influence the evolution of isotopic ratios over the course of UNE histories. A library of radioxenon compositions in the cavity and host rock can be simulated with different parameters, which is used for event discrimination and for estimation of the detonation time from noble gas measurements at International Monitoring System stations. It can also support machine learning based event discrimination.
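As a reference point for the closed-system case discussed above, the sketch below evaluates the two-member Bateman solution for the Xe-133m to Xe-133 pair and prints the evolving activity ratio; the initial activities are hypothetical, and the extended cavity-partitioning model described in the abstract is not reproduced here.

```python
# Sketch of the closed-system Bateman solution for the Xe-133m -> Xe-133 pair,
# showing how an isotopic activity ratio evolves with time.
import numpy as np

T12_XE133M = 2.198 * 86400        # s, half-life of Xe-133m
T12_XE133 = 5.2475 * 86400        # s, half-life of Xe-133
LAM_M = np.log(2) / T12_XE133M
LAM_G = np.log(2) / T12_XE133

def activities(t, a_m0, a_g0):
    """Activities of Xe-133m and Xe-133 at time t (s) in a closed, well mixed system."""
    a_m = a_m0 * np.exp(-LAM_M * t)
    # Daughter: decay of the initial inventory plus ingrowth from the metastable state
    a_g = (a_g0 * np.exp(-LAM_G * t)
           + a_m0 * LAM_G / (LAM_G - LAM_M) * (np.exp(-LAM_M * t) - np.exp(-LAM_G * t)))
    return a_m, a_g

t = np.linspace(0, 15 * 86400, 16)             # 0 to 15 days
a_m, a_g = activities(t, a_m0=1.0, a_g0=10.0)  # hypothetical initial activities (Bq)
for ti, r in zip(t / 86400, a_m / a_g):
    print(f"day {ti:4.1f}: Xe-133m/Xe-133 = {r:.3f}")
```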
Radionuclide stations in the International Monitoring System (IMS) network routinely collect air samples and assess activity concentrations. Activities collected in samples are often caused by emissions from nuclear facilities, but they could also indicate a noble gas release from an underground nuclear explosion. Discrimination can be performed by estimating and analysing activity ratios of CTBT-relevant radioxenon isotopes under assumed scenarios. One of the issues in the isotopic ratio estimation is whether the contribution of the radioxenon background at IMS stations needs to be subtracted. This work will investigate the impact of the radioxenon background subtraction on the discrimination of a nuclear release event. Simulations are performed with atmospheric transport modelling to determine the concentrations originating from hypothetical radioxenon releases of pre-defined underground nuclear explosions distributed over a global semi-regular grid at different times of the day. The latter are studied independently and in the form of synthetic concentrations on top of real observations to account for the radioxenon background. The ratios of detected radioxenon isotopes are compared between the real IMS observations (typical radioxenon background from 2014), the simulated concentrations from hypothetical nuclear explosion sources (pure signals without radioxenon background) and the synthetic ones.
On 3 September 2017, the Democratic People’s Republic of Korea carried out its latest and largest nuclear test (DPRK6). Recent efforts have sought to improve understanding of this and previous nuclear tests using seismic waveform and synthetic aperture radar (SAR) geodetic deformation data (e.g. Wei 2017, Chiang et al. 2018, Myers et al. 2018, Wang et al. 2018). In our previous work (Chi-Durán et al. 2021) we performed a joint regional waveform, first-motion polarity, and surface displacement inversion, demonstrating improved discrimination of the source type of the event. In this work, we apply that joint inversion to DPRK6 and an earlier event (DPRK4, January 2016) using a layered velocity model based on satellite observations of discontinuities and inferred lithology (e.g. Pabian and Coblentz, 2015) and on published ranges of seismic velocity for the inferred lithology. We find that the layered velocity model improves the recovery of source depth, and in both cases the joint inversion provides better discrimination of the source type and better constrains the scalar seismic moment needed for downstream yield estimation.
Since 2008, the CTBTO has been conducting temporary radioxenon measurement campaigns using transportable systems. These campaigns focus on improving the performance of the verification system described in the Treaty. In 2017, the Government of Japan made a voluntary contribution to boost CTBTO capabilities to detect nuclear explosions. In early 2018, two transportable noble gas systems were deployed in Japan, where, together with the International Monitoring System (IMS) system JPX38, they form a high density network. To date, a few thousand samples have been measured by the two transportable systems. Measurement spectra are automatically sent to the International Data Centre, processed in a non-operational database and reviewed with a focus on the four CTBT-relevant xenon isotopes. The reviewed data are used to improve methods and methodologies for understanding the radioxenon background that are applicable to any IMS noble gas system. Possible source regions (PSR) and the consistency of the evolution of isotopic ratios in samples collected by the high density network were estimated first, and the impact on isotopic ratio consistency and on both standard and consistency-specific PSRs was investigated. Further applications include isotopic ratio screening, Xe-135 observations, and improved concentration estimates from known sources.
The main purpose of this study is to apply discrimination methods, based on various seismic signal processing algorithms, to distinguish the known nuclear tests conducted by the Democratic People’s Republic of Korea from the surrounding natural seismicity. To characterize the Democratic People’s Republic of Korea's seismic activity between 2006 and 2022, spectral analysis, waveform cross-correlation techniques and amplitude ratios of Pg/Lg and Pn/Sn waves computed in different frequency bands were applied. For the selected events, we analysed waveforms recorded by seismic stations at epicentral distances of up to 70 degrees. The Democratic People’s Republic of Korea's nuclear tests were relocated using relative location algorithms to validate and calibrate these methods. Additionally, we investigated continuous recordings of nearby stations (epicentral distances below three degrees) using waveform cross-correlation to identify possible microearthquakes induced by the Democratic People’s Republic of Korea's nuclear tests. The proposed discriminants successfully separate tectonic events from nuclear tests, but their success rate is strongly influenced by epicentral distance. The relocations of the Democratic People’s Republic of Korea's nuclear tests differ by up to 5 km from the results of previous studies, indicating a good accuracy of the proposed methods. These findings represent a first step towards the development of future artificial intelligence algorithms.
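A minimal sketch of one of the discriminants named above (a bandpassed Pg/Lg peak-amplitude ratio), assuming an ObsPy Trace and placeholder onset times and frequency band; these values are illustrative, not the study's configuration.

import numpy as np

def pg_lg_ratio(trace, t_pg, t_lg, win=10.0, fmin=2.0, fmax=8.0):
    """log10 Pg/Lg peak-amplitude ratio in one frequency band.

    trace      : ObsPy Trace (vertical component)
    t_pg, t_lg : phase onset times (obspy.UTCDateTime), e.g. from a bulletin
    win        : window length (s) after each onset; 2-8 Hz band and 10 s
                 windows are placeholders
    """
    tr = trace.copy()
    tr.detrend("demean")
    tr.filter("bandpass", freqmin=fmin, freqmax=fmax, corners=4, zerophase=True)
    a_pg = np.max(np.abs(tr.slice(t_pg, t_pg + win).data))
    a_lg = np.max(np.abs(tr.slice(t_lg, t_lg + win).data))
    return np.log10(a_pg / a_lg)

# Explosions tend toward higher P/S-type ratios than earthquakes at high
# frequencies, which is the basis of the discriminant described above.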
The Rock Valley Direct Comparison (RV/DC) project is the third phase of the Source Physics Experiment (SPE). Under RV/DC, two chemical explosions will be detonated near the hypocenters of a sequence of anomalously shallow earthquakes on the Nevada National Security Site. The explosions (nominally 1,000 and 10,000 kg TNT-equivalent) will be recorded by an extensive array of multi-physics sensors. This will be the first direct comparison of earthquake and explosion signatures originating from nearly identical hypocenters. The direct comparison will enable researchers to exploit the physical differences between explosion and earthquake sources sharing the same propagation path and recording sensors.
Here, we present the multi-physics sensor network we plan to deploy, which includes seismic, infrasound, distributed fiber optic and GPS stations. These instruments will be deployed on the Earth’s surface, in deep boreholes, and on aerial balloon platforms. A full distribution of propagation ranges will be covered, from near-source (tens of meters) to regional (hundreds of kilometers) distances. We anticipate these signals will provide key metrics in next generation monitoring systems.
There are several mineral deposits on the territory of the Semipalatinsk Test Site, and active mining is under way at some of them. Most blasts are conducted at the Karazhyra coal quarry. Three IMS stations, Makanchi (PS23), Borovoye (AS057) and Kurchatov-Cross (AS058), record these quarry blasts at quarry-to-station distances of 452 km, 668 km and 68 km respectively. Over 19 years of observation, about 2800 quarry blasts were recorded. Analysis of the routine processing results from the automated and interactive bulletins of the Kazakhstan NDC showed that the field of obtained epicenters exceeds the quarry size. To understand the reasons for this scatter, the waveforms were processed in detail using the Geotool and DTK-GPMCC software. It was revealed that the azimuth values of the different regional phases Pn, Pg, Sn and Lg differ from each other and have different dispersion, with systematic azimuth deviations for some phases. A clear dependence of epicenter accuracy on the energy (yield) of the blasts was noted, and human factors influencing the estimation accuracy were also identified. Recommendations on processing were given to the KNDC analysts to improve the location accuracy and discrimination of seismic events.
We introduce the Central and Eastern European Infrasound Network (CEEIN) established in 2018 as a collaboration between the Zentralanstalt für Meteorologie und Geodynamik, Vienna, Austria; the Institute of Atmospheric Physics of the Czech Academy of Sciences, Prague, Czech Republic; the Research Centre for Astronomy and Earth Sciences of the Eötvös Loránd Research Network, Budapest, Hungary; the National Institute for Earth Physics, Magurele, Romania; and the Main Centre of Special Monitoring of the National Center for Control and Testing of Space Facilities, State Space Agency of Ukraine. Waveform data of the CEEIN stations are archived at the NIEP EIDA node and can be downloaded from www.ceein.eu.
We demonstrate that CEEIN improves infrasound event detection capabilities in Southern and Eastern Europe, and show that adding infrasound observations to seismic data in the location algorithm improves location accuracy. We identify coherent noise sources observed at CEEIN stations. We present the biannual CEEIN bulletin of infrasound and seismo-acoustic events, our contribution to the European infrasound catalogue. Many of the events in the CEEIN bulletin are ground truth events that can be used in the validation of atmospheric models.
The National Data Center of Kyrgyzstan monitors seismic events of various natures on a daily basis, most of which are tectonic earthquakes and quarry blasts. Seismic stations in Kyrgyzstan also occasionally register unusual phenomena associated with exogenous geological processes, such as landslides, snow avalanches, rockfalls and mudflows. Landslides and snow avalanches can be caused by tectonic processes, by geological, geomorphological and hydrogeological conditions, by climate change, and by a complex of anthropogenic factors. Using the records of Kyrgyzstan’s seismic networks, we studied the waveform features of a powerful landslide of 12.825 million cubic meters that occurred on 30 November 2019 at ~23:43 GMT in the area of the Kumtor gold deposit, and of another landslide of ~1 million cubic meters that occurred on 14 September 2020 in the area of the Kara-Keche coal deposit. We also studied seismic records of a glacier collapse of ~2 million cubic meters in the area of the Juuku gorge on 8 July 2022 at ~08:44 GMT. It is shown that the landslides in the areas of the Kara-Keche and Kumtor deposits were caused by anthropogenic activity, while the glacier collapse was caused by climate change.
Over the past ten years of observations by the seismic monitoring network of the National Nuclear Center of the Republic of Kazakhstan, seismic events of differing nature have been registered on the territory of central Kazakhstan. The largest number of these events are quarry explosions at deposits of solid minerals, but there are also tectonic earthquakes and natural-technogenic events induced by prolonged explosive activity at the mining fields. Although earthquakes are rarely recorded in such ‘weakly seismic’ areas, quite strong earthquakes can occur there, causing damage to infrastructure near their sources. The paper presents materials on the study of the nature of these seismic events and their relationship with the deep structure of the environment, the tectonics of central Kazakhstan and the distribution of anomalies in geophysical fields.
In recent years there have been several tragic accidents at military depots in the south of Kazakhstan (Arys, Taraz) and in Uzbekistan (Malik), resulting in destruction of buildings, injuries and loss of life. All of these explosions were recorded by Kazakhstan's seismic stations. The closest station to the epicenters in all cases was the Karatau seismic array, part of the monitoring system of the National Nuclear Center of the Republic of Kazakhstan. Using the data of this station, the explosions were located, energy parameters were calculated and the waveforms were analysed in detail. Each of the three cases had a different “scenario” of explosion sequence. In Arys, there were more than 30 blasts with energy class (K) ranging from 4.8 to 7.8; in Taraz there were 17 blasts, with K ranging from 4.3 to 8.1; in Uzbekistan there was one explosion with K=7.8. The wave pattern and spectral density were analysed using f-k analysis, DTK-GPMCC and Geotool. Knowledge of the exact epicenter locations allowed the blasts to be treated as ground truth events. The location accuracy achievable using the data of a single seismic array was estimated, and the regional travel-time curve of seismic waves for southern Kazakhstan was refined.
Two seismoacoustic events occurred on 26 September 2022 close to Bornholm Island in the Baltic Sea. Both events were listed in the Reviewed Event Bulletin of the CTBTO as they were detected by the International Monitoring System. As the events were classified as critical events by the Austrian National Data Centre, the available data were analysed in detail. The earlier event, at 00:03 UTC with a published body wave magnitude mb 3.1, was recorded at three seismic stations and at the infrasound station IS26. The later event, at 17:03 UTC with a body wave magnitude mb 3.2, was recorded at seven seismic stations and at the two infrasound stations IS26 and IS43; it was also detected at the Austrian infrasound station ISCO.
A first assessment of the data, analysed using DTK-GPMCC, pointed to the explosive nature of the events on account of the exceptional infrasonic detections. Further analysis was carried out using the waveform analysis tools available within NDC-in-a-Box (GeotoolQt and Geotool). Seismic data of the events of 26 September 2022 were compared with data from a natural seismic event that occurred in Croatia on 1 October 2021, and the explosive character of the events was confirmed.
Previous work has demonstrated the potential of infrasound data for constraining earthquake source properties (Hernandez 2016; Shani-Kadmiel 2018, 2021; Averbuch 2020). Typically, the applied approaches are based on comparing modelled and recorded signals. In the current study we seek to constrain the source depth and moment tensor using infrasound data from local stations and seismic data from local and regional stations. We use data from the 2020 Kiruna mine quake, whose local and regional infrasound signatures have been presented at previous venues. This event is one of the largest Scandinavian mining-induced earthquakes. It produced signals recorded by three infrasound arrays at distances of 7 km (KIR), 155 km (IS37) and 286 km (ARCI). Our recent studies show that: (1) the full moment tensor estimated from the seismic data and source-type analysis indicate that this event has collapse features; (2) comparison of simulated seismoacoustic data for various possible source depths using SPECFEM-DG shows that the local infrasound data help to constrain source depth; and (3) simulated infrasound signals computed with the full moment tensor solution estimated in this study agree better with the observations than those computed with the GCMT full moment tensor solution.
We introduce infrasound data products of all certified International Monitoring System infrasound stations for scientific studies and applications. We have reprocessed the IMS infrasound waveform data of the last 20 years using the Progressive Multi-Channel Correlation (PMCC) method, configured with one-third-octave frequency bands between 0.01 and 4 Hz. From the comprehensive detection lists we derived four products for each of the 53 stations. These cover different frequency ranges and temporal resolutions, and thus different sources. The low-frequency product (0.02–0.07 Hz, 30 min) primarily covers mountain-associated waves. The second product mainly reflects the spectral peak of microbaroms (0.15–0.35 Hz, 15 min). Higher frequencies of microbaroms and other sources are summarized in the third product (0.45–0.65 Hz, 15 min). Observations with centre frequencies between 1 and 3 Hz (5 min) form the high frequency product. Our intention for these data products is to facilitate the use of this unique global infrasound dataset for scientific applications. The products open up the IMS observations to user groups who do not have access to IMS data or are unfamiliar with data processing using the PMCC method. We demonstrate the data products using recent global atmospheric infrasound sources, such as volcanic eruptions and ocean ambient noise.
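For orientation, one-third-octave centre frequencies follow f_n = f_ref * 2**(n/3); the short sketch below spans roughly 0.01 to 4 Hz with illustrative values, not the exact PMCC band configuration used for the products.

import numpy as np

# One-third-octave centre frequencies and band edges; f_ref and the index
# range are illustrative assumptions.
f_ref = 0.01
centres = f_ref * 2.0 ** (np.arange(0, 27) / 3.0)
edges_lo = centres * 2.0 ** (-1.0 / 6.0)
edges_hi = centres * 2.0 ** (1.0 / 6.0)
print(centres[[0, -1]])   # ~0.01 Hz ... ~4.06 Hz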
Starting in 2009, three infrasound stations have been deployed on Romanian territory by the National Institute for Earth Physics: (1) IPLOR (central Romania); (2) BURARI (northern Romania), under cooperation with AFTAC (USA); and (3) I67RO, a temporary PTS portable array (western Romania) operated as a two-year experiment (2016-2018) within a collaboration project with the Provisional Technical Secretariat. Data recorded by these stations are continuously processed and analysed at the Romanian National Data Centre using two infrasound detection software packages, DTK-GPMCC and DTK-DIVA, packaged within NDC-in-a-Box. A significant set of detected infrasound signals could be associated with ground truth events located by the International Data Centre (REB, LEB) and by seismological centres (EMSC, ISC). Further details were obtained from the fireball database published by CNEOS/JPL. The main types of ground truth sources observed with the Romanian infrasound arrays are bolides, earthquakes, chemical/accidental/military explosions, volcanic eruptions, and anthropogenic activity (mining, sonic booms). The collected information summarizes the ground truth source, the Romanian infrasound stations detecting the event, predicted infrasound arrivals, associated infrasound detections, and detection plots related to the event, and is accessible via an HTML interface. The geographical distribution of the ground truth events can be plotted in Google Earth using a KML file containing the event locations.
International Monitoring System hydroacoustic stations have great potential for submarine earthquake monitoring in the southern hemisphere, as their hydrophones record low frequency sound from extensive areas. However, seismic signals from different areas can have similar characteristics within a short time interval, and it is difficult to locate an epicenter using only the back azimuth and arrival time from a single hydrophone triplet. Therefore, a submarine seismic location model was built based on the time delay between the P-wave and the T-wave at a hydrophone, from which the sound propagation distance is obtained. Arrival times were extracted based on the Akaike information criterion. Combined with the back azimuth, the event time and location can be acquired from a single triplet. Furthermore, the proposed method is applicable to the identification of hydrophone signals against an existing seismic bulletin, and the correspondence between signals and earthquakes contributes to the construction of datasets for AI-based seismic signal detection. Over 940 submarine earthquakes (≥ M 4.0) recorded from 2014 to 2020 by the two hydrophone triplets of International Monitoring System hydroacoustic station HA03, located at the Juan Fernandez Islands, were located using the proposed method, and the locations were verified against the USGS bulletin.
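The range step can be illustrated as follows, with assumed average propagation speeds (not the study's calibrated values): the T-minus-P delay dt maps to range R = dt * vP * vT / (vP - vT).

V_P = 8.0    # assumed average P-wave speed through the solid earth (km/s)
V_T = 1.48   # assumed T-wave (SOFAR channel) sound speed (km/s)

def range_from_pt_delay(dt_seconds):
    """Source-receiver range implied by the T-minus-P arrival time delay,
    assuming straight paths and the average speeds above."""
    return dt_seconds * V_P * V_T / (V_P - V_T)

# e.g. a 300 s delay corresponds to roughly 545 km
print(range_from_pt_delay(300.0))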
In this study we aim to discriminate hydroacoustic sources by identifying, on the one hand, source-typical signal features derived from event catalogues and, on the other hand, by characterizing waveform similarities using clustering analyses of raw hydroacoustic data. We make use of event bulletins derived from the application of the Progressive Multi-Channel Correlation method to International Monitoring System hydrophone data and from Reviewed Event Bulletin (REB) entries with hydrophone stations involved. We apply a general source discrimination by quantifying signal parameters such as frequency content, time duration, waveform shape and signal amplitude, and categorize sources using signal properties previously identified in the literature. We further compare the raw waveforms of International Monitoring System hydrophone data during time segments of event detections and apply clustering analyses to separate waveform families by their similarity to each other and to certain reference events. We also propose including additional information from propagation modelling, bathymetry and ocean conditions to explain the observed event signatures and to connect them to different hydroacoustic sources and source regions. We finally compare our results with other approaches to signal discrimination and classification and try to estimate the precision and sensitivity as well as the general feasibility and usefulness of our method.
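A minimal sketch of one way to cluster waveforms by similarity, assuming pre-aligned, equally sampled traces and using a zero-lag correlation matrix with hierarchical clustering; this is a simplification, not the authors' implementation (which may search over lags).

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def correlation_clusters(waveforms, threshold=0.3):
    """Group event waveforms into families by similarity.

    waveforms : 2-D array (n_events, n_samples), pre-aligned (assumption)
    threshold : 1 - correlation coefficient at which clusters are cut
    """
    cc = np.corrcoef(waveforms)               # zero-lag correlation matrix
    dist = 1.0 - np.clip(cc, -1.0, 1.0)       # turn similarity into distance
    np.fill_diagonal(dist, 0.0)
    tree = linkage(squareform(dist, checks=False), method="average")
    return fcluster(tree, t=threshold, criterion="distance")

# Toy usage with random traces (real input would be detection segments).
rng = np.random.default_rng(0)
labels = correlation_clusters(rng.standard_normal((8, 500)))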
According to the Protocol to the Comprehensive Nuclear-Test-Ban Treaty, the International Data Centre is required to conduct expert technical analysis (ETA) and special studies to improve event parameters and to assist States Parties in identifying the source of specific events. Source mechanism and event depth are two parameters that may be crucial for the event discrimination task. To introduce them into Provisional Technical Secretariat commissioning, we have conducted testing and tuning of the ParMT branch of the SHI-ETA suite. The ParMT software is a new tool for determining the depth of shallow seismic events and characterizing their source parameters. Source properties are estimated via a grid search over moment tensors: following Tape and Tape (2015), we uniformly discretize the moment tensor space and then determine the optimal moment tensor, magnitude and depth by comparing observed seismograms with synthetic waveforms. The software solves for the moment tensor for the major source types (ISO, DC, CLVD), a range of source-receiver distances, and both body and surface waves. A powerful user interface covers the whole analyst work cycle, from picking arrivals and polarities to final report generation. For ETAs, International Monitoring System and non-International Monitoring System data can be used jointly in processing.
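Purely as an illustration of the grid-search idea (and not the ParMT code): precomputed synthetics for a set of candidate moment tensors and depths are compared with observed waveforms under an L2 misfit, with a least-squares amplitude scale absorbing the scalar moment.

import numpy as np

def grid_search(observed, synthetics):
    """Pick the best-fitting source from precomputed synthetics.

    observed   : array (n_traces, n_samples)
    synthetics : dict mapping (moment_tensor_index, depth_km) ->
                 array (n_traces, n_samples); assumed precomputed with a
                 1-D velocity model, which is an assumption of this sketch
    """
    best_key, best_misfit = None, np.inf
    for key, syn in synthetics.items():
        scale = np.sum(observed * syn) / np.sum(syn * syn)   # scalar moment
        misfit = np.sum((observed - scale * syn) ** 2)
        if misfit < best_misfit:
            best_key, best_misfit = key, misfit
    return best_key, best_misfit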
The National Data Centre of Côte d’Ivoire and the PTS installed a portable infrasound array (I68CI) in the Comoé National Park in north-east Côte d’Ivoire from mid-January to December 2018. This mobile station comprises four elements and uses MB3d microbarometers sampling at 50 Hz. The main objective of this one-year deployment was to better understand and characterize regional infrasound sources. Some regional and local infrasound sources are detected by both the mobile array (I68CI) and the fixed array (I17CI), among them the Mbembele (Mali) and Tongon (Côte d’Ivoire) mining activities. Hydroelectric power dam activities (Buyo, Soubre and Taabo) are also detected in Côte d’Ivoire. The splitting characteristics of the tropical Mesoscale Convective System of 9 April 2018 over northern Ghana are also investigated using data from the mobile array (I68CI).
The impact of wave propagation effects on the performance of the P/S ratio local discriminant is being evaluated during the third phase of the Source Physics Experiment, the Rock Valley Direct Comparison (RV/DC), conducted at the Nevada National Security Site. During the experiment a chemical explosion will be detonated near the hypocenter of a shallow earthquake. The direct waveform comparison on a dense network of seismic sensors will enable the investigation of seismic source signatures and discrimination between explosions and earthquakes sharing the same propagation path. We used high-frequency (0-10 Hz) ground motion simulations to emulate the RV/DC experiment in order to investigate the generation and propagation of seismic waves at local distances, and the performance of the P/S source discriminant. The numerical experiments were performed using high-performance computing and a local velocity model with correlated depth-dependent stochastic velocity and density perturbations that are needed for simulating wave scattering in a frequency range of monitoring interest. We found that at local distances the P/S discriminant is strongly affected by the degradation of the radiation pattern of source generated P and S waves due to wave path effects in the shallow crust, and that network averaging improves the overall discriminant performance.
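The network-averaging step can be illustrated with hypothetical single-station measurements: individual log10(P/S) values scatter because of radiation-pattern and path effects, while the network mean is more stable.

import numpy as np

# log10(P/S) measured at individual stations for one simulated event
# (hypothetical numbers, not from the experiment).
station_log_ps = np.array([0.45, -0.10, 0.30, 0.05, 0.55, 0.20])

network_average = station_log_ps.mean()
network_std_err = station_log_ps.std(ddof=1) / np.sqrt(len(station_log_ps))
print(f"network-averaged log10(P/S): {network_average:.2f} +/- {network_std_err:.2f}")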
We investigate historical nuclear tests from the Lop Nor region, China, using waveform-based source inversions and three-dimensional waveform modeling. Despite sparse data distributions and low signal-to-noise ratios, we recover isotropic moment tensor solutions for historical explosions and obtain detailed uncertainty information from likelihood evaluations over magnitude, depth, and the entire space of moment tensor source type and orientation parameters. We identify a one-dimensional (1-D) Western China Earth model that provides improved waveform fits, reduced cycle skipping, and drastically improved uncertainty estimates compared with spherical-average Earth models. Preliminary 3-D forward modelling results show yet further promise for improving source constraints. By exploring the entire source parameter space, we account for tradeoffs that become important for the determination of source type, depth of burial, and yield in sparse monitoring scenarios. Subsequent likelihood-based uncertainty analysis shows the importance of accounting for differences in data variance between body waves, Rayleigh waves, and Love waves and allows for straightforward data fusion between waveform, travel time, and polarity measurements.
The International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) comprises four verification technologies that work in synergy to enable the detection, location, and identification of potential nuclear explosions. One of the four technologies is infrasound, which listens for the corresponding sound waves. As of 2022, fifty-three of the sixty planned IMS infrasound stations are recording the corresponding pressure fluctuations. To support National Data Centres and IDC infrasound analysts, the IDC released the Infrasound Reference Event Database (IRED) on 9 July 2010 to collect, review and document infrasound events of special interest and to archive the data for each event in database tables for training, testing and validation purposes. IRED contains 616 events grouped in 12 categories. In the last 12 years, infrasound stations have detected further events of special interest which could be added to the database, infrasound stations certified after 2010 have recorded additional interesting events, and some existing events in IRED need to be corrected. In this poster we show how IRED could be revised and updated using existing events from the database and newly detected events.
Seismological bulletins from local and regional seismic stations have identified signals associated with mining explosions in the Orapa mining area, Botswana. Here we present the results of integrating detections from regional seismic networks and the International Monitoring System (IMS) network. Using the local bulletins and catalogues, we requested IMS data via the Extended NDC-in-a-Box (ENIAB) and retrieved local station data using the ObsPy library. The combined analysis of seismic and infrasound signals provided additional information about the origin time and location of events and helped distinguish mine explosions from natural events. Infrasound detections were generated using ENIAB's DTK-GPMCC, whereas Geotool was used to analyse the seismic data and to create the final integrated bulletin. Combining the infrasound stations IS47 (South Africa) and IS35 (Namibia) with the local seismic stations improved the locations of these seismoacoustic events.
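A minimal sketch of retrieving waveform data with ObsPy's FDSN client, of the kind described above; the data centre, station code (GT.LBTB in Botswana) and time window are illustrative assumptions, not necessarily the stations or access route used in the study.

from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("IRIS")                       # assumed open data centre
t0 = UTCDateTime("2022-06-01T10:00:00")       # placeholder origin time
st = client.get_waveforms(network="GT", station="LBTB", location="*",
                          channel="BH?", starttime=t0, endtime=t0 + 600)
st.detrend("demean")
st.filter("bandpass", freqmin=1.0, freqmax=8.0)
st.plot()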
More than a century has elapsed since earthquakes were first recorded instrumentally. However, our understanding of magmatic rifting and of the other complex source processes that we observe today was limited by the absence of digital seismic records at the epicentral distances of interest. With the advent of digital broadband instrumentation with wide dynamic range, volcano seismology has become one of the strongest disciplines contributing to the study of magmatism. Over 20 volcano-tectonic earthquake sequences have occurred in Afar and the northern main Ethiopian rift during the last couple of decades, and the corresponding seismic data are recorded by temporary and permanent broadband seismic networks. This shows that there are several active volcanic and fissure eruptive centers in the area which need monitoring, both to study continental rift mechanisms and to mitigate volcanic and earthquake hazards. Moderate size earthquakes from these active volcanoes can be used as reasonable ground truth events to improve regional travel time curves.
The goal of this work is to present a method to help discriminate natural (tectonic) events from artificial events (chemical explosions in quarry blasting) when both sources are co-located. This is a frequent problem where seismicity is triggered by stress release due to material removal in mines. To that end, we are conducting experiments using local infrasound stations to monitor quarry blasting, recording infrasonic signals from the detonations and, when present, infrasonic signals generated by earthquakes. Our work considers two scenarios. In the first, a co-located seismic and infrasound station is installed where there is both mine blasting and natural seismicity (up to 30 km away); we have already obtained some results with this configuration. In the second, which we are in the process of implementing, two infrasound arrays (triangular, with four elements each) and a seismic network will cover an area with both mine blasting and natural events. Our initial finding is that the few induced earthquakes that occurred at an open pit mine did not produce any detectable infrasonic signal, so we were able to discriminate them from the mining blasts occurring at the same spot.
The Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) uses infrasound to measure acoustic impulses in the atmosphere that could be indicative of an atmospheric nuclear test or explosion. However, many other kinds of events are analysed by analysts at the International Data Centre (IDC) and appear in IDC products such as the Reviewed Event Bulletin; examples include gas pipeline explosions, rocket launches, volcanoes, meteors, and many others. In the past several years, two new infrasound stations were added to the network: IS20 in the Galápagos Islands (2018) and IS25 in Guadeloupe (2020). Here, we show examples of the new perspectives that these stations provide on events such as the recent activity of the Sangay volcano and regional rocket launches, respectively. We also describe our experience with the recent (2022) upgrade of the analysis software to a newer version called DTK-GPMCC. By showing these salient events and giving some insight into the software used by analysts, we aim to present a picture of how infrasound analysis is currently done at the IDC.
On 26 September 2022 the International Monitoring System (IMS) detected two seismic events near the Danish island of Bornholm. The events were seen in both seismic and infrasound data and were located near the positions where large gas leaks from the Nord Stream pipelines were later observed. The events were also recorded by seismographs in the surrounding countries and analysed at the National Data Centres. The waveforms exhibited clear properties of underwater blasts, with significant P energy and much smaller S energy, as well as other characteristics not associated with natural earthquakes. Further analysis revealed that the second event in particular may consist of multiple blasts close in time and space. Analysis of the events included the integration of IMS data with regional data from Denmark, Sweden, Germany and Norway, and served as an exercise in collaboration during an international crisis.
This study provides an overview of a civil application of International Monitoring System (IMS) data to assist in disaster mitigation. Two infrasound arrays, together with seismic stations of the IMS, recorded the Baltic Sea gas pipeline explosions on 26 September 2022. Underwater explosions generate air waves and pressure waves that can cause fluctuations in the different layers of the atmosphere. Analysis of the event was carried out using GPMCC to test the operational readiness of these IMS stations. The parameters studied were phase, frequency, magnitude, azimuth and slowness, which were observed to be consistent with theoretical values. The combined interpretation of seismic and infrasound signals was used to obtain the location of the event. The study concludes that, at local and regional distances, the IMS network is operationally ready to contribute data in a timely manner towards a safer environment.
Paraguay, located in the middle of South America, is a country of low seismicity. Nevertheless, its station has been collecting data for a long time, and this continuous operation gives us the opportunity to examine several years of bulletins. Here, five years of International Data Centre (IDC) detections (from 2018 to 2022) were selected from the reviewed and automatic bulletins to break down the scope of the sensor detections and show the distribution of epicenters to which CPUP has been contributing at the IDC. The main purpose of this presentation is to illustrate, in general and quantitatively, how the station has been used for global detection.
Starting in 2010, when the OVSICORI-UNA seismic network was upgraded with 24-bit digitizers and broadband instruments, we selected possible candidate GT5.0 seismic events from the seismic catalogue.
The GENLOC program was used to locate the events and the selection criteria were as follows: 1) minimum number of stations greater than or equal to 30, 2) Gap less than 110 degrees, and 3) minimum number of stations within 30 km greater than or equal to 1. In total, 707 events were selected.
The selected events are mostly in the central part of Costa Rica, with magnitudes between 1.9 and 6.0 Ml and depths ranging from 0 to 200 km.
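A minimal sketch of the selection criteria stated above, applied to a catalogue represented as a list of dictionaries; the field names are placeholders, not the GENLOC output format.

def passes_gt_criteria(event):
    """Selection criteria from the study: at least 30 stations, azimuthal
    gap below 110 degrees, at least one station within 30 km."""
    return (event["n_stations"] >= 30
            and event["gap_deg"] < 110
            and event["n_stations_within_30km"] >= 1)

# Hypothetical two-event catalogue for illustration.
catalog = [
    {"n_stations": 42, "gap_deg": 95, "n_stations_within_30km": 2},
    {"n_stations": 18, "gap_deg": 160, "n_stations_within_30km": 0},
]
selected = [ev for ev in catalog if passes_gt_criteria(ev)]
print(len(selected))   # -> 1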
Landslides along the Inter-American Highway in central Costa Rica, between San Ramon and Esparza, are common, mostly during the rainy season, and are related more to instability due to the state of alteration of the materials on the slopes than to local faults. Saturation of the soil and slopes could accelerate these landslide processes even with earthquakes of moderate magnitude, which could occur both on upper-crust faults and along the subduction zone on the Pacific coast. The local seismic catalogue for the zone shows that earthquakes of low magnitude have been located nearby, the largest being the earthquake of 20 December 2005, at 09:00:45, Ml 3.9. On 17 September at 4:10 p.m. Costa Rican time, the Costa Rica emergency system was informed of a bus accident at Los Chorros de Agua, Cambronero, Santiago de San Ramon, caused by a landslide in which several persons died. Although the landslide was small, it was detected by an infrasound station located on Poas Volcano ~36 km from the source, showing that infrasound stations, as in this case, can be used to characterize small landslides, among other local infrasound sources.
Sandia National Laboratories is developing the Geophysical Monitoring System (GMS) for modernization of the United States National Data Center waveform processing system. The United States is providing the common architecture and processing components of GMS as a contribution in kind to accelerate progress on International Data Centre (IDC) re-engineering. Open source releases of GMS, available on GitHub, have been made annually since 2018. Recently the GMS development effort has focused on interactive analysis capabilities, referred to as IAN. The 2022 GMS open source release includes capabilities to access and display station metadata, waveforms, signal detections, and events (QC mask display is also in development). IAN uses modern web technology, accessed with a common web browser. All displays are fully synchronized with a consistent user experience. The 2022 release also includes a mature, operational quality station state of health (SOH) monitoring capability for CD-1.1 protocol stations, to enhance the ability of system operators to quickly recognize and address station availability and quality issues. GMS is deployed using a cloud-ready Kubernetes containerized platform, hardened for cyber security accreditation. This presentation describes the current GMS interactive analysis and station SOH monitoring capabilities.
The Observatorio San Calixto (OSC) is the National Data Centre for the Plurinational State of Bolivia. It manages the certified seismic stations PS06-LPAZ and AS08-SIV and the infrasound array IS08-BO. The most distant seismic station (AS08-SIV) is about 1000 km from our centre, and the other two facilities are located on the Altiplano, where working conditions are often harsh. In order to obtain the state of health (SOH) of these seismic stations, we built a low cost board based on embedded technology (Raspberry Pi) that allows us to monitor battery voltages, vault temperature, intrusion and hard-disk/USB space, and that can also act as a data buffer. The software is written in Python, with the input/output dependencies obtained from the Raspberry Pi repositories, and the information is sent over the Internet using a 4G dongle from a local telecommunications company. Data are transmitted to the OSC through a VPN and stored in a small MySQL database, and the information is displayed to the operators every three to four minutes. The same method is also used in the national seismic network to help station operators anticipate and solve problems.
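A minimal illustration of this kind of SOH collection, using only the Python standard library; the battery voltage and intrusion readouts, which come from the board's ADC/GPIO in the real system, are left as placeholders, and the sysfs temperature path is Linux/Raspberry Pi specific.

import json
import shutil
import time

def read_cpu_temperature():
    """CPU temperature in deg C (Raspberry Pi / Linux sysfs path)."""
    with open("/sys/class/thermal/thermal_zone0/temp") as fh:
        return int(fh.read().strip()) / 1000.0

def collect_soh():
    disk = shutil.disk_usage("/")
    return {
        "time_utc": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "cpu_temp_c": read_cpu_temperature(),
        "disk_free_gb": disk.free / 1e9,
        "battery_v": None,    # would come from an ADC readout (placeholder)
        "intrusion": None,    # would come from a GPIO switch (placeholder)
    }

if __name__ == "__main__":
    # In the operational setup such a record is sent over a VPN and stored
    # in a MySQL database every few minutes; here it is just printed.
    print(json.dumps(collect_soh()))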
The Comprehensive Nuclear-Test-Ban Treaty (CTBT) prohibits nuclear explosions by anyone, anywhere: on the Earth's surface, in the atmosphere, underwater and underground, and is verified using four key technologies, including radionuclide monitoring. Niger deposited its instrument of ratification of the Treaty with the Secretary-General of the United Nations on 9 September 2002, bringing the total number of ratifications at the time to 94; Niger was the 14th State Signatory in Africa to ratify the Treaty. Under the terms of the Treaty, Niger hosts two International Monitoring System (IMS) facilities: the primary seismic station PS26 in Torodi and the radionuclide station RN48 in Agadez, which is being complemented by a SPALAX noble gas system (NEX48). The RN48 radionuclide station was installed in December 2018 and certified on 26 August 2019. In this poster we describe, step by step, the installation and test phase, the certification process, and the operations and maintenance carried out by local experts.
The IDA/GSN seismic network consists of 40 broadband seismic stations around the globe. Fifteen stations serve as auxiliary seismic stations of the International Monitoring System (IMS). During the COVID-19 pandemic, travel was severely disrupted. However, after considerable adjustment, network data return remained good and operational costs decreased. This was due to several factors: 1) decreased international travel by US based field engineers, 2) increased reliance on local operators to perform more complex operations and maintenance tasks and 3) increased spare equipment on site. Repairs and sensor replacements were conducted by either local station operators, local contractors, or in-country seismic technicians. Advanced training of local operators was based on improved documentation, sometimes with on-site video streaming using cell phone based applications. Difficulties included a wide range of time zones for the US based engineers and cross-language communication combined with sometimes poor audio quality. These difficulties were addressed by developing more extensive step by step documentation and communication improved over time. Benefits consisted of less travel and associated costs by the US engineers and increased capabilities by station personnel. These benefits have continued even as COVID-19 related travel restrictions have been lifted. Recommendations for the future include increased remote training.
The Australian Radiation Protection and Nuclear Safety Agency (ARPANSA) maintains nine International Monitoring System (IMS) radionuclide stations across Australia, Antarctica, Fiji and Kiribati. A significant proportion of these are considered to be remote locations, where access is limited and local support is essential for the ongoing maintenance of the stations. This was particularly challenging throughout 2020, 2021 and 2022, where travel from Australia to many of the stations was not possible. Despite the challenges that are faced, the network under the responsibility of ARPANSA has consistently performed to an extremely high level of uptime and quality since beginning operation of stations in the year 2000. Whilst station performance data will be discussed, the focus of this presentation will be on several areas where ARPANSA has been able to optimize performance, including: core staff dedication and expertise, relationships with local operators and continuous improvement. It is intended that several real life case studies documenting the experience that has been built up over time will be presented to compare and contrast the requirements for successfully maintaining radionuclide stations to a high quality in varied environments.
One of the tasks of the Provisional Technical Secretariat (PTS), with regard to the mandate of the International Monitoring System (IMS) Division, is to support, monitor and conduct maintenance activities. In this presentation an overview will be given of the field missions that the CTBTO radionuclide maintenance team has performed since the pandemic restrictions were lifted. These activities aimed at the sustainment of the stations and the reduction of their downtime. The field missions included the following types of activities: on-site corrective maintenance, preventive maintenance and training. When a problem could not be solved by the station operator or with PTS remote assistance, the PTS conducted a corrective maintenance action. Also, because some preventive maintenance activities or upgrades had been postponed due to COVID-19 restrictions, these postponed activities were carried out on-site. Additionally, on-site training was delivered to station operators, increasing their knowledge of maintenance activities. In this poster presentation, we discuss the activities of the maintenance team in the field and how this work increases the efficiency of the stations and operators, helping achieve a higher data availability.
In the context of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), the CEA/DAM developed the Système de Prélèvement Automatique en Ligne avec l'Analyse du Xénon (SPALAX) about 20 years ago; it is used in the International Monitoring System to detect xenon releases following a nuclear explosion. A New Generation of the system has been successfully developed by the CEA and CEGELEC Défense. CEGELEC commissioned several SPALAX New Generation systems at CEA premises, which were evaluated in France for several months before moving to operations; the system has proved reliable and good results have been observed. New staff have been trained on the system and can now operate it. CEGELEC will describe the latest evolutions of the system following users' lessons learned and provide feedback on its operation.
Immediate identification of system component failures through state of health (SOH) trend analysis allows for fast resolution of International Monitoring System (IMS) station issues. The Radionuclide Operations Support System (ROSS) allows staff of the Provisional Technical Secretariat to display station SOH data, assess current station performance, and notify International Data Centre analysts of critical issues that affect IMS radionuclide station key performance indicators. This presentation will discuss examples of critical equipment failures as observed from SOH data of the various radionuclide technologies as identified using ROSS and some challenges observed in SOH data monitoring using the software.
E-poster session with display of each e-poster on an assigned touchscreen
The role of metrology in improving confidence in IMS measurements
Description: The purpose of this workshop event is to disseminate outcomes from the Infra-AUV project relating to the value of robust calibration practices in sensor networks and the associated practical aspects for maintaining sensor systems. The project will be producing a recommended good practice guide for establishing traceability through the on-site calibration of sensor systems as one of its outputs. The contents of this guide will be featured in this Workshop.
The spread of radiocesium traces is strongly influenced by the circulation of winds and ocean currents around Indonesia. Seawater entering Indonesian marine waters from the Pacific Ocean could distribute Cs-137 from the Fukushima release. Monitoring results show that the concentration of Cs-137 on the western coasts (West Sumatera, Bangka Island, northern and southern Java, and Madura) ranges from 0.12 to 0.66 Bq/m3. Seasonal winds can affect the quantity of trace radiocesium around Indonesia, and currents from the Pacific Ocean entering the Indonesian Throughflow (ITF) can carry radiocesium originating from the Pacific; in the eastern coastal areas (northern and southern Sulawesi, the Lombok Strait and the Flores Sea) concentrations range from 0.12 to 0.39 Bq/m3. Similar results were obtained at other points, and no Cs-134 activity was found at all. According to IAEA (2005), the Cs-134/Cs-137 ratio can be used to identify anthropogenic radionuclide sources in the marine environment; here the ratio was 0. Therefore, the results of our study indicate that the radiocesium input was from global fallout.
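A minimal sketch of the ratio argument, using standard half-lives; the activities and elapsed time are hypothetical, and a measured ratio of zero (no Cs-134) points to aged global fallout rather than a recent release.

import math

T_HALF_CS134 = 2.065   # years
T_HALF_CS137 = 30.1    # years

def decay_corrected_ratio(a_cs134, a_cs137, years_since_release):
    """Cs-134/Cs-137 activity ratio corrected back to an assumed release date."""
    if a_cs137 == 0.0:
        return float("nan")
    corr_134 = a_cs134 * math.exp(math.log(2) * years_since_release / T_HALF_CS134)
    corr_137 = a_cs137 * math.exp(math.log(2) * years_since_release / T_HALF_CS137)
    return corr_134 / corr_137

print(decay_corrected_ratio(0.0, 0.3, 10.0))   # -> 0.0, i.e. no Cs-134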
The Lawrence Livermore National Laboratory, Michigan State University, and NDCs in Central Asia (Kazakhstan, Kyrgyzstan and Tajikistan) digitized seismic bulletins of analog stations between 1951 and 1992. The metadata of all seismic stations of that period were also collected. The national network bulletin data were then combined with the International Seismological Centre bulletin to form a comprehensive bulletin for Central Asia between 1951 and 2017. The bulletin consists of ~400,000 events and will serve as the basis of the new probabilistic seismic hazard analysis for the region.
We present the relocation results of the comprehensive Central Asia bulletin. We selected events with less than 355° secondary azimuthal gap, and using the MSU High Performance Computing facilities, we relocated ~250,000 events with iLoc, a single event location algorithm using regional seismic travel time predictions to improve locations. We developed a hierarchical review strategy to perform a thorough manual review with strict quality control. We show that each stage of the manual review brings incremental improvements. We also identified new Ground Truth and anthropogenic events in the region.
The results show significant improvements in the picture of seismicity, providing reliable input for the PSHA and allowing a more accurate delineation of active seismic zone boundaries.
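The secondary azimuthal gap criterion used in the event selection above (gap below 355 degrees) can be evaluated from station azimuths as in this sketch; it is a generic illustration, not the iLoc implementation.

import numpy as np

def secondary_azimuthal_gap(azimuths_deg):
    """Largest azimuthal gap that remains when any single station is
    removed (the 'secondary' gap used as a selection criterion)."""
    az = np.sort(np.mod(azimuths_deg, 360.0))
    n = len(az)
    if n < 3:
        return 360.0
    worst = 0.0
    for i in range(n):
        lo = az[(i - 1) % n]
        hi = az[(i + 1) % n]
        worst = max(worst, (hi - lo) % 360.0)
    return worst

print(secondary_azimuthal_gap([10.0, 95.0, 200.0, 300.0]))   # -> 205.0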
The National Data Center of Kyrgyzstan and Michigan State University (MSU) are digitizing historical analog seismograms of nuclear explosions that were conducted at the People’s Republic of China (PRC) Lop Nor test site. The methodology for digitizing and data quality control was developed by the MSU team. Of significant importance is the recovery of original station calibrations and amplitude-frequency response enabling the generation of dataless files. This allows data display in units of ground motion and the ability to conduct modern digital processing on the waveforms.
In total, 245 3-component seismograms of 26 atmospheric and underground nuclear explosions were digitized for the period 1966-1996, recorded by 35 stations at epicentral distances of 754-1542 km. For most digitized seismograms we are reliably able to recover data up to 5 Hz.
Based on the records of the seismic stations of the Kyrgyz seismic monitoring system, a comparative analysis was carried out of the wave patterns of underground nuclear explosions conducted at the Lop Nor test site and of tectonic earthquakes that occurred in the same area. Spectral ratios of P- and S-wave amplitudes (e.g., Sn/Pn and Lg/Pg) were measured in the search for effective criteria for the seismic discrimination of underground nuclear explosions and earthquakes.
The “Waveforms From Nuclear Explosions (WFNE)” repository was developed and is maintained by Leidos under DTRA sponsorship. It was built as a trusted data set, starting from the previous data repository “Nuclear Explosion Database (NEDB)” that was accessed in the past by numerous users in the US and international nuclear explosion monitoring community. WFNE includes detailed information (origin, bulletin, other geophysical data) on all 2157 atmospheric, underground and underwater nuclear explosions detonated in the world from 1945 to 2017. Over 70 000 waveforms associated with 677 of the nuclear explosions, ranging from digitized analog recordings for the oldest explosions to recent IMS data, are included in WFNE, together with their station and instrument information as collected from many sources. The web-based access and presentation were updated, modernized and made ready for active user access. Users can search, visualize and download data of interest for their own research. Data continue to be collected from newly identified sources. Recent efforts to rescue pre-digital seismic data via scanning and digitization provide interesting data to be added to WFNE, after completeness and quality checks. WFNE will be open for the research community’s access to source parameter data and associated waveforms from worldwide nuclear explosions.
The observation of atmospheric radionuclides associated with nuclear test explosions has a long history that began in the 1940s. Over the decades of nuclear testing the methods were refined, culminating in the high-quality systems implemented in the International Monitoring System of the CTBTO. The project presented here reviews publications on the off-site monitoring of nuclear tests. Many observations were made at large distances, and nuclear tests were often detected at multiple locations, generally in the same hemisphere. Most publications are associated with atmospheric tests, but observations of nuclear debris from the venting of underground nuclear tests were also found. The frequency of radionuclides in remote atmospheric observations of historic nuclear test explosions is compared with several radionuclide lists considered for nuclear explosion monitoring, to explore how these lists match the historic evidence. Suggestions are made as to how this data set can be used to validate and enhance methods based on radionuclide analysis and related atmospheric transport simulations, with the objective of identifying and characterizing the source of an event of relevance to atmospheric radioactivity monitoring for the Comprehensive Nuclear-Test-Ban Treaty.
During the years of nuclear testing, a network of seismic stations of the Institute of Seismology NAS KR operated on the territory of Kyrgyzstan, whose data are very important for seismic monitoring purposes. Most of these stations were installed far from natural and anthropogenic sources of seismic noise, on bedrock, in specially equipped bunkers and tunnels. In addition, most of the nuclear explosions at the Asian test sites, as well as the peaceful nuclear explosions on the territory of the USSR, were carried out at regional distances from the stations of this network. Metadata for the seismic stations were collected, including coordinates refined by modern methods, amplitude-frequency responses, etc. More than 15 000 seismograms of nuclear explosions and earthquakes were scanned at a resolution of 1200 DPI, and the records are being digitized. From the historical seismograms, a seismic bulletin was created for the records of nuclear explosions from the area of the Lop Nor test site, and magnitudes mb, mpv and MLH and energy class K were calculated. More than 500 seismograms (1966-1996) were processed, and an analysis was carried out of the dynamic parameters of the seismic records of nuclear explosions as a function of explosive yield and explosion environment.
Peaceful nuclear explosions are among the most valuable sources of ground truth events, as they were conducted in a variety of geological environments over a vast territory. Temporary stations were very often deployed to record these events, and in recent years work has been carried out to refine their coordinates using contemporary methods. Using historical analog seismograms from 1965 to 1988, a seismic bulletin of peaceful nuclear explosions conducted on the territory of the former USSR was compiled; in total, more than 1500 seismograms were processed. For this study, the archive records of the CSE IPE AS USSR network were used, whose stations were installed across the entire territory of the former USSR at epicentral distances of 520-6700 km from the peaceful nuclear explosions. Regional travel time curves of the main seismic phases were constructed from calibration source data for different regions of Kazakhstan, and were tested and compared with other regional and global travel time curves for this territory. The dependence of regional magnitudes on explosion yield was also derived and was shown to be strongly affected by the medium in which the explosion was conducted. The obtained results can be used for seismic station calibration and in nuclear test monitoring tasks.
For many regions of the world, a topical issue at present is the prediction of the seismic effects of industrial blasts on infrastructure and adjacent settlements. The IGR conducts this kind of work using historical records of explosions and descriptions of their effects on constructions, as well as data from field experiments recording mining explosions in the regions of mineral production quarries. Records of large chemical and nuclear explosions were collected; for these explosions, typical buildings were specially constructed in the near zone and strong-motion accelerometers were installed on different floors. In Kazakhstan, for instance, such investigations were conducted for the large chemical explosions used in the construction of the mudflow dam near Almaty in the Medeo mountain area (1966, 1967), and for the peaceful nuclear explosion Lazurit (1974) at the Semipalatinsk Test Site. In recent years, using data from field seismic stations installed at copper and rock phosphate deposits, the effect of explosions of different yields on quarry infrastructure and on constructions in the nearest settlements was studied in terms of quantitative ground motion parameters and seismic intensity units. The seismic effect on constructions was quantified as a function of distance and explosive mass.
During the past decades, Air Force Technical Application Center geophysicists, recognizing the importance of preserving analog historic data, have led various internal efforts to rescue and preserve legacy geophysical records. We will discuss the efforts and approaches we use to collect and integrate historical nuclear explosion data into our systems and to leverage its use to support our operational and research needs. The development and use of uniform and systematic quality control measures and metadata standards is an integral part of our prioritization processes for inclusion of event data into a curated reference database. We also developed procedures to identify ‘continuous stations’ (stations that recorded historic events and continue to operate in the present) that may provide long term empirical data. In addition, we will share some of the challenges and lessons learned in our race against time to preserve and maximize the use of this unique, invaluable and irreplaceable resource while facilitating the transfer of technical and scientific knowledge from older to younger generations.
The CTBTO link to the database of the International Seismological Centre (ISC) is a service provided on behalf of and by arrangement with the International Data Centre. The link provides the Provisional Technical Secretariat (PTS) and National Data Centres (NDC) with dedicated access to long term definitive global datasets maintained by the ISC. Functionality includes specially designed graphical interfaces, database queries and non-International Monitoring System (IMS) waveform requests. This service gives access to the ISC bulletins of natural seismicity of the Earth, mining induced events, nuclear and chemical explosions (specifically 1085 nuclear explosions from 1961 to 2017); the ISC-EHB dataset; the IASPEI Reference Event list and the ISC Event Bibliography. The link is a useful tool for accessing historical event data which can be used for training and calibration work commonly carried out by the NDCs and PTS. The searches are tailored to the needs of the monitoring community and allow for area-based searches and comparisons between the ISC bulletin and the Reviewed Event Bulletin. Additionally, known historical nuclear explosions can be searched as a subset of both bulletins. Recently we have developed an expanded station search function for both IMS and non-IMS stations allowing the user to check suspected time periods of inverted polarities at openly available seismic stations.
The Geophysical Survey of the Russian Academy of Sciences and Michigan State University are recovering, scanning, and digitizing the historic analog seismograms of Soviet Peaceful Nuclear Explosions (PNEs). The Soviet Union detonated 122 PNEs from the mid-1960s through the late 1980s. The PNEs were conducted in a wide range of geologic settings and geographic locations, thus representing a unique data set for geophysical studies. These explosions were well recorded by the regional seismic networks, where thousands of seismograms are retained. We are generating high resolution scans of the seismograms and digitizing the waveforms for analysis. Along with the seismograms, we are recovering the original station calibrations, responses, and metadata for each station, and we have developed code to generate Dataless SEED files for use with the digitized data. Most seismograms are from short period instruments, and when combined with the correct station calibration information, the digitization process accurately recovers ground motion signals to at least 5 Hz. The resulting digital waveforms are of high quality and are usable for quantitative research.
The Leo Brady Seismic Network (LBSN) is a small regional seismic network established in 1960 by Sandia National Laboratories to monitor US underground nuclear tests at the Nevada Test Site (now known as the Nevada National Security Site). Until the mid-1980s, data recorded by this network were telemetered as frequency-modulated audio from each seismograph over telephone lines to a central location where the signals were then split: one path going to traditional seismic paper records and the other to analog magnetic tapes. While the paper records have long since been lost to history, the tapes have survived. For the past several years, we have been working to digitize these tapes and recover and calibrate waveforms from them. Herein we present the first major results of our recovery efforts, showing newly recovered seismic waveforms which, until today, have not been seen for more than half a century and have never been available in digital form. We also describe the recovery process and data quality associated with such old seismic records. These “new” old waveforms provide additional data to validate International Monitoring System models and algorithms. SNL is managed and operated by NTESS under DOE NNSA contract DE-NA0003525.
The Institute of Geophysical Research of the Republic of Kazakhstan conducts work on scanning and digitizing archived photographic paper seismograms from the CSE IPE AS USSR. The records are digitized using WaveTrack software, developed by the Novosibirsk Regional Center for Information Technologies of Russia and adjusted to save data in contemporary formats. The main task of the project is to preserve the historical seismograms of nuclear explosions conducted at the Semipalatinsk Test Site and recorded at regional distances, and to convert them into research quality waveforms. The WaveTrack digitization results were compared with records digitized previously with the NXSCAN software developed by IRIS, USA; the comparison shows some advantages of the new software. The digitized historical seismograms are used in a range of tasks related to nuclear test monitoring. The report presents results on the dynamic parameters of the digitized seismic records and on energy and magnitude calculations for underground nuclear explosions (UNEs) conducted in the same tunnels at the Degelen site. For these UNEs, a decrease in seismic effect is observed for a second explosion conducted in a geological medium partially destroyed by the first explosion. Thus, a clear dependence of the seismic effect on the degree of destruction and fracturing of the rocks is observed.
Automatic bulletins are created using data from the International Monitoring System (IMS) seismic, hydroacoustic and infrasound stations. These bulletins are interactively analysed and reviewed by the Monitoring and Data Analysis (MDA) Section at the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). The automatically processed Standard Event Lists (SEL3) are reviewed in detail to modify phases, parameters, and locations; raw data from all stations are also analysed and scanned to build events missed by the automatic processing. This routine analysis results in the Late Event Bulletins (LEB) and the Reviewed Event Bulletins (REB), which have been released on a daily basis since 1999. However, over the last 25 years, the number of operational IMS stations has grown. In addition, there have been changes, updates, and improvements in the technologies, methods, and analysis tools. Here, we present statistics of the automatically processed bulletins as well as the output LEB and REB for the past 25 years. Moreover, we show how the network growth has affected the bulletins in terms of the number of detections. We also illustrate how the new tools could affect both automatically processed and reviewed bulletins. Furthermore, we discuss the challenges of routine interactive analysis along with suggestions on how to overcome them.
During the first half of the 20th century, most earthquake ground motions in Mexico were recorded by the Mexican seismological network, equipped with different Wiechert mechanical seismographs writing on smoked paper. Nowadays, the analogue seismographic collection of Mexico is stored and maintained at the Joint Library of Earth Sciences of UNAM, as part of the Sismoteca-SSN project, to preserve, digitize and reuse these legacy data. Thanks to the availability of this collection, we have found several smoked paper seismograms from the Tacubaya (TAC) central station for the period from 5 to 7 July 1962. On 6 July 1962, a shallow underground nuclear test was conducted in Yucca Flat, Nevada. This event, known as the Storax Sedan nuclear test, had a yield equivalent to about 104 kilotons of TNT. After visual inspection of the seismograms from the astatic horizontal Wiechert seismograph (17 000 kg) at the TAC station, 2560 km away from the nuclear test site, several seismic traces could possibly be related to that nuclear event. In this work we analyse the traces recorded on these smoked paper seismograms in order to determine whether this explosion could have been recorded at this distance by a highly sensitive mechanical seismograph in Mexico.
The ability to estimate the detection level of a seismic network is of tremendous importance, both in the design of a new network and in determining whether a given network can record seismicity consistently or needs to be improved in some way. We determine the detection level of the Cuban seismic network using the empirically estimated seismic noise spectral level at each station site together with some theoretical relationships. The minimum detectable local magnitude thus depends on network parameters such as the signal to noise ratio and the number of stations used in the calculation. We also demonstrate the effectiveness of our predictions by comparing the estimated detection level with one determined empirically from one year of data of the Cuban seismic catalog. Our analysis shows, on the one hand, in which areas the current Cuban network should be improved, also taking into account the regional pattern of faults, and, on the other hand, indicates the magnitude threshold that can be assumed homogeneously for the catalog of Cuban earthquakes in 2020.
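As an illustration of this kind of calculation, the sketch below estimates a minimum detectable local magnitude at a trial epicentre from per-station noise amplitudes, a signal to noise criterion, and a standard local magnitude relation. The station list, noise values and the Hutton and Boore (1987) attenuation terms are illustrative assumptions, not the parameters of the Cuban network.

import numpy as np

# Minimal sketch: minimum detectable local magnitude at a trial epicentre,
# assuming detection requires SNR >= 3 at a minimum number of stations.
# Coordinates and noise amplitudes (Wood-Anderson-equivalent, in mm) are placeholders.
stations = {
    "STA1": (-82.4, 23.1, 2e-4),
    "STA2": (-76.3, 20.0, 5e-4),
    "STA3": (-79.6, 21.9, 1e-4),
}
snr, n_required = 3.0, 3

def local_magnitude(amp_mm, dist_km):
    # Hutton & Boore (1987) local magnitude relation
    return np.log10(amp_mm) + 1.11 * np.log10(dist_km) + 0.00189 * dist_km - 2.09

def detection_threshold(lon, lat):
    per_station = []
    for slon, slat, noise_mm in stations.values():
        deg = np.hypot(lat - slat, (lon - slon) * np.cos(np.radians(lat)))
        dist_km = max(111.2 * deg, 1.0)
        per_station.append(local_magnitude(snr * noise_mm, dist_km))
    return sorted(per_station)[n_required - 1]   # n-th lowest station threshold

print(round(detection_threshold(-80.0, 22.0), 2))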
Within the framework of implementation of the Comprehensive Nuclear-Test-Ban Treaty, 337 facilities across the world form the International Monitoring System. Madagascar hosts two of these stations: I33MG, an infrasound station, and AS61, an auxiliary seismic station. These two stations need to operate continuously to detect any nuclear test. To minimize downtime during an eventual outage, the National Data Centre Madagascar, through its local operator team, developed an alert system using a Raspberry Pi and a GSM module. In case of an outage lasting up to 30 minutes, the system sends an SMS over the local mobile network to the team, showing the last received frame, and the team starts to resolve the issue.
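A minimal sketch of such an alert loop is given below. It assumes incoming data frames are written to a spool directory and a GSM modem is attached over a serial port and accepts standard AT commands; the directory, serial port, phone number and thresholds are placeholders, not the actual Madagascar implementation.

# Minimal sketch of an outage alert on a Raspberry Pi: if no new data frame has
# arrived for 30 minutes, send an SMS through a serial GSM modem using plain AT
# commands. A real implementation would also rate-limit repeated alerts.
import glob, os, time
import serial   # pyserial

SPOOL_DIR, PHONE, PORT = "/data/frames", "+261XXXXXXXXX", "/dev/ttyUSB0"   # placeholders
OUTAGE_SECONDS = 30 * 60

def last_frame_age():
    frames = glob.glob(os.path.join(SPOOL_DIR, "*"))
    if not frames:
        return float("inf")
    return time.time() - max(os.path.getmtime(f) for f in frames)

def send_sms(text):
    with serial.Serial(PORT, 115200, timeout=5) as modem:
        modem.write(b"AT+CMGF=1\r")                      # text mode
        time.sleep(1)
        modem.write(f'AT+CMGS="{PHONE}"\r'.encode())
        time.sleep(1)
        modem.write(text.encode() + b"\x1a")             # Ctrl-Z terminates the message

while True:
    age = last_frame_age()
    if age > OUTAGE_SECONDS:
        send_sms(f"Station outage: last frame received {int(age // 60)} min ago")
    time.sleep(600)                                       # check every 10 minutes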
The Geophysical Institute in Ecuador uses SeisComP to acquire waveforms from around 150 seismometer and accelerometer stations. SeisComP has several modules (SCQC, SCQCV, etc.) that allow the operator to check the quality of the acquired waveforms; nevertheless, the modules' interfaces can be difficult to use and not very interactive. For easy access to waveform quality control (QC) information, we developed SCZABBIX, an open source Python module that sends the QC information created by SCQC to a Zabbix server, so that we can leverage the whole set of tools provided by the open source Zabbix software, such as interactive graphs, thresholds, triggers and alerts. This allows operators to detect transmission problems and to estimate parameters such as data availability, ping response time and latency in a quick and flexible way, for historical or real time data. Additionally, we developed SOH_ZABBIX, a set of Python modules that capture state of health information from dataloggers such as Quanterra and Reftek and send it to Zabbix, so that we can test its new triggers for automatic anomaly detection and determine whether they could be used to anticipate problems related to battery, temperature, disk usage, etc.
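As an illustration of how QC values can be pushed into Zabbix, the sketch below shells out to the standard zabbix_sender utility; the host name, item key and server address are placeholder examples, not the actual SCZABBIX code.

# Minimal sketch: push a waveform-QC value (e.g. latency in seconds) to a Zabbix
# server by calling the standard zabbix_sender utility.
import subprocess

def send_to_zabbix(host, key, value, server="zabbix.example.org"):
    cmd = ["zabbix_sender", "-z", server, "-s", host, "-k", key, "-o", str(value)]
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"zabbix_sender failed: {result.stderr.strip()}")

# e.g. one latency sample for station ABCD, channel HHZ
send_to_zabbix("ABCD", "scqc.latency[HHZ]", 4.2)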
Many of the primary and auxiliary seismic stations of the International Monitoring System (IMS) have been in operation for more than 20 years. Tracking changes in station performance over this duration can prove challenging, requiring continuously reported data across the time period. Using phase picks reported to the International Seismological Centre (ISC), it is possible to evaluate historical changes in individual station performance. Metrics used to track changes at the stations include: average monthly travel time residuals, evaluation of travel time residuals by distance and hypocentre location, and comparison of reported polarities with available earthquake mechanism solutions. These metrics can indicate when a station polarity was inverted or highlight an improvement in travel time residuals occurring over a short time period. We will show how best to access this information using the CTBTO link to the ISC database and highlight historical examples of changing station performance. Additionally, where station response files are publicly available for IMS stations, we will demonstrate independent response file checks currently being developed by the ISC.
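The sketch below shows one simple way to track such a metric: averaging travel time residuals by month for a single station and flagging abrupt changes in the mean. The input file layout (columns 'time' and 'residual') and the 1 s jump threshold are assumptions for illustration, not the ISC schema.

# Minimal sketch: track average monthly travel-time residuals for one station
# from a table of phase picks and flag months where the mean shifts abruptly.
import pandas as pd

picks = pd.read_csv("station_picks.csv", parse_dates=["time"])   # assumed input layout
monthly = picks.set_index("time")["residual"].resample("MS").mean()

# flag month-to-month jumps larger than 1 s as candidate station changes
jumps = monthly.diff().abs()
for month, jump in jumps[jumps > 1.0].items():
    print(f"{month:%Y-%m}: mean residual changed by {jump:.2f} s")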
An analysis of the quality of accelerograph data from 669 station locations has been carried out. The aim is to monitor and evaluate the data quality of accelerograph equipment at the Meteorology, Climatology, and Geophysics Agency (BMKG). Advances in machine learning can make it possible to diagnose or even predict equipment problems so that site visits can be scheduled more cost-efficiently. The analysis was carried out with machine learning applied to accelerograph data in Indonesia. The study compares two machine learning models for accelerograph data analysis: a multilayer perceptron (MLP) and k-nearest neighbours (KNN). The models produce two types of classification output: a four-class output describing general data quality (good, fair, bad, and off), and a seven-class output representing specific equipment problems, such as noise above the accelerometer noise model, sensor damage, etc. The comparison of the two models shows that the MLP has a balanced accuracy of 85.5%, while KNN has a higher balanced accuracy of 91.2%. This study therefore yields a machine learning model to identify problems with accelerograph equipment and to evaluate network data quality for observing strong earthquakes at BMKG.
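The comparison described above can be reproduced in outline with scikit-learn; the sketch below trains an MLP and a KNN classifier and reports their balanced accuracy on a synthetic four-class dataset that stands in for the BMKG feature set.

# Minimal sketch of the model comparison: MLP vs KNN, scored by balanced accuracy.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import balanced_accuracy_score

# synthetic stand-in for extracted accelerograph quality features
X, y = make_classification(n_samples=2000, n_features=12, n_informative=8,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

for name, model in [("MLP", MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)),
                    ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    model.fit(X_train, y_train)
    score = balanced_accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: balanced accuracy = {score:.3f}")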
The International Data Centre seismic, hydroacoustic and infrasound (SHI) reengineering project began in 2014 with the goals of creating modernized, open-source software for SHI processing and improving the maintainability and extensibility of the system. The project has been under active development since 2019, and the current main area of work is the elaboration of a modernized station state of health (SOH) system, which monitors in real time the performance and status of the International Monitoring System. The future system is based on the Geophysical Monitoring System being developed for the US National Data Centre and shared as a voluntary contribution. It is being extended to address Provisional Technical Secretariat-specific monitoring requirements. This poster introduces the general architecture and highlights recent progress on the implementation of the new SOH monitoring features.
Simulating and analysing the detection capabilities of the seismic, hydroacoustic and infrasound networks of the International Monitoring System (IMS) can contribute to the enhancement of the Treaty’s verification regime and its capabilities. Simulations carried out with the Java-based tool Network Monitoring for Optimal Detection (NetMOD) allow one to evaluate the overall network performance of the IMS and assist the International Data Centre (IDC) in continuously monitoring, assessing, and reporting on the operational status. This quantifies the variations in detection capability over time and the impact of several parameters on network coverage (e.g. seasonal variation, weather conditions, station noise levels, outages of key stations, periods of large events). On the one hand, the results can help to evaluate the resilience of the network in homogeneously monitoring the globe and assist in analysing events of interest. On the other hand, the results can provide input to IMS maintenance and recapitalization by assessing the importance of a station installation or station failure for the overall performance of the network. NetMOD is developed by Sandia National Laboratories and is provided to the Provisional Technical Secretariat as a voluntary contribution in kind of the United States.
The International Data Centre (IDC) receives more than ten gigabytes of raw data from stations on a daily basis. Most of those data are collected and transmitted as digitally signed continuous data frames. In normal operations the frames are received in chronological order of their acquisition/detection time. However, some frames may be missed, corrupted, partially delivered or sent several times. It is vital to Treaty verification that the data be received completely and in a timely manner. In this paper we propose an artificial intelligence, temporal data mining and modelling approach for monitoring, exploring and analysing continuous data. The proposed temporal continuous data model is designed using a well defined knowledge based system and inference engine. With the proposed approach, by mining the temporal data contained in the concise list of frames, we can easily calculate, on an hourly or finer basis, the timeliness, data availability and completeness of the received data. In addition, it will enable us to discover hidden patterns, evaluate receiving and forwarding processes, assess station operation and generate distribution performance indicator metrics. The generated results and metrics can be fed to big data tools (e.g. Elasticsearch) and presented on open source dashboards.
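For illustration, the sketch below computes two of the metrics mentioned, hourly data availability and timeliness, from a list of received frames; the frame fields and the 10-minute timeliness target are assumptions, and overlapping or duplicate frames are ignored for simplicity.

# Minimal sketch: hourly data availability and timeliness from a list of frames.
from datetime import datetime, timedelta

frames = [  # (data start time, data duration, reception time) -- placeholder values
    (datetime(2023, 6, 1, 10, 0), timedelta(seconds=600), datetime(2023, 6, 1, 10, 2)),
    (datetime(2023, 6, 1, 10, 10), timedelta(seconds=600), datetime(2023, 6, 1, 10, 25)),
]

hour = timedelta(hours=1)
covered = sum((d for _, d, _ in frames), timedelta())
availability = covered / hour * 100                      # percent of the hour covered by data
timely = sum(1 for start, dur, recv in frames if recv - (start + dur) <= timedelta(minutes=10))
timeliness = timely / len(frames) * 100                  # percent of frames received within target

print(f"availability: {availability:.1f}% of the hour, timeliness: {timeliness:.1f}% of frames")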
The International Monitoring System (IMS) is a globally distributed network of monitoring facilities using sensors from seismic, hydroacoustic, infrasound, and radionuclide technologies. The need for data availability has increased the significance of data quality monitoring. This task is addressed through the creation of a platform for the fast and easy identification of data acquisition problems, as well as through regular activities to ensure that station characteristics meet the requirements of the IMS draft Operational Manuals. The Provisional Technical Secretariat (PTS) has developed the Multi-Technology Integration Portal (MuTIP), an internal platform with a variety of monitoring tools to facilitate data quality monitoring. MuTIP facilitates PTS user access to information, allowing technical review of data quality and analysis through the aggregation and representation of technical results from a variety of internal PTS data sets and databases. MuTIP provides information such as station summaries (PSD/PDF, residuals, and calibration). MuTIP is also interfaced with Grafana (SOH), PRTools, JIRA instances (IRS, ITS, SHD), CAMT, DOTS, and International Data Centre databases. MuTIP provides good support for data quality monitoring in terms of troubleshooting (reliability, efficiency, accuracy, and timeliness) to maintain good station/IMS network performance in daily operational data monitoring, ensuring that stations operate properly and, in particular, that they meet data availability, surety, and quality requirements. This work is carried out in close teamwork with IDC Operations colleagues.
Scheduled calibration activities at International Monitoring System (IMS) seismic, hydroacoustic T phase and infrasound stations are part of the quality control requirements defined in the draft IMS seismic, hydroacoustic and infrasound Operational Manuals. Depending on the technology used, the approach for scheduled calibration can be different in some aspects. However, the overall objective remains the same: controlling the quality of IMS measurement systems by comparing their measured sensitivity and response against their nominal values defined at the time of station certification or revalidation. Results from calibration activities are reported by the CTBTO to States Signatories on a yearly basis. This presentation shows processes in place to implement calibration activities, current and historical calibration results, and associated progress and challenges.
The International Monitoring System (IMS) is one of the four elements of the Provisional Technical Secretariat (PTS) verification regime. The seismoacoustic component of the IMS network comprises the seismic, infrasound and hydroacoustic stations that monitor underground, atmospheric, and underwater explosions. The minimum requirement in terms of data availability for each station is 98%: each station should be able to send data to the International Data Centre over 98% of the time, regardless of the challenging environment in which it is located. When an issue occurs at a station, it is discussed through the IMS Report System (IRS). Failure analysis based on information in the IRS has been performed since November 2011 with the objectives of triggering the required engineering activities, initiating further analysis when needed (root cause analysis), using trends to anticipate future failures, and verifying that the implemented engineering solutions led to improvements in reliability. This poster presents historical results on data availability metrics, a categorization of failures leading to station downtime, and engineering projects aiming to enhance station robustness and reduce downtime.
Revalidation of International Monitoring System (IMS) stations is an assurance process to demonstrate compliance with IMS minimum requirements for stations that underwent major upgrades or equipment changes causing a modification of the station response. The Configuration Management Board (CMB) decides on the need for station revalidation after the case is submitted to the CMB by the IMS Engineering and Development Section of the CTBTO. Upon completion of the modernization works at the station, the revalidation report is submitted by IMS/ED for review by the Certification Group. The report content is standardized, presenting the updated technical specification table and a description of the testing process. The report relies on compliance demonstration in the following areas: noise study analysis using a newly developed calculator for different equipment settings; station calibration; testing of the status bit settings; compliance with CD1.1 requirements for data transmission and data surety aspects; and station documentation update and DOTS population. This presentation demonstrates examples from revalidation reports, showing the key compliance with IMS requirements of the ever changing technical solutions applied to station upgrades in recent years.
Broadband access and low latency are now available globally thanks to the increasing availability of low earth orbit satellites. The International Monitoring System stations are installed in a variety of geographical locations that could benefit from this type of service. Maximizing data availability, simplifying the complex infrastructure normally associated with the deployment of VSATs, and providing innovative services and applications through broadband availability would enhance not only the transmission of data but also the experience available to station operators in remote regions. Currently, GCI III is moving through its mid-life cycle and technology refreshment; GCI IV planning will begin in the medium term, with initial deployments within five years.
Currently there are three constellations of satellites in different degrees of deployment with some other providers in the planning stage. This paper presents the technology and explores its potential application in selected remote locations to provide an overview of potential improvements in the transmission of data. It also provides the possibility of novel applications for station operators that are typically subject to bandwidth limitations and latency issues.
Accurate seismic station orientation is important for studies based on seismic waveforms or particle motion. Station misorientation has been studied using P wave particle motion and Rayleigh wave arrival angles, which may be affected by the complexity of earthquake sources. Recently, the tangential component of the P wave receiver function (PRF) has been proposed for misorientation estimation. In this study, two new methods, using amplitude and energy based objective functions of both the radial and tangential components of the PRF, are introduced to estimate station misorientation. The two methods are tested using 21 stations in West Africa and adjacent islands, taking into consideration the effect of different geologic terrains, i.e. islands, coastlines, and inland areas. The differences between the two methods are investigated, considering the effects of Gaussian factors, back-azimuthal coverage, number of events, and structural heterogeneities. Both methods produce similar results, which are consistent with a previous study. However, comparison between radial and tangential components indicates that results based on radial components are more stable than those based on tangential components, which may be more sensitive to heterogeneities and background noise. Our study suggests that both methods produce more consistent results for inland stations compared to island/coastal stations, and that objective functions for radial components are more stable.
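The core of an energy based estimate can be sketched as follows: scan trial correction angles, rotate the horizontal components, and keep the angle that minimizes tangential energy in the P window. Real applications work on receiver functions stacked over many events; here a single synthetic P pulse stands in, and the rotation convention and the sign of the recovered angle are illustrative choices, not the paper's implementation.

# Minimal sketch of an energy-based misorientation estimate.
import numpy as np

def rotate_ne_to_rt(n, e, baz_deg):
    """Rotate north/east traces to radial/tangential for a given back azimuth."""
    phi = np.radians(baz_deg)
    r = -n * np.cos(phi) - e * np.sin(phi)
    t =  n * np.sin(phi) - e * np.cos(phi)
    return r, t

def estimate_misorientation(n, e, baz_deg, trial_angles=np.arange(-30, 30.5, 0.5)):
    energies = []
    for da in trial_angles:
        _, t = rotate_ne_to_rt(n, e, baz_deg + da)
        energies.append(np.sum(t ** 2))
    return trial_angles[int(np.argmin(energies))]

# synthetic example: a purely radial P pulse recorded with a 7 degree azimuth offset
true_baz, offset = 60.0, 7.0
pulse = np.exp(-np.linspace(-3, 3, 200) ** 2)
n = pulse * np.cos(np.radians(true_baz + offset))
e = pulse * np.sin(np.radians(true_baz + offset))
print(estimate_misorientation(n, e, true_baz))   # prints 7.0, the correction that zeroes tangential energy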
The Seismic Network Expansion in the Caucasus and Central Asia (SNECCA) project began in 2019 to upgrade national seismic network capabilities in Armenia, Azerbaijan, Georgia, Kazakhstan, Kyrgyzstan, and Tajikistan. The United States participated in this collaboration with support from the U.S. Department of Energy and Lawrence Livermore National Laboratory. All six countries utilized a common design concept for broadband station installations. The design consisted of steel- or plastic-cased postholes (shallow boreholes) with above grade electronics to achieve a robust, secure station with a well installed sensor. The station design was based on the successful deployment and operation of several hundred stations in Alaska and western Canada as part of the U.S. National Science Foundation’s Transportable Array project. Station construction was adapted to the drilling capabilities and infrastructure available in each country. The design objective was to create low noise, robust, uniform stations that will operate reliably and securely, using a repeatable and consistent station construction and installation process. In this presentation we will cover the design concept, realizations of this concept achieved in each country, and preliminary data. The SNECCA project is implemented through the Seismic Targeted Initiative of the International Science and Technology Center and the Science and Technology Center in Ukraine.
For most inspection areas, there would be no electricity supply. The inspection team should be self-sufficient, especially with respect to electricity supply, since most mission-critical and health-and-safety equipment relies almost solely on electric power. This work proposes a portable flexible solar charging pack for on-site inspection activities. The flexible solar cell is a new type of energy material that is physically flexible, and its power capacity is sufficient to provide emergency charging for inspectors' portable equipment during field operations. Via cascade connection, several solar charging units can jointly supply more power to other power-consuming equipment. The folded size of one solar charging unit is 29 cm x 16.5 cm, with an expanded size of 120 cm x 29 cm; its power and weight are 37 W and 0.48 kg respectively. As a result, a single solar charging pack consisting of three solar charging units has more than 100 W of power capacity, with USB or other customized output connectors. The portable flexible solar charging pack has been tested in hot, humid and extremely cold environments and is also waterproof and highly reliable. The working team is looking forward to a practical trial at the upcoming 2025 Integrated Field Exercise.
In close and constant collaboration with the CTBTO, Enviroearth has developed a backup intra-site communication solution working over a cellular network. This backup system was fully designed according to CTBTO specifications. Its low power consumption, communication protocol and easy on-site implementation make it well adapted to the infrastructure of infrasound stations and to integration with the current main communication systems (radio, fibre optic, copper wire/Ethernet, VSAT). This cellular solution provides permanent redundancy and ensures continuous communication between the CRF and remote elements in case of a failure or outage of the main communication system, thanks to its automatic failover mode. The cellular backup system runs over pre-established cellular networks (e.g. 2G, 3G and 4G) and through a server infrastructure with VPN tunnel management.
One challenge of on-site inspection (OSI) is preparing for deployment to unknown locations with unknown resources. A critical element of a functional base of operations (BOO) is stable and reliable electricity. The current components of the OSI power system include diesel generators, UPS units and a hybrid power system that can accept power from other generation sources, such as solar. The UPS systems allow for quiet time without the diesel generators, reducing fuel usage and CO2 production. The system is designed to be resilient and has management software to switch power sources depending on the power consumption at any point in time. The BOO systems are also supplemented by field-deployable power banks and solar panels for powering technical equipment such as the SAMS mini arrays and noble gas samplers.
Brazil participates in the CTBTO in several ways. The National Center for Monitoring and Alerts of Natural Disasters (CEMADEN) became responsible for the implementation of the Brazilian National Data Centre (NDC). The country also has a radionuclide laboratory, RL4, which is in the certification process. In addition to operating an infrasound station, IS9, and a primary seismic station, both in the city of Brasília, we have two auxiliary seismic stations, AS10 and AS11. We also have one radionuclide station in planning, RN12, and another in operation and certified, RN11. RN11 is operated by the Institute of Radioprotection and Dosimetry, and the experience gained over the years is presented, emphasizing good practices, maintenance, training and lessons learned, facing the health crisis (COVID-19), management, and obtaining and using equipment and spares. The legal limitations and human resources constraints will also be presented, to share the experience acquired in the management of RN11.
All CTBTO stations in Indonesia are powered by solar panels, although some are supported by commercial power lines. All these systems require a battery bank. From 2008 to 2021 there were 10 cases caused by battery problems that held up data transmission from six CTBTO stations in Indonesia to the National Data Centre and the International Data Centre. In this presentation we will discuss how to choose the right batteries, how to manage them, when it is time to replace them, and how to maintain them to support sustainable station operation and reduce power problems.
The emerging technologies of Li-Ion batteries offer new design possibilities for remote monitoring station power supplies. Based on recent installations and testing of these batteries and complementary equipment, the Engineering and Development Section of the International Monitoring System Division presents the results of both field and desktop studies, describing the advantages and disadvantages of using LiFePO4 batteries, as well as general design considerations for use of such technologies within the International Monitoring System network.
NORSAR is the Norwegian National Data Centre (NDC) and operates six stations of the International Monitoring System: the primary seismic arrays NOA/PS27 and ARCES/PS28, the auxiliary seismic array SPITS/AS72, the auxiliary single seismic station JMIC/AS73, the infrasound array IS37 and the radionuclide station RN49. It is crucial for the stations to have a stable power supply with sufficient battery bank capacity, and the auxiliary station AS72/SPITS was refurbished to secure stable operation for the future. This presentation will give an overview of NORSAR's choice of system and of the work of replacing the previous power supply system. The old system consisted of a lead acid battery bank, two Stirling generators, solar panels and a wind turbine. The new system is a complete UPS system from Eltek (a Delta group company) with lithium-ion batteries, new solar panels and a Cummins diesel generator, in addition to the wind turbine.
The correct operation of a seismic station depends on many factors. A crucial one is the provision of the power required for the station to operate at the established standards. The power system currently in use at the Paso Flores (PLCA/PS01) seismic station is at the end of its life cycle and was not designed for the current demands. After evaluating all the technologically available alternatives for replacement, we present a design based on solar panels as the main source, a battery bank to store energy, and a thermoelectric generator without moving parts as a supplementary energy source for the winter months when solar radiation is low.
Innovative deep ocean monitoring technologies are crucial to catalyzing fundamental improvements in mitigating natural disasters, reducing human vulnerabilities, and determining environmental threats. An attractive but untapped resource is the global submarine fibre optic cable network, which carries over 95% of international internet traffic. Key components of undersea fibre optic cable systems are repeaters, which are placed every 60-100 km along the cable to provide optical signal amplification, and provide a unique opportunity to deploy sensors globally. Installing Sensor Monitoring And Reliable Telecommunications Cables (SMART) repeater based sensor systems that include seismic, pressure, and temperature sensors will revolutionize our ability to monitor earthquakes, nuclear tests, tsunami, global climate change, and the security of major telecommunications infrastructure. While the SMART concept has been discussed and evaluated for over 10 years, developing a SMART repeater requires substantial research and development investment to validate the technology. Subsea Data Systems has developed a fully operational prototype SMART repeater sensor system. Our system includes best in class sensors validated by the scientific and monitoring communities, coupled with the data formatting and transmission frameworks already accepted by these communities and in extensive use worldwide. Here we present our ongoing efforts to make fully validated SMART cable systems a reality.
The security system designed for the Internet of Things (IoT) should be able to detect and prevent both internal and external attacks. This work proposes a lightweight, low energy encryption algorithm to secure data over wireless networks. The proposed algorithm meets the data security requirements and is suitable for wireless devices and sensors. It reduces the execution time and power consumption of the encryption process compared with the state of the art standard algorithm while maintaining the desired security (confidentiality) level. A comparison between the proposed algorithm, the Advanced Encryption Standard (AES) and other lightweight algorithms proposed in other research works was conducted, and the results are promising. Using the proposed algorithm, a significant reduction in time and energy consumption is achieved, approximately a 35% improvement over the standard AES algorithm, accompanied by a good level of complexity in the encryption process, making it more suitable for the wireless environment. The aim of this study is to reduce the power consumption of implementing encryption algorithms on the sensors used in the International Monitoring System, thereby extending the working capacity of the sensors' batteries.
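The proposed algorithm itself is not reproduced here; as an illustration of the kind of timing comparison described, the sketch below benchmarks AES-CTR against the lightweight stream cipher ChaCha20 on a 1 MB payload using the 'cryptography' package. The payload size and cipher choices are assumptions for illustration only, not the authors' method.

# Illustrative timing comparison only: AES-256-CTR vs ChaCha20 on a 1 MB payload.
import os, time
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

payload = os.urandom(1_000_000)

def encrypt_time(cipher):
    enc = cipher.encryptor()
    t0 = time.perf_counter()
    enc.update(payload)
    enc.finalize()
    return time.perf_counter() - t0

aes = Cipher(algorithms.AES(os.urandom(32)), modes.CTR(os.urandom(16)))
chacha = Cipher(algorithms.ChaCha20(os.urandom(32), os.urandom(16)), mode=None)

print(f"AES-256-CTR: {encrypt_time(aes) * 1e3:.2f} ms")
print(f"ChaCha20:    {encrypt_time(chacha) * 1e3:.2f} ms")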
"TianQi" LEO micro-satellite Internet of Things (IoT) constellation, is a commercially running space based IoT system, which is composed of 38 LEO micro-satellites, and will be fully deployed by the end of 2023. Ground terminals are portable and cost effective, with power consumption as low as 100mw. The constellation could provide global users such as CTBTO with "air, ground and sea integration" satellite IoT data communication service. Each micro-satellite weight is about 20-50 kg with a power consumption of 50-100W. The micro-satellites would be launched into circular orbit of 500-900 km altitude, which would bring advantages such as short transmission delay, low path loss and low power consumption, low cost and miniaturization of user terminals. "TianQi" LEO micro-satellite IoT constellation would establish a global coverage 0.2kbps to 6kbps narrow bandwidth communication network, capable enough to provide varieties of solutions to CTBT application scenarios, such as operation and maintenance of remote International Monitoring System facilities, emergency communication and health and safety of on-site inspection team and inspectors, real time tracking and CoC solution to on-site inspection samples, etc.
In a constantly evolving digital world, there is a need for a flexible way to secure the CTBTO data and products sent over the Internet. High levels of flexibility can be achieved by implementing programmable, intelligent VPN systems that can be deployed without the need for dedicated physical hardware. This system of secure communications can be deployed as a software implementation, eliminating the need for physical network infrastructure, either on premises or in a platform as a service environment; the only requirement is Internet access. VPN hubs offer a highly flexible design model: hardware, software or a mix of both, and the entire VPN system can be recovered in a short time with a high level of automation. A VPN system requires minimal human interaction to deploy, thus reducing the probability of errors. The entire system can be backed up daily on a USB stick and redeployed quickly in case of emergency. In essence, this is a solution to securely send data and products to National Data Centres without the need for physical networking equipment.
As a solemn legal proof of either Treaty compliance or violation, data security of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) verification infrastructure plays a vital role for the authority of the CTBT regime. For the most important cases within the CTBT verification infrastructure, such as International Monitoring System (IMS) data gathered at internationally distributed IMS stations and transmitted through the GCI to the International Data Centre (IDC) in Vienna, and on-site inspection (OSI) voice communication and data transmission among inspectors, the base of operations (BOO) and the CTBTO, there is a need for a standing arrangement to prevent data from being eavesdropped on or intercepted. This work presents a full-range solution for IMS and OSI with secured data transmission under various network communication scenarios. A link encryptor provides protection for digital data transmission on trunk lines such as microwave, satellite and troposcatter links through the encryption of all data, including the protocol itself. An IP encryptor is designed to protect IP communications over TCP/IP networks by adopting encryption, identity authentication and data integrity verification. A radio encryptor is designed to work with HF/VHF/UHF radios for secured voice and data communication.
Sandia National Laboratories is developing the Geophysical Monitoring System (GMS) to modernize the United States National Data Center (USNDC) waveform processing system. The United States is providing the common architecture and processing components of GMS as a contribution in kind to accelerate progress on International Data Centre (IDC) re-engineering. A key aspect of GMS architecture is the Common Object Interface (COI) specification, which includes a data model and an access application programming interface (API). The data model is a collection of data structure definitions describing all the data stored by GMS and the access API specifies the available query and storage operations. The COI specification is independent of any storage solution. This decoupling facilitates changes to the storage solution with minimal impact to the GMS application software; likewise, the data model and GMS applications can be updated with minimal changes to the storage solution or schemas. One GMS COI implementation, the data bridge, provides clients access to legacy system data through web services. This poster describes GMS COI and data bridge architecture.
The Flexpart Atmospheric Transport Model is used in daily operations at CTBTO for the backtracking of radionuclides based on measurements at 80 sites worldwide. Over 280 simulations are performed each day and it is desirable to make each simulation run in a shorter period of time. With increased availability of GPU environments, it is natural to consider their potential in speeding up the Flexpart simulations. Work is currently underway to transform a CTBTO customized Flexpart v10 into a code that can efficiently use GPUs. To date much effort has been expended in getting this code (and its dependencies) to compile and execute in an NVIDIA Fortran (nvfortran) environment, leading to discovery of bugs in the nvfortran compiler, resolved by NVIDIA as a result of our reports. While awaiting resolution from NVIDIA, we have performed detailed profiling on smaller problem sizes in order to identify the “hot spots” of the code where GPU optimization might provide the greatest benefits. Work is now underway to isolate the “hot spot” regions of code into test modules suitable for extensive iterative experimentation, development and testing. This work is ongoing, and this presentation will describe a detailed overview of the project and its current status.
Smartphone accelerometers can be used to record earthquake signals to support disaster mitigation in Indonesia. Human activities, however, add significant noise to smartphone accelerometer data. A human activity recognizer (HAR) serves to separate human activity signals from earthquake signals recorded by smartphone accelerometers. This study aims to reduce the linear acceleration signal of human activity in Android smartphone accelerometer data. The HAR application, acting as a digital filter for the smartphone accelerometer, involves several design stages, namely data collection, feature extraction, classification of activity data and determination of a Butterworth filter design based on the activity data. The activities considered are sitting, standing, lying down, walking and running. The Butterworth digital filter is designed based on the dominant frequency of the human activity acceleration signal. This filter is installed in the server program and then tested against activity signals recorded in the central BMKG earthquake simulator. The test results show an increase in the percentage of earthquake signals from 3.87% to 64.61% and a decrease in the percentage of human activity signals from 96.13% to 35.39%.
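The filtering step can be sketched with SciPy as follows; the sampling rate, filter order, cutoff and filter type (a low-pass here) are placeholder assumptions and not the values selected in the study.

# Minimal sketch: design a Butterworth filter relative to an assumed activity band
# and apply it to an accelerometer trace with zero-phase filtering.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 50.0            # assumed smartphone accelerometer sampling rate (Hz)
cutoff = 10.0        # assumed cutoff (Hz), chosen relative to the dominant activity frequency
b, a = butter(4, cutoff / (fs / 2), btype="low")

t = np.arange(0, 10, 1 / fs)
accel = np.sin(2 * np.pi * 2 * t) + 0.5 * np.sin(2 * np.pi * 18 * t)   # low- and high-frequency test components
filtered = filtfilt(b, a, accel)                                        # zero-phase filtering
print(filtered[:5])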
The International Data Centre (IDC) uses critical and complex systems with several distributed processes for receiving, processing, disseminating and archiving data and products. A dedicated tool called WorkFlow is used, among other tools, to monitor those systems. The tool is only available on Linux machines, uses intensive database resources, is hard to maintain and requires expertise to configure. We are prototyping a modern, embedded, web based, configurable, user friendly and portable monitoring service to help IDC staff and the operations centre monitor the different systems effectively and efficiently. The states of the different processes are collected and cached independently of the IDC databases and can be shared among clients without exhausting the IDC databases. The front end is based on the latest Hypertext Markup Language (HTML5) specifications and JavaScript, while the back end is implemented in Python using the Tornado web framework. Several different views, flow types, roles, and permissions can be implemented. Additionally, the service can be configured to monitor the states of any temporal data. The different capabilities of the service will be presented, the advantages and limitations of the new service outlined, and future directions and recommendations discussed.
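In the spirit of that prototype, the sketch below exposes an in-memory cache of process states as a JSON endpoint with Tornado; the process names, states and port are placeholders, not the actual IDC service.

# Minimal sketch of a web based monitoring endpoint in Tornado.
import json
import tornado.ioloop
import tornado.web

# in-memory cache of process states, updated elsewhere (e.g. by a collector task)
PROCESS_STATES = {"waveform_acquisition": "OK", "sel3_pipeline": "DEGRADED"}

class StatusHandler(tornado.web.RequestHandler):
    def get(self):
        self.set_header("Content-Type", "application/json")
        self.write(json.dumps(PROCESS_STATES))

def make_app():
    return tornado.web.Application([(r"/status", StatusHandler)])

if __name__ == "__main__":
    make_app().listen(8888)
    tornado.ioloop.IOLoop.current().start()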
The Meteorology, Climatology and Geophysics Agency (BMKG) is a government agency tasked with providing data and information services for meteorology, climatology and geophysics and with delivering information to the public and related agencies. BMKG relies on a VSAT network that operates 24/7. The BMKG VSAT communication network connects all systems within the integrated BMKG environment and supports real time data exchange. The public increasingly demands that BMKG information be delivered more quickly, precisely and accurately, be easily understood, and reach all corners of the country. A failure of the VSAT network in the BMKG environment is critical because it can paralyse communication both internally and externally at BMKG. Therefore, the development of the BMKG network must be carried out in an integrated manner accompanied by backup (disaster recovery) so that the dissemination of weather information and disaster events to the community is not disrupted. Data collection was carried out through literature studies, interviews with users and administrators of BMKG assets, observation of BMKG assets, and study of organizational documents. The result of this study is a set of DRP documents adjusted to BMKG duties and functions.
Traditional sensing techniques are recognized to be labour and technically intensive for practical adoption. As a result, researchers are employing alternative sensing techniques. One of the major technologies for sensing the environment relies on variations in WiFi channel state information (CSI), which describes the channel state between transmitter and receiver. Owing to its ubiquity and sensitivity to environmental dynamics, the WiFi signal is now widely used for a variety of sensing applications in indoor environments, such as gesture recognition and fall detection, in addition to its primary use for communication. WiFi based sensing offers non-contact sensing in a private mode, simultaneous sensing and data transmission without additional connection infrastructure, and remote sensing without handheld sensors. In this paper, we propose a new approach based on data collected from WiFi CSI and analysed with a pretrained machine learning model to detect early signs of an earthquake. These are caused by the first energy emitted by an earthquake, the P waves, which rarely cause damage but do produce environmental changes: ceiling lights sway and static objects shake slightly, movements that affect electromagnetic waves and can be sensed with wireless signals through CSI.
Capacity building efforts for National Data Centers (NDCs) commonly involve the provisioning and shipment of physical hardware systems and the training, installation, maintenance, and distribution of the NDC-in-a-Box (NIAB) software suite. “NDC-in-the-Cloud” (NIAC) is a set of cloud-hosted resources for National Data Centers that uses cloud services to streamline these efforts, while also offering new flexibility and reliability to users. It provides NDCs with cloud based versions of the familiar NDC-in-a-Box software and data, as well as a blueprint to create secure cloud based resources to use these data and software in their own cloud accounts. With this approach, we seek to increase access and processing capability for NDCs, and decrease any logistical burden associated with shipping physical hardware. The NIAC prototype employs Amazon Machine Images for software distribution, elastic compute cloud virtual hardware in the Amazon Web Services commercial cloud for computing, virtualized desktops accessible via web browser, and an “Infrastructure-as-Code” approach to deploy cloud resources. Internal and external evaluations of NIAC instances note good desktop responsiveness and adequate computing and storage capacity, and we seek to refine the prototype through further collaborations.
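As a small illustration of the cloud deployment pattern described, the sketch below launches one virtual machine from a hypothetical NDC-in-a-Box Amazon Machine Image with boto3; the AMI ID, instance type, region and tag are placeholders, not the actual NIAC blueprint.

# Minimal sketch, not the actual NIAC deployment code: launch one instance from an AMI.
import boto3

ec2 = boto3.client("ec2", region_name="eu-central-1")        # placeholder region
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",                          # placeholder NIAB image ID
    InstanceType="t3.large",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "ndc-in-the-cloud-demo"}],
    }],
)
print(response["Instances"][0]["InstanceId"])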
E-poster session with display of each e-poster on an assigned touchscreen
CTBTO is hiring professionals in STEM fields – Learn about CTBTO’s Recruitment Process
Come and find out more about how to join the CTBTO! Visit us at the HR booth and speak with the CTBTO Human Resources team and learn more about CTBTO careers. You can also join us in our HR presentation where we will walk you through our recruitment process. CTBTO is hiring top talent across a wide variety of scientific and technical fields in seismic, radionuclide, hydroacoustic and infrasound technologies. There will also be a presentation focusing on students and young professionals interested in pursuing a career in these fields. We look forward to meeting you soon!
On-site inspection (OSI) is the final verification measure of the CTBT. Its principal purpose is to detect whether an underground nuclear explosion (UNE) has been carried out. Several geophysical techniques are used in a suspicious area to detect cavities, holes or devices that indicate that a nuclear test took place. One of these techniques is seismic reflection. There is scarce information in the literature about the seismic response of caverns created by UNEs; for this reason, in this work we use wave propagation simulations to obtain the seismic signature of these types of caverns. We solve the elastic wave equation in the time domain using the spectral element method with fourth order basis functions and quadrilateral elements. We show results for models with cavities and caverns corresponding to explosions of 1, 20 and 100 kilotons. In the seismic traces we can observe the effect of these structures: cavities behave like points or diffractor bodies, depending on their size, whereas caverns, which are more complex structures, behave like two concentric diffractor bodies.
Sonicona was contracted by the CTBTO to explore processing methods for the development of on-site inspection resonance seismometry capabilities. The motivation is to analyse seismic earthquake recordings to detect any anomalies indicative of a cavity or rubble zone caused by an underground nuclear explosion. We have developed the onset-delay method for strong earthquakes at regional to teleseismic distances. The underground disturbance changes the plane wavefield of seismic body waves; this change depends strongly on the dimension, shape and depth of the rubble zone/cavity, and on all geological contrasts (heterogeneities, faults, etc.) in the surrounding area.
The data set from Source Physics Experiments 5 and 6 at the Nevada National Security Site, the former Nevada Test Site, was used for the analysis. We were able to identify six anomalies, which are mostly related to geology and man-made underground structures, e.g. tunnel systems. After excluding these anomalies using external knowledge, one anomaly remained, which could be related to the former UNE Tiny Tot, conducted in 1965.
An on-site inspection (OSI) could be triggered in any environment. Environmental factors affect not only the signatures and observables of a nuclear explosion, but also the performance of the inspection team. Pursuant to the work on the OSI Action Plan 2016-2019 Project 1.5 on Operationalization of OSIs in Different Environments, this poster summarizes typical nuclear test explosion scenarios, key elements to be considered when developing OSI capability for different OSI environments and identifies challenging environments, i.e. mountainous areas, high seas, extreme climates and high/densely vegetated areas. Given the significant challenges that lie ahead for the development of OSI capabilities in challenging environments, it is envisaged that with the support of States Signatories and the involvement of the scientific community, a comprehensive OSI capability with strong climatic and environmental adaptability will be established.
Fires are still a major concern in several countries due to the social, environmental and economic damage they cause. The computer vision community is devoting much attention to improving object detection as one of the fundamental challenges in the field. The aim of this research is to propose a system to predict or detect a fire in an on-site inspection area in real time, using a deep learning based monitoring system.
An important aspect of on-site inspection (OSI) is a rigid chain of custody, where the risk of human error is minimized. Since 2016, FOI has conducted four field campaigns in Kvarntorp (Sweden), with the aim of understanding the radioactive xenon background in uranium-rich soil. During these campaigns, we have developed methodologies for sample collection and analysis. During the most recent campaigns, we utilized Wi-Fi controlled soil gas samplers with automatic logging of relevant quality parameters (e.g. pressure, flow, volume, CO2, O2, Rn). Sample information and metadata are transferred to a digital field protocol and written to a RFID-tag when the collected air sample is compressed and transferred to a pressurized bottle. The SAUNA Field system, used for processing and analysis of the samples, has been equipped with a multi-sample inlet module that holds six bottles. The inlet automatically reads the information from the RFID tags, making it easy to start a measurement by selecting samples in the system GUI. This minimizes the risk of human error as well as ensures a tracking of the samples and all associated data. In this work, we aim to present lessons learned and how this could help to improve the chain of custody in an OSI context.
A chain of custody (COC) is required in many laboratories that must analyse forensic materials, in order to assure the reliability of reported results. Maintaining a reliable COC during an on-site inspection can be laborious, but it is mandatory, as it must assure that the evidence is authentic and traceable. The sampling methodology, the transport of the material and all analysis steps, as well as the people involved, must be reported as information in the COC or chain of evidence. Although there is no limit to the number of transfers, it is crucial to keep this number as low as possible. New solutions for the future will be proposed: a sample based COC, which could be an innovative way to eliminate problems with duplicate and/or split samples; a location based COC based on radio frequency identification (RFID); and a container based COC using an electronic data key that can additionally detect any breach of integrity (evidence tampering).
Xenon spike measurements are a key factor in Xe equipment calibration and in the quality assessment/quality control programme for the noble gas cluster. It is very important to have four Xe isotopes in the spike samples (Xe-131m, Xe-133, Xe-133m, Xe-135). The spikes are produced by contractors organized by the Comprehensive Nuclear-Test-Ban Treaty Organization. Unfortunately, due to the very short half-life of Xe-135 and the small amount of Xe-133m in the spikes, together with long spike delivery times and possible customs problems, it is impossible to properly perform ROI calibration for these isotopes on site or even on the manufacturer's premises. Thus, for detector calibration only Xe-133 and Xe-131m are currently widely used. Miscalculations in Xe efficiency calculations and ratio errors lead to difficulties such as false positive events, false alarms, etc. This poster shows new developments and first results of an Automatic Xenon Generator based on 252Cf. This technique allowed a xenon activity of about 1000 Bq to be obtained, which was used for the preparation of calibration sources by a precision dosing method. The spikes can be milked at the Vienna International Centre or on site. The device is portable and has a user friendly interface for operation in automatic or manual mode. The poster will include all calibration process steps for different Xe equipment based on the four-xenon method.
Radiation early warning systems (REWS), which continuously transmit information about the radiation background level, the volumetric activity of radionuclides, meteorological conditions and other measured parameters to a central control post, can significantly reduce the risks of radiation threats to the population in controlled areas. The structure and composition of existing REWS in European countries are analysed, and the main principles for designing modern REWS are summarized and formulated.
The results of the development of advanced alpha, beta and gamma radiation monitors for detecting radioactive contamination in the atmosphere and determining its activity are presented. Gamma radiation is registered in the monitors either by a SrI2(Eu) scintillation detector (Ø1.5”x1.5”, ΔE≤3% at E=662 keV) or by an HPGe detector (30%) with a Stirling cooler (ΔE≤2.0 keV at E=1332 keV), interchangeable without any structural changes to the monitors. To determine the concentration of alpha and beta emitting radionuclides, the monitors use two independent spectrometric silicon detectors with an area of 600 mm^2 (ΔE≤25 keV at E=5.5 MeV); the second spectrometric channel is used for real time compensation of background gamma radiation in the beta energy range. Development results for a large REWS based on these advanced alpha, beta and gamma monitors (including monitors for iodine and water) are also presented.
Noble gas processes are based on the use of adsorbent materials. Among them, silver-exchanged zeolites are the most efficient for Xe extraction. Before a material is used in a process, it is crucial to ensure its performance, especially when it is used in systems operating 24/7. Ag-ZSM-5 (one of the studied exchanged zeolites) is an efficient candidate for enriching Xe, but its long term stability remains an open question. In our study, we attempt to elucidate the relation between the structure of the material (Ag-ZSM-5 from different suppliers and syntheses) and its properties (xenon uptake and deactivation). We investigated structural parameters influencing these properties that have already been reported in the literature. We expect that identification of the ideal structure will enable determination of the best adsorbent to use in a process.
Lessons learned from 20 years of operating the International Monitoring System aerosol sampler/analysers have pointed the way to a new generation of aerosol monitoring equipment. Using new technology and following the sampling/analysis scheme developed for next generation xenon systems, the new aerosol equipment will be more resilient to power failures, will provide much better location in analysis, and will perform better for both very low and very high signal levels. Electrostatic collection uses less power per cubic metre sampled, and the electrostatic collection efficiency could be reduced remotely for extremely high signal levels. A pair of gamma radiation detectors will deliver the required minimum detectable concentrations, but in shorter sample times (e.g. 8 h vs. 24 h), and a new method for packaging samples could give better field-to-laboratory agreement. Progress on the new technology will be presented.
Gamma-gamma coincidence techniques are known to improve the detection of particulate radionuclides relevant for Treaty monitoring purposes. To that end, Pacific Northwest National Laboratory (PNNL, USA) has developed a novel γ-γ coincidence analysis and radionuclide quantification software package. The software has been tested for radionuclides relevant to the Comprehensive Nuclear-Test-Ban Treaty and for other radionuclides with complex decay schemes. This presentation discusses the software's details, challenges encountered in its development, and its experimental validation. The validation was performed by measuring 15 radionuclides (including Ba-140, La-140, and Y-88) using the Advanced Radionuclide Gamma-spectrometer (ARGO) located in the Shallow Underground Laboratory (SUL) at PNNL.
GBL15, the UK's certified Comprehensive Nuclear-Test-Ban Treaty Organization noble gas radionuclide laboratory, supports the International Monitoring System (IMS) through the measurement of environmental radioxenon samples using beta-gamma coincidence spectrometry. GBL15 currently utilizes a system comprising NaI(Tl) photon detectors and plastic scintillator electron detectors in a SAUNA system to measure coincident emissions from four radioxenon isotopes of interest: Xe-133, Xe-135, Xe-131m and Xe-133m. A high-resolution electron-photon coincidence detector system comprising high purity germanium (HPGe) detectors and a PIPSBox detector demonstrates improved discrimination between signals and less interference compared to the current system, although with a lower detection efficiency. Here we present the case for an HPGe-plastic beta-gamma coincidence detector system, which can demonstrate improved selectivity but with greater detection efficiency. The minimum detectable activities for the radioxenon isotopes of interest have been quantified with various levels of interference.
As the new generation of International Monitoring System (IMS) radioxenon stations is being deployed, the radionuclide laboratories are seeing an increase in samples sent for reanalysis. These samples include both radioxenon spikes and environmental backgrounds. Additionally, the sample volumes have increased with the new systems to further improve the minimum detectable concentrations. We have developed an upgraded radioxenon laboratory system that is capable of automatically processing four samples from new generation IMS stations in series. The system leverages technologies implemented within the new generation of IMS stations to allow for better state of health tracking and automated measurement without the need for liquid nitrogen for the gas processing. In this presentation, we outline the design of the modernized U.S. radionuclide laboratory system and detail the improved capabilities for sample measurement. Additionally, we highlight the potential for increased sample verification with the updated processing and measurement system.
Prior efforts in developing a silicon beta cell as a potential improvement to scintillating plastic in beta-gamma systems resulted in a design with high resolution, capable of resolving different conversion electron energies in the metastable radioxenons, but with a higher threshold that cut off low-energy X rays. Switching from a photomultiplier tube based NaI well detector to a low-voltage silicon photomultiplier (SiPM) based detector resulted in a significant improvement in the energy threshold, from nearly 45 keV to below 30 keV. When combined with changes to the high voltage bias supply, energy thresholds on the beta cell have been reduced to below 22 keV. The SiPM well detector also drastically improved run-to-run reliability by eliminating the noise dependence on beta cell positioning.
A SAUNA xenon lab system is operated at the Swedish xenon laboratory. The system has been running since 2009 and is used for a large variety of samples, ranging from subsoil samples with stable xenon volumes as low as 0.1 ml up to samples from the SPALAX-NG systems with a xenon volume of up to 6 ml. The high dynamic range of sample sizes puts special demands on the performance stability and calibration of the system. A good characterization of the system for all parts of the process is important to be able to minimize measurement uncertainties. This includes determination of the transfer losses during the sample process, GC-calibration for a wide range of xenon volumes and detector calibration for different sample sizes. Today the system is used for multiple purposes including participation in proficiency test exercises, measurement of samples from field exercises, re-measurement of samples from the Swedish xenon IMS-system and measurement of samples during development of new systems. The system and its calibrations together with results from some of these measurements will be presented.
In this study, we develop a new radioxenon detection system based on beta-gamma coincidence to improve detection sensitivity in laboratory analysis. The designed system is a new prototype that is tested by injecting 222Rn and its daughters (214Pb and 214Bi) as beta-gamma emitters, as well as gaseous Xe-131m sources, which are of interest to the CTBTO. Further, the system is calibrated and checked using a Ho-166m source. The system consists of a NaI(Tl) detector for gamma and X ray radiation and a silicon detector for beta and conversion electron detection. Silicon is used to improve the energy resolution and minimize the potential for memory effect compared to a scintillator detector. A list-mode multi-parameter data acquisition system is used to set up the coincidence parameters. The efficiencies of the two detectors and the MDA are obtained.
Field studies of underground gas migration are an essential component of ongoing research to better understand how radioactive materials produced by underground nuclear explosions transport through and escape from the subsurface. Considerable research has been conducted in the past on understanding long term gas migration behavior driven by natural processes at field sites, but in recent years a greater focus has been placed on studying material transport driven by explosive events at early time. Scientists at Pacific Northwest National Laboratory are planning one such field experiment, which will utilize an injected radioactive xenon tracer driven by pressurized air in the subsurface to map out early time tracer arrivals at various sampling locations. As a critical component of this planned experiment, PNNL has developed a simple but robust xenon quantification system that allows for real time gas monitoring in the field as well as rapid switching between various sampling intervals. Here, the system concept is summarized, and the expected xenon detection sensitivities are evaluated with respect to operational variables such as sampling time, sample rate (i.e. gas pressure), temperature and radon background.
The radionuclide network of the International Monitoring System (IMS), operated by the Comprehensive Nuclear-Test-Ban Treaty (CTBT) Organization, comprises particulate and noble gas analysis. A variety of spectroscopic techniques have been developed and are in use to identify radionuclides which may be indicative of a nuclear explosion. Ongoing efforts to increase the sensitivity of measurements and reduce detection limits have seen the development of more advanced techniques. These include gamma-gamma coincidence spectroscopy to measure low level particulate samples and high resolution beta-gamma coincidence spectroscopy for radioxenon measurements. This poster will present a review of the latest published information on spectroscopic techniques in use and under development in support of nuclear explosion monitoring.
Operated by the Comprehensive Nuclear-Test-Ban Treaty Organization, the International Monitoring System is used by almost 200 nations to monitor for nuclear weapons tests. After more than 20 years, the network is mostly complete; however, the technology utilized for the particulate monitoring component remains practically the same, despite a number of laboratories developing coincidence systems that can offer orders of magnitude improvements in detection sensitivity and reliability. Here we describe the status of the technology and the advantages of implementing it within the International Monitoring System. Furthermore, the performance of a prototype system developed by the Comprehensive Nuclear-Test-Ban Treaty Organization is presented.
Creare has designed, built, and tested an electrostatic precipitator (ESP) collection system for integration into the next generation of monitoring stations, RASA 2.0. The new system design has several significant improvements, including advanced detectors, increased particle collection efficiency, lower power consumption, and potentially shorter collection times. Our advanced two-stage ESP collects atmospheric aerosol particles on a flexible medium. Unlike the impact filter material used on the current RASA system, the particle-laden material can be folded compactly and presented to a coincidence detector system. View factor modeling showed a large improvement in moving from a wraparound detector system to a sandwich coincidence detector system, which can enable large reductions in sampling intervals. Our ESP system has previously demonstrated over twice the standard flow rate of the current RASA system at significantly lower power consumption. At the nominal flow rate of the RASA system, our system consumes about an order of magnitude less power while meeting CTBT requirements for collection efficiency. We have demonstrated reliable material handling in a full-scale lab-based prototype, with up to one year's worth of material loaded on the system. Our system also demonstrated on-the-fly changes of collection efficiency and flow rate.
The ability to evaluate the concentrations of radioactive noble gases in the atmosphere is very important because it represents a powerful early warning tool in the event of a nuclear accident. Since there are currently no laboratories capable of carrying out this type of assessment in our country, a specific system has been set up for sampling, concentration, separation and analysis of radioxenon in atmospheric samples, with the aim of acquiring skills and abilities that could be transferred to the authorities responsible for radiometric surveillance. It will also allow us to assist control activities and alert the population in case of a nuclear accident. In order to correctly evaluate an anomalous detection of radioxenon, it is essential to know the value and the trend of the environmental background in the area of interest. Due to the low concentrations involved, our efforts have been directed mostly at reducing the MDA of our measurement system, trying to improve the sensitivity and reduce the background noise. The methods used and the precautions applied to obtain ever more accurate measurements will be explained. Future plans for improvements will also be presented.
Recent reports have shown that unusual xenon radionuclides can show up in ground based xenon detection systems and lead to false hits on all four of the regularly monitored radionuclides (Xe-135, Xe-133, Xe-133m, Xe-131m). Potential sources of these unusual radionuclides could be high flux neutron sources or spallation neutron sources that are operated in the vicinity of current ground based collection systems. The development of detection algorithms for these unusual xenon radionuclides is needed to prevent false positive detections from diverting analyst time and resources away from potentially legitimate radionuclide detections. The Idaho National Laboratory Noble Gas Laboratory has developed the capability to produce Xe-127. This radionuclide has never been utilized in an interlaboratory comparison exercise. This work carries out a limited interlaboratory comparison exercise to demonstrate the potential for introducing Xe-127 and other unusual xenon radionuclides into future laboratory comparison exercises.
The world's first radioxenon array has successfully been installed in Sweden. The array consists of five detecting sensors (SAUNA Qb), which have been in operation for about two years. Compared to other radioxenon systems, the units are less expensive and easier to install and maintain. The cost of the entire array is similar to that of a single state-of-the-art system. The individual sensors are less sensitive than a state-of-the-art radioxenon system, but when combined, the aggregated verification performance can be high. The measurement units in the array cover a large area, which can lead to more detections and a potential improvement in source location and categorization. In this work, we will present results from the installation, operation and maintenance, and the lessons learnt so far.
Radioxenon monitoring systems are a crucial component of the International Monitoring System (IMS) for the verification of the Comprehensive Nuclear-Test-Ban Treaty. They monitor the atmosphere for potential xenon releases originating from nuclear tests. The efficient collection and purification of xenon from air is essential for their detection capability. The first systems in the IMS used pre-purification techniques to remove moisture and CO2, followed by activated carbon columns to collect and further purify Xe. In some new systems, silver-exchanged zeolites have replaced some of the activated carbon columns due to their much higher Xe adsorption capacity at room temperature. Recent studies on a new class of porous materials, namely metal-organic frameworks (MOFs), have demonstrated high Xe selectivity over other gas components, although under conditions different from those of IMS applications. In the framework of the EU JA VII programme, the potential use of MOFs for Xe collection and purification from air for IMS applications was investigated for the first time. This work has been further expanded over the last two years. The resulting thorough comparison of the Xe collection and purification ability and the underlying material characteristics of two MOFs, two silver-exchanged zeolites and a reference activated carbon will be discussed.
Atmospheric aerosol collection using electrostatic precipitation (ESP) promises power consumption roughly one order of magnitude lower than filter paper based collection methods. For monitoring radionuclides in the atmosphere, this translates to an order of magnitude improvement in system sensitivity for the same power consumed. Alternatively, the lower power requirement enables small portable systems, with reduced battery and solar power needs, to operate for extended periods in the field at airflow rates matching those of filter-based systems, which are challenging to operate in the field for extended periods. PNNL is currently developing a small, portable ESP-based radio-aerosol monitoring system. This presentation will cover the system design, projected monitoring capability, and the current status of the system build and testing.
Xenon International is a next generation radioxenon monitoring system developed at PNNL and manufactured at Teledyne Brown Engineering (TBE) to strengthen nuclear test monitoring. It has recently completed Provisional Technical Secretariat (PTS) testing and was accepted as a qualified system for the International Monitoring System (IMS). Xenon International processes samples every six hours, generating over 2.5 cc of xenon gas that is counted in a beta-gamma coincidence detector for 12 hours, resulting in unprecedented detection limits for radioxenon isotopes. Phase 1 testing was conducted at TBE and Phase 2 at radionuclide station RN33 at Schauinsland, Germany. Xenon International completed acceptance testing with >96.93% uptime and, during Phase 1 of testing, routinely detected radioxenon isotopes never before seen at an IMS station, including Xe-125, Xe-127, and Xe-129m. This talk will discuss Xenon International performance during PTS testing and will discuss expected and unexpected radioxenon isotope detections.
This work presents the results of the development and experimental investigation of the sampling unit of an automated system for monitoring the Ar-37 content in the soil and the surface atmosphere. The adsorption and separation characteristics of materials for air separation and argon extraction from an air sample by the volumetric method have been studied. CaA (5A), PSA/VPSA (13XHP), NaX (13X), ZSM-5, and Ag-ZSM-5 zeolites, as well as coconut charcoal and activated charcoal, were used as materials. Adsorption isotherms of the main air components (Ar, N2, O2) on these sorbents at room temperature in the pressure range from 0 to 10 bar were obtained. Based on the study, a two-stage pressure swing adsorption system was created to obtain enriched argon. The argon concentration in the enriched flow was measured as a function of the filling, blowing, discharge, and regeneration times, sorbent types, and adsorber volumes.
A methodology of particle grouping based on a mathematical model enables manipulation of the particle size distribution through grouping and coagulation, which in turn increases the filtration efficiency and detection capability of monitoring systems.
This study was conducted as a combined theoretical and experimental one. The current prototype consists of an inlet for particles (of micron and sub-micron diameter) and a wavy tube combined with oscillations created by a rotating disk at the entrance. The purpose of this setup is to encourage the aggregation of sub-micron particles into larger particles to increase their filtering efficiency. The experimental results indicate particle grouping in the system. It is concluded that this concept can serve as a tool for enhanced filtering efficiency of the grouped particles and thus can lead to increased efficiency of the particulate monitoring system.
The CTBT-relevant radionuclides were selected using decay data from around 1998 to 2000. Now, about 22 years after the selection, evaluated decay data are available that are even more reliable than they were at that time. The latest evaluated decay data (half-lives, decay modes, γ-ray energies, and their emission rates) were compiled for the 96 CTBT-relevant radionuclides and 49 background radionuclides encountered in γ-ray spectrometry such as particulate monitoring. For the compilation, Decay Data Evaluation Project (DDEP) data, which contain the latest evaluated data for radionuclides with high applicability, were used, together with Evaluated Nuclear Structure Data File data for radionuclides without available DDEP data. The comparison of the compiled data set with the data used for the selection of the CTBT-relevant radionuclides showed relative differences of up to 36% for the main γ ray emission rate, up to 10% for the half-life, and up to 0.1% for the main γ ray energy. In addition to the CTBT-relevant and background radionuclides, a compilation of γ ray data for 18 radionuclides used for calibration and quality control measurements of γ ray spectrometers was also prepared.
Gain stability of radioxenon systems is critical for accurate measurements of the activity. There are several methods for processing the daily quality control (QC) beta-gamma two-dimensional (2-D) spectra used to check the energy calibration. Fitting to a template appears to be the most reliable method for monitoring calibration stability. The possibility of using a template for a 2-D spectrum has been investigated. The template is created by locally weighted scatterplot smoothing (LOWESS) of a 2-D LONG QC spectrum (a histogram), since it is necessary to smooth out the LONG QC spectrum histogram. Daily fitting of the template to the QC spectrum using the weighted least squares method gives the “tweaking” factors of the gain shift. A number of calculations have been carried out based on the results of measurements of the MIKS QC source (Cs-137 and Ba-133). Different approaches have been compared. The template-fitting method provides the most stable results even with poor statistics.
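To make the template-fitting idea concrete, the sketch below (a simplified illustration, not the authors' MIKS code) smooths a high-statistics LONG QC spectrum with LOWESS and then fits a daily QC spectrum to the template by weighted least squares, treating the gain factor as a free parameter. A 1-D projection of the 2-D histogram is assumed for brevity.

```python
# Minimal sketch: LOWESS template from a LONG QC spectrum, then a weighted
# least-squares fit of a daily QC spectrum to estimate a gain "tweaking" factor.
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess
from scipy.optimize import least_squares

def build_template(long_qc_counts, frac=0.05):
    """Smooth a high-statistics LONG QC spectrum into a template."""
    channels = np.arange(long_qc_counts.size, dtype=float)
    smoothed = lowess(long_qc_counts, channels, frac=frac, return_sorted=False)
    return channels, np.clip(smoothed, 0.0, None)

def fit_gain_shift(daily_qc_counts, template_channels, template_counts):
    """Fit amplitude a and gain g such that daily_qc(ch) ~ a * template(g * ch)."""
    channels = np.arange(daily_qc_counts.size, dtype=float)
    # Poisson weights: 1/sigma with sigma ~ sqrt(N), avoiding division by zero.
    sigma = np.sqrt(np.clip(daily_qc_counts, 1.0, None))

    def residuals(params):
        a, g = params
        model = a * np.interp(g * channels, template_channels, template_counts)
        return (daily_qc_counts - model) / sigma

    result = least_squares(residuals, x0=[1.0, 1.0])
    amplitude, gain = result.x
    return amplitude, gain   # a gain factor != 1 indicates a calibration drift
```

The same idea extends to the full 2-D beta-gamma histogram by fitting independent gain factors on the beta and gamma axes.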
One of the most important issues in radionuclide monitoring technology is the analysis of radionuclide data from the International Monitoring System (IMS). The number and activity concentrations of detected CTBT-relevant radionuclides determine the likelihood that a nuclear event has occurred. Fission and activation products on the standard list of CTBT-relevant radionuclides decay to nuclides that are either stable or unstable; some of them decay to radionuclides that are themselves among the 83 CTBT-relevant radionuclides. Considering the different decay behaviour of these radionuclides in the mother and daughter states, the estimation of the zero time of a nuclear event can be improved by discriminating between these two decays. In this study, CTBT-relevant fission radionuclides that decay to unstable radionuclides which are also among the 83 CTBT-relevant radionuclides are the focus, and their decays over time are investigated using the Bateman equations for the mother state, the daughter state, and the combination of these states. Then, the ratio of the concentration of the intended radionuclides to that of the fission radionuclides is obtained for two scenarios: detection in the daughter state, or detection in the combination of mother and daughter states. Finally, the occurrence probability of a nuclear event and its time are evaluated in these two scenarios.
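As an illustration of the underlying decay arithmetic, the following sketch evaluates the standard two-member Bateman solution for a mother-daughter pair of CTBT-relevant nuclides; the Ba-140/La-140 half-lives used are approximate and the initial activities are arbitrary.

```python
# Illustrative two-member Bateman solution: mother -> daughter decay chain.
import numpy as np

def bateman_pair(a_mother0, half_life_mother, half_life_daughter,
                 t, a_daughter0=0.0):
    """Return (A_mother(t), A_daughter(t)) for a simple mother->daughter chain."""
    lam1 = np.log(2) / half_life_mother
    lam2 = np.log(2) / half_life_daughter
    a_mother = a_mother0 * np.exp(-lam1 * t)
    # Daughter activity: ingrowth from the mother plus decay of any initial
    # daughter activity (standard Bateman result for a two-member chain).
    a_daughter = (a_daughter0 * np.exp(-lam2 * t)
                  + a_mother0 * lam2 / (lam2 - lam1)
                  * (np.exp(-lam1 * t) - np.exp(-lam2 * t)))
    return a_mother, a_daughter

# The daughter-to-mother activity ratio grows monotonically from zero towards
# transient equilibrium, so inverting it yields an estimate of the zero time.
t = np.linspace(0.0, 30.0, 301)                   # days since the event
a_m, a_d = bateman_pair(1.0, 12.75, 1.68, t)      # e.g. Ba-140 -> La-140 (approx. half-lives in days)
ratio = a_d / a_m
```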
Measuring radionuclide releases from underground nuclear explosions is extremely challenging given the likely low release and increasing background releases. This becomes even more challenging when measurement uncertainties are not fully understood and can make decisions based on measurements questionable. The work presented here will lay out a holistic uncertainty model that includes terms ranging from the initial detector calibration to physics constants to counting statistics. The uncertainty framework is applicable at low counting statistics near or at background levels and, although the uncertainty framework is based upon United States developed systems, it will be applicable to a broader range of systems.
Environmental radioxenon measurements for nuclear explosion monitoring use the net count method to determine activity concentrations and rely on detector sensitivities for each of the radioxenon isotopes. This method requires a detector background measurement to account for environmental radioactivity, which is only performed during the initial installation or during a recalibration. Currently, there are no other standards for how often the detector background should be updated. We report on first-year observations of the backgrounds. The count rates of individual detectors and the coincidence counts provide an important metric for identifying anomalies and their causes, which could impact both the minimum detectable concentration of the system and the accuracy of the activity concentration measurement.
Isotopes of radioxenon are prevalent in the atmosphere and present a “background” signal which can make radioxenon detection analysis and event reconstruction analysis more difficult. In recent years, advances in software have allowed Bayesian reconstruction techniques to be applied to radionuclide detections to determine the source parameters. This work focuses on the effort to reconstruct radioxenon sources amid a high background signal using novel event analysis techniques and inverse atmospheric transport modelling simulations. Here we define a background signal based on estimated nuclear reactor releases, combine it with a “real” signal, and then reconstruct the source parameters using a software tool known as FREAR.
There are several methods for analysis of radioxenon beta-gamma coincidence spectra. The most important approaches to estimating the activity using beta-gamma coincidence spectra are the Net Count Calculation (NCC) and the Standard Spectrum Method (XeMat and ROI simultaneous fitting method). Within these approaches, algorithms differ in many ways and can provide different estimates of the activity. A comparison of different methods and algorithms has been carried out in this study. Activity and MDC calculations using different methods were performed and the results were compared. Evaluations have been made using real data from the MIKS. It has been shown that if the count rate is high, all methods are consistent and similar, but if the statistics are poor, simultaneous fitting and matrix algorithms using a priori information are preferable.
Samples from International Monitoring System (IMS) stations which contain multiple CTBT-relevant radionuclides with abnormal activity concentrations (Level 5 for particulates) are sent to IMS radionuclide laboratories for further analysis. Since a reanalysis of spectra at IMS radionuclide laboratories might enhance the reliability of analysis results, it is proposed to investigate a method that can be used to identify potentially associated samples (that are most likely not categorized as Level 5) at the same or at neighbouring stations that may have received radioactivity from the same release event as the Level 5 sample. We investigate the methods to associate the Level 5 samples at an IMS station with the other samples at the same and at neighbouring stations using the source-receptor sensitivity (SRS) fields produced routinely for each sample by atmospheric transport modelling (ATM). This investigation can help to determine a suitable method for adding associated samples to the Standard Screened Radionuclide Event Bulletin (SSREB) of a Level 5 sample. This method can also be used to implement the triggering condition for sending samples to IMS radionuclide laboratories which may contain radioactivity from the same release event.
For the enhancement of International Data Centre products, specifically the Standard Screened Radionuclide Event Bulletin (SSREB), an important step is to establish methods to associate the detections of CTBT-relevant nuclides in different samples with the same release, in order to characterize its source for the purpose of nuclear explosion monitoring. Episodes of anomalous activity concentrations at International Monitoring System (IMS) radionuclide stations are a first indication that samples may be related to the same release. For multiple isotope observations, the consistency of their isotopic ratios in subsequent samples with radioactive decay is another plausible hint of one unique release. We show case studies of consecutive Level C samples and prior/post Level B samples used to demonstrate the effectiveness of sample association, in which atmospheric transport modelling (ATM) is applied to identify the air masses that link the release to multiple samples. This approach forms the stepping stone to defining analysis procedures and criteria for automatic sample association for the SSREB, which is relevant for expert technical analysis.
The isotopic ratios of radioxenon can be useful for discriminating between CTBT-relevant radioxenon detections and those from civil nuclear facilities. In this presentation, the isotopic ratio distributions of emissions from civil nuclear facilities are evaluated first, in order to test and demonstrate this methodological approach. Second, the source-receptor sensitivity fields calculated operationally with atmospheric transport modelling are utilized to determine the atmospheric transport time distributions between these facilities and the International Monitoring System (IMS) radionuclide stations. Then, the isotopic ratio distributions that can be expected for measurements at IMS stations are calculated by folding these two kinds of distributions (emission and atmospheric transport time) while applying the radioactive decay equations. Finally, we compare these calculated isotopic ratio distributions with the real isotopic ratio distributions of measurements at IMS stations. This investigation can help to develop methods for screening by distinguishing between normal (based on known sources) and anomalous isotopic ratios. It may also be useful for discriminating between CTBT-relevant radioxenon detections and estimated observations based on emissions from known nuclear facilities, as part of the effort to develop a Xenon Background Estimation Tool (XeBET).
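The folding step can be illustrated with a simple Monte Carlo sketch: sample a source isotopic ratio and a transport time, apply the decay correction, and accumulate the resulting station-side ratio distribution. The distributions and half-lives below are placeholders, not values from the study.

```python
# Hedged Monte Carlo sketch of folding an emission isotopic-ratio distribution
# with a transport-time distribution while applying radioactive decay.
import numpy as np

rng = np.random.default_rng(seed=1)

HALF_LIFE_DAYS = {"Xe-133": 5.25, "Xe-131m": 11.9}     # approximate values
LAM = {k: np.log(2) / v for k, v in HALF_LIFE_DAYS.items()}

n = 100_000
# Hypothetical stand-ins for a facility's emission ratio distribution and the
# SRS-derived transport-time distribution to a given IMS station.
ratio_at_source = rng.lognormal(mean=np.log(50.0), sigma=0.5, size=n)  # Xe-133 / Xe-131m
transport_days = rng.gamma(shape=4.0, scale=1.5, size=n)

# Each isotope decays during transit, so the ratio observed at the station is
# the source ratio multiplied by exp(-(lam_numerator - lam_denominator) * t).
decay_factor = np.exp(-(LAM["Xe-133"] - LAM["Xe-131m"]) * transport_days)
ratio_at_station = ratio_at_source * decay_factor

print("median ratio at source :", np.median(ratio_at_source))
print("median ratio at station:", np.median(ratio_at_station))
```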
The International Monitoring System radionuclide particulate stations of the CTBTO network are equipped with one of the following three system types: CINDERELLA, RASA and manual. Each system commonly involves sampling, decay and acquisition processes. Whereas these processes are conducted fully automatically in both the CINDERELLA and the RASA systems, ensuring high reproducibility of the measurement geometry, manual systems require human intervention. During the acquisition process, specifically, the decayed sample is manually placed on the detector endcap by operators. Since a sample holder is not necessarily used at the stations, the repeatability of the measurement geometry can be compromised by mispositioning of the sample, with the consequence of introducing bias into the calculated activity concentrations. This study aims at investigating the impact of variations in sample positioning on the calculated activity concentrations of radionuclides in measured samples, using the Virtual Gamma Spectroscopy Laboratory (VGSL), a Monte Carlo based simulation tool. Possible scenarios were simulated to investigate whether variability in sample positioning has a noticeable impact on the isotope quantification reported in International Data Centre products.
A new goal for radionuclide monitoring, inspired by what is done with waveform processing, is to automatically create a radionuclide event bulletin synthesized from multiple measurements across the radionuclide network. Network measurements are sequentially processed through detection and association, and finally assimilated into an event, which is then documented within a bulletin with supporting data products, ready for an analyst to review. The progress of this development will be reported.
Radioxenon is used to identify underground nuclear explosions by quantifying the amounts and isotopic ratios of Xe-135, Xe-133, Xe-133m, and Xe-131m. Determining these concentrations requires knowledge of the detector performing the measurement and the accurate attribution of each measured decay. The current standard for estimating the activity concentration employs a beta-gamma coincidence histogram combined with 7 or 10 rectangular regions of interest (ROI). These ROI are specific to the detector type, based on the energy resolution and efficiency of the detector. We present an alternative method of count attribution which employs neural networks trained on simulated data to generate a probabilistic assignment for each detected count. These networks are physics-informed and employ Gaussian curve fitting to closely model the expected behavior of radiation detectors. This reduces the number of parameters (hundreds compared to hundreds of thousands in related work) and maintains explainability of the results. Our work demonstrates a method of incorporating machine learning into radioisotope detection that does not require faith in a black-box model.
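The probabilistic assignment idea can be sketched as follows (an illustration, not the authors' network): each isotope signature is represented by a Gaussian template in the beta-gamma energy plane, and each coincidence count receives soft responsibilities proportional to the template likelihoods. Centroids, widths and priors below are placeholder values, not calibrated detector parameters.

```python
# Minimal illustration of probabilistic count attribution with 2-D Gaussian
# templates in the (beta, gamma) energy plane.
import numpy as np

def gaussian_2d(x, y, mu, sigma):
    """2-D Gaussian likelihood with a diagonal covariance."""
    zx = (x - mu[0]) / sigma[0]
    zy = (y - mu[1]) / sigma[1]
    return np.exp(-0.5 * (zx**2 + zy**2)) / (2 * np.pi * sigma[0] * sigma[1])

# Placeholder templates: (centroid, width) in keV on the beta and gamma axes.
TEMPLATES = {
    "Xe-133":     {"mu": (100.0, 81.0),  "sigma": (45.0, 8.0),    "prior": 0.5},
    "Xe-131m":    {"mu": (129.0, 30.0),  "sigma": (8.0, 4.0),     "prior": 0.3},
    "background": {"mu": (250.0, 200.0), "sigma": (150.0, 120.0), "prior": 0.2},
}

def attribute_count(beta_keV, gamma_keV):
    """Return the posterior probability of each class for one coincidence count."""
    weights = {name: p["prior"] * gaussian_2d(beta_keV, gamma_keV, p["mu"], p["sigma"])
               for name, p in TEMPLATES.items()}
    total = sum(weights.values())
    return {name: w / total for name, w in weights.items()}

print(attribute_count(beta_keV=95.0, gamma_keV=80.0))
```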
The CTBTO International Data Centre (IDC) has developed a novel Geant4 based Monte Carlo simulation software for the HPGe detectors in use at particulate systems of the International Monitoring System (IMS). The software, dubbed GRANDSim (an acronym for Geant4-based RAdioNuclide Detector Simulation tool), allows simulation of both coaxial and planar detectors, and includes default definitions of standard measurement geometries and shielding configurations of the three technologies operated at IMS particulate systems. The software simulates efficiency calibration, isotopic response functions and coincidence summing correction factors for any natural and anthropogenic radionuclides of interest. The physical model is automatically optimized by constraining simulation results against experimental calibration at non-summing energies.
Simulated entities are used as support parameters in the automatic processing of daily spectra from IMS for: (a) improving the quality of efficiency calibration (by including coincidence summing corrections), (b) enhancing the nuclide identification results (by including summation peaks) which reduces the workload on analysts in interactive mode, and (c) ensuring reliable activity concentration results by including required coincidence summing corrections when applicable. In addition, GRANDSim simulates gamma spectra for mixtures of radionuclides with any activity concentrations.
The contribution presents the key features of GRANDSim for HPGe detectors in use at particulate systems.
Recent studies emphasize neutron activation as a source of radioxenon emissions which needs to be considered as contributing to the atmospheric radioxenon background. Since activation products have different isotopic ratios than radioxenon from fission, taking activation into consideration affects the determination of the origin of radioxenon detected by the International Monitoring System. The parameters that need to be used in simulations of the activation source include the neutron energy spectrum, the activation cross-section, the neutron flux, the irradiation time and the retention time. Any difference in these parameters changes the resulting isotopic ratios. In this presentation all these parameters are investigated and the most suitable of them are introduced for future studies.
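For orientation, the standard thin-target activation equation shows how these parameters enter the produced activity, and hence the isotopic ratios. The sketch below evaluates it for two hypothetical activation products; all numerical inputs are placeholders, not recommended values.

```python
# Standard thin-target activation equation:
#   A = phi * sigma * N_target * (1 - exp(-lambda * t_irr)) * exp(-lambda * t_ret)
import numpy as np

def activation_activity(flux, cross_section_barn, n_target,
                        half_life_s, t_irr_s, t_ret_s):
    lam = np.log(2) / half_life_s
    sigma_cm2 = cross_section_barn * 1e-24        # 1 barn = 1e-24 cm^2
    saturation = 1.0 - np.exp(-lam * t_irr_s)     # build-up during irradiation
    cooling = np.exp(-lam * t_ret_s)              # decay during retention
    return flux * sigma_cm2 * n_target * saturation * cooling   # Bq

# Ratio of two hypothetical activation products from the same target material.
a1 = activation_activity(1e14, 10.0, 1e20, half_life_s=5.25 * 86400,
                         t_irr_s=7 * 86400, t_ret_s=2 * 86400)
a2 = activation_activity(1e14, 1.0, 1e20, half_life_s=11.9 * 86400,
                         t_irr_s=7 * 86400, t_ret_s=2 * 86400)
print("activity ratio:", a1 / a2)
```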
For the Source Term Analysis of Xenon (STAX) project, an experimental network to measure releases of radioxenon isotopes at the stack of nuclear facilities is being set up. The data are transmitted to a central server, where authorized users can retrieve raw data from the STAX server or view the data via a web browser. In this presentation, an overview of the various interactive data viewing tools of the STAX software is given. The STAX software provides a dashboard overview of the operational status of the network and chart interfaces to view state of health data and isotope release data. Isotope release data can be viewed as time series of emissions or as isotopic ratio plots. For each individual data point, raw data can be downloaded from the data chart and the corresponding gamma spectrum can be viewed. Using results from atmospheric transport modelling (ATM), either in forward or backward mode, concentrations at International Monitoring System (IMS) radionuclide stations can be simulated in order to estimate the impact of emitting facilities on the concentrations measured at IMS stations.
The purpose of a radionuclide expert technical analysis (ETA-RN) is to assist States Parties in identifying the source of a specific event. The output of an ETA-RN is a State Requested Methods Report (SRMR), which builds on routinely generated results from standard International Data Centre products such as the radionuclide reviewed report and the radionuclide laboratory report. The various functionalities in this ETA-RN software suite are all based on isotopic ratios detected in samples collected across the International Monitoring System (IMS) network. The analysis modules include event definition based on radioxenon detections at IMS radionuclide stations, calculation of isotopic ratios using different methods, sample association based on consistency analysis of the isotopic ratio evolution, simulation of release scenarios using Bateman equations, event discrimination based on relationship plots of four/three radioxenon detections, event timing using a function of isotopic ratios over time, and SRMR generation. These functionalities are demonstrated via typical case studies, such as the announced DPRK 2013 events, consecutive Level C samples, and observations during the Fukushima nuclear disaster. In this presentation, we outline the status of the software development as well as analysis methods related to isotopic ratios.
One of the main challenges for the Comprehensive Nuclear-Test-Ban Treaty Organization verification regime is to discriminate anomalous signals generated by underground nuclear explosions (UNEs) from those generated by other sources such as medical isotope production facilities (MIPFs) and nuclear power plants (NPPs). The general method can consist of a procedure starting with the assessment of “anomalous values” of activity concentration and ending with the proper interpretation of the isotopic ratios. The first step towards this goal should be to establish a proper definition of “anomalous values” of activity concentration. A possible statistical approach to address the issue of such anomalous values against the atmospheric background, and consequently to propose a proper definition of them, is suggested by the Italian National Data Centre – Radionuclides to support the purposes of the CTBTO verification regime.
Activity ratios of CTBT-relevant isotopes can be used to discriminate nuclear explosion sources from releases of nuclear facilities and to determine the detonation time under assumed scenarios. The net signals of radioxenon isotopes and their associated uncertainties are estimated by the net count calculation (NCC) method. An alternative approach is regression analysis such as least squares fitting, which enables the deconvolution of X ray contributions from radioxenon isotopes and radon, especially for complicated spectra and low count levels. The determination of the concentrations of these isotopes relies on a robust calibration method. This paper outlines a combined calibration procedure based on four radioxenon spikes. The output of the beta-gamma detector system can be read out as three measurement channels: beta-gamma coincidences, beta singles and gamma singles. All three channels detect the same number of radioactive decays in the 4π measurement geometry. The detection efficiencies are determined by comparing the numbers of counts from the three measurement channels, without the need for a reference value of the radioxenon activity. In this work, the activity values of radioxenon standard sources are first estimated based on the NCC method and then used for calibration of the regression analysis method.
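Under the assumption of independent beta and gamma detection probabilities (and neglecting dead time, background and branching corrections), the three channel totals yield the efficiencies and the source activity directly, which is the essence of the coincidence-based calibration described above. A minimal sketch:

```python
# Coincidence-counting relations for a counting time T:
#   N_beta  = A*T*eps_beta
#   N_gamma = A*T*eps_gamma
#   N_coinc = A*T*eps_beta*eps_gamma
# so the efficiencies and activity follow without a reference activity value.
def efficiencies_from_channels(n_beta, n_gamma, n_coinc, live_time_s):
    eps_beta = n_coinc / n_gamma
    eps_gamma = n_coinc / n_beta
    activity_bq = (n_beta * n_gamma) / (n_coinc * live_time_s)
    return eps_beta, eps_gamma, activity_bq

# Example with hypothetical net counts from a radioxenon spike measurement.
print(efficiencies_from_channels(n_beta=9_000, n_gamma=4_500,
                                 n_coinc=3_600, live_time_s=3_600))
```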
Normal operational releases of radioxenon make the discrimination between radioxenon detections from civil nuclear applications and from nuclear testing a very complex task.
The objective for the short to medium term is to develop algorithms and tools that facilitate understanding of the background. The longer term vision is to eventually develop robust methodologies for determining to what extent radioxenon detections at International Monitoring System (IMS) stations can be explained by the impact of civil sources. In this regard, the radioxenon activity concentrations observed at IMS noble gas sites in 2014 have been further reviewed. The update involved offline reprocessing of the spectral data from all beta-gamma coincidence based systems using the new configuration of the net count calculation (NCC), which reduced the rate of false positives of Xe-131m, Xe-133m and Xe-135. This presentation compiles the results achieved for observations at IMS stations. The statistical analysis of simulated vs. observed data is repeated and compared with the 2014 baseline that was set in a previously published study (Gueibe et al., 2017). In addition, the new dataset has the potential to be used in other research areas, such as radioxenon isotopic ratio studies.
Recent developments of noble gas systems include new detector technologies that exhibit very low background count rates. In this case, radioxenon signals as low as a few counts per day are expected. For such low count measurements, the classical Currie estimates of the measurement decision threshold and detection limit are not accurate enough. In this context, the CEA/DAM has implemented several algorithms (matrix inversion, iterative processes) and continues the effort to improve the data analysis with innovative tools, such as spectral unmixing. Due to the low statistics, it is not convenient to test and compare these algorithms on measured low level radioxenon spectra; therefore a Monte Carlo simulated database of spectra was generated for several detection configurations (high resolution beta/gamma spectra, low resolution gamma/high resolution beta spectra, and low resolution beta/gamma spectra). To optimize the analysis of this database, the spectral unmixing algorithm was ported to Graphics Processing Units (GPU), leading to a drastic decrease in computation time and allowing the processing of large simulated data sets in a reasonable time frame.
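For reference, the classical Currie expressions for a paired background measurement are sketched below; they work well in the Gaussian regime but, as noted above, become inaccurate when only a few counts per day are expected, which motivates the alternative algorithms.

```python
# Classical Currie formulas (Currie, 1968) for the decision threshold L_C and
# detection limit L_D of a net signal with a paired background measurement.
import math

def currie_limits(background_counts, k=1.645):
    """Critical level and detection limit in counts for a paired blank."""
    sigma0 = math.sqrt(2.0 * background_counts)   # std. dev. of the null net signal
    l_c = k * sigma0                              # ~2.33*sqrt(B) for k = 1.645
    l_d = k**2 + 2.0 * l_c                        # ~2.71 + 4.65*sqrt(B)
    return l_c, l_d

print(currie_limits(background_counts=100.0))   # Gaussian regime: adequate
print(currie_limits(background_counts=2.0))     # few-count regime: needs a Poisson treatment
```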
E-poster session with display of each e-poster on an assigned touchscreen
The CTBTO receives, collects, processes, analyzes, reports, and archives data from International Monitoring System (IMS) facilities and makes them available to authorized users. Automatic processing methods and interactive human analysis are applied to raw IMS data on a routine basis in order to produce and archive standard International Data Centre products.
The primary purpose of the IMS data is for nuclear explosion monitoring. However, these data can also contribute to climate monitoring and climate change research, ultimately promoting the resilience of communities and their ability to resist/adapt to climate change. The CTBT and its monitoring data may create opportunities that could help address these issues.
Data from the IMS nuclear monitoring network can inform practical steps that can be taken by States Signatories and international organizations to protect the planet and its people. Climate change is the example used in this presentation to discuss these opportunities. It reviews scientific applications of IMS data that have already contributed to climate research and climate change monitoring. Based on these positive examples, this research makes suggestions on how to further implement the use of IMS data to enhance climate change studies and how these data can continually contribute to monitoring climate change.
In recent years, the connections between climate change and security have drawn a lot of scholarly attention. Climate change adaptation measures frequently fall short of fully integrating peacebuilding or conflict prevention aims, leaving already vulnerable communities further impoverished and less able to withstand interconnected climate and security problems. The CTBT, which promotes a ban on nuclear tests, was opened for signature in 1996. The International Monitoring System (IMS) is a significant part of the Treaty's verification regime and provides a wealth of data that may be utilized for a range of purposes, including climate change research and disaster warning and mitigation. The IMS stations gather signals from a wide range of natural phenomena while continuously scanning the globe for indicators of nuclear explosions in violation of the CTBT. The information may also be used by institutions and scientists from all 183 CTBTO Member States for scientific research on climate change and disaster warnings.
To begin with, the paper will explore the correlation between climate change and global security and will sketch out prospective contributions the CTBTO can make to the issue. Furthermore, the prospects of using IMS data for confidence-building measures with non-signatory states will also be discussed.
A successful tsunami early warning was issued to the public just after the occurrence of the M7.5 earthquake in the Flores Sea, Indonesia, on 14 December 2021. Rapid seismological observation involved 6 CTBTO seismometers in Indonesia, 440 broadband seismometers of the Indonesian Meteorology Climatology and Geophysics Agency (MCGA), 5 borehole seismometers and 167 seismometers of neighbouring countries. Following the Indonesia Tsunami Early Warning System (InaTEWS) protocol, the warning was set to one of four levels, with inhabitants urged to evacuate at the highest warning level.
We produced an automatic shake map, in the form of an intensity level map, based on 31 strong motion accelerometer recordings around the epicentre. We conducted an advanced earthquake relocation procedure on the initial catalogue comprising 750 aftershock events recorded up to 8 days after the mainshock. Earthquake migration over geographical longitude indicated that four earthquake clusters exist. Full moment tensor inversion showed that the source mechanism was dominated by strike-slip movement; however, several thrust mechanisms were also observed. We presume that the dextral movement of the mainshock is the major cause of the minor tsunami. Then, using a seismo-statistical approach, we predict that aftershock productivity will cease, reaching zero aftershocks about 45 days after the mainshock.
The West African region has experienced devastating earthquakes in historical and recent times. Seismic activity with magnitudes ranging from 1.4 to 6.5 has been observed and recorded in the region. This includes the 22 December 1983 Mw 6.3 Guinea earthquake, in which 275 people were killed, more than 1000 were injured and 18,000 were rendered homeless, and the 22 June 1939 M 6.5 earthquake in Ghana, where 17 lives were lost and many structures were destroyed. In recent times, earthquakes of magnitudes ranging from 1.4 to 5.3 have been recorded in Ghana, Ivory Coast, Niger, Mali, Sierra Leone and Cameroon, among others. Seismic data received from the International Data Centre are utilized in compiling an earthquake catalogue for the region. The seismic events are mostly associated with the Romanche, Chain and St. Paul transform faults and the Cameroon volcanic line. The study focuses on recent earthquake activity in the West African region and is designed to raise awareness of the rising seismic activity in the region and to awaken relevant authorities, organizations/institutions, and the general public to the need for timely and sustainable proactive measures for mitigating the risks and disasters associated with earthquakes in the region.
National Data Centres (NDCs) are the national technical organizations competent in advising their governments on the verification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Among the main elements of the NDCs is the waveform analyst staff, who play an important role in the CTBT's verification architecture. In this sense, their continuous training is key to developing capacities and to shortening the learning curve of the next generation. Since 2021, a new training programme for waveform analysts of the Venezuelan Seismological Service has been built around a career profile, which allows the participants, in a first stage, to develop in synchrony and in the shortest possible time the abilities and skills necessary for the analysis of seismic events, based on knowledge of the context in which these events take place and on training in the protocols and procedures inherent to the analysis process and its standardization, raising the level of the staff through the activities leading to their certification. This work shows, within the framework of regional NDC capacity building, how a training programme in the mother tongue helps prepare a cadre of certified analysts who can become waveform analysts of the National Data Centre in Venezuela, contribute to its mission and understand the roles of NDCs in the verification regime.
The purpose of the panel on “Introduction of professional networks promoting female and young experts in STEM and cooperation with CTBTO” is to introduce the audience to networks and activities that are represented on the podium and to inform about their CTBT-related activities supporting the engagement and integration of youth and female experts in the CTBT experts’ community.
No discussion is expected, but Q&A will be permitted.
Tsunamis have a long history of devastation, causing more than 250,000 deaths worldwide during the last two decades. Current rapid warning systems rely mainly on the earthquake magnitude, resulting in over three-quarters of alarms being false. Aiming to reduce false alarms, we have developed a complementary real time early tsunami warning methodology. The methodology is based on analysing acoustic signals under the effects of gravity, known as acoustic gravity waves (AGWs), that are generated together with the tsunami. These signals travel at the speed of sound in the medium, which far exceeds the maximum phase speed of the tsunami. AGWs carry information about the source, which is recorded by remote hydrophones (underwater microphones). Analysing these recordings in real time requires solving both the inverse and the direct problems, which is the main strength behind the proposed warning methodology. In addition, we use machine learning to classify the earthquake's mode of strike. In this talk I shall discuss various AGW theories and applications, both fundamental and applied, and how analysing CTBTO hydrophone data contributed to the development of new mathematical models. Attention will be focused on the real time early tsunami warning methodology, which has been further developed into an operational software.
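A back-of-the-envelope comparison (assuming a typical sound speed in sea water of about 1500 m/s and the shallow-water approximation for the tsunami phase speed) illustrates why the acoustic signal arrives far ahead of the wave:

```python
# Rough comparison of acoustic wave speed and tsunami phase speed sqrt(g*h)
# for a representative open-ocean depth; values are typical, not site-specific.
import math

SOUND_SPEED_SEAWATER = 1500.0     # m/s, typical value
g = 9.81                          # m/s^2
depth = 4000.0                    # m, representative open-ocean depth

tsunami_speed = math.sqrt(g * depth)               # ~200 m/s
print(f"tsunami phase speed : {tsunami_speed:.0f} m/s")
print(f"acoustic wave speed : {SOUND_SPEED_SEAWATER:.0f} m/s")
print(f"head start factor   : {SOUND_SPEED_SEAWATER / tsunami_speed:.1f}x")
```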
Timor-Leste is a small country that lies along the Pacific Ring of Fire. According to Think Hazard, the entire country's exposure to earthquakes is high, which means there is a more than 20% chance of potentially damaging earthquake shaking in the country in the next 50 years. Moreover, as a country with about 706 kilometres of coastline, it also faces the threat of tsunamis. In the last century, three tsunami episodes were recorded along the country's northern coast. Currently, Timor-Leste is partnering with Australia and Indonesia to monitor earthquakes and tsunamis. However, connectivity issues with neighbouring countries' warning agencies have made it challenging to monitor earthquakes or tsunamis in real time. This weakness seriously affects Timor-Leste's ability to provide timely warnings to evacuate or temporarily shut down facilities.
Timor-Leste, through the Civil Protection Authority (CPA), aims to strengthen its early warning system by rapidly acquiring and disseminating critical information on significant earthquakes that can potentially cause a tsunami. Through strong collaboration with CTBTO, the CPA would like to leverage the predictive analytic capacity of the International Monitoring System to issue timely warnings to save more lives.
Bangladesh is located at the northernmost part of the Bay of Bengal, which narrows into a funnel shape at the country's southern coast. Tectonically, the region lies at the junction of the Indian and Eurasian plates, making it one of the world's active tectonic zones. Therefore, an earthquake may occur anywhere in the offshore region and could result in a tsunami in the area. Bangladesh has hosted an International Monitoring System (IMS) station (AS007, BRDH) and a National Data Center (NDC-BD) since 2011, and has hence been invited to establish a Tsunami Warning Agreement (TWA) with the CTBTO, thus benefiting from relevant IMS data to reinforce and complement its early tsunami warning system. However, only the TWCs recognized by the Intergovernmental Oceanographic Commission (IOC) of UNESCO are eligible to enter into such a TWA with the CTBTO. The Bangladesh Meteorological Department (BMD) is recognized by IOC-UNESCO for issuing tsunami warning information in the country. Therefore, the NDC-BD has the opportunity to initiate a proposal to establish a CTBT-TWC in Bangladesh in technical collaboration with BMD, where present NDC-BD users will be able to utilize the TWC data for scientific analysis and interpretation.
The aim of this presentation is to share experiences and lessons learned from our first year of operation, covering build-up and establishment challenges, data products and reports, detection level trends, and data comparisons between the International Data Centre (IDC) and the National Data Centre for testing and capacity building purposes. Three radionuclide stations from Argentina plus one from Chile are being monitored on a daily basis using the tools provided by the IDC for CTBT-relevant radionuclide detection in the region.
The CTBTO has an active capacity building programme to increase the ability of scientists and operators from signatory countries to process and interpret data. While most of the courses are carried out in Europe, a few regional training courses have proven to be excellent ways to integrate people from countries in the region concerned. It is suggested to increase the number of these regional courses and also to hold them in parallel with political courses or meetings, with diplomats and politicians from the countries concerned participating alongside the scientific meetings and training courses. In this way, awareness could be increased among politicians of the importance of active involvement in CTBTO issues.
Integrating countries at a regional level could allow politicians to see the advantages of full membership and could trigger some countries to re-establish payments and others to start contributing to CTBTO. A regional political integration of countries, for instance from the Caribbean or Central America, among other regions, could trigger proposals to create regional funds to pay CTBTO fees.
Scientists should provide technical information to politicians and diplomats from their own countries activating the pillars of science diplomacy: science for diplomacy, science in diplomacy and diplomacy for science.
Sudan signed and ratified the Comprehensive Nuclear-Test-Ban Treaty (CTBT) on 10 June 2004, making Sudan the 172nd State Signatory. The establishment of a capacity building system (CBS) in Sudan is helpful in integrating data acquisition, processing and analysis using NDC in a Box. The different training courses and participation in workshops and conferences play an important role in the educational process using the different software included in the NDC in a Box package (Geotool, DTK-GPMCC, SeisComp3, etc.). Priority is given to data analysis, to developing capabilities to detect nuclear explosion tests, and to using International Monitoring System data, International Data Centre products and other CTBT relevant information that benefit research applications for civil and scientific purposes. The unlimited support of the CTBTO is a crucial factor that keeps the NDC working well; the receipt of new sets of batteries for the CBS and Global Communications Infrastructure equipment is appreciated. The COVID-19 pandemic affected many activities; it had negative impacts on contributions to the different CTBT activities as well as on technical visits that had been scheduled at the end of 2019.
The 1st Nuclear Explosion Signal Screening Open Inter-Comparison Exercise 2021 was conducted from the end of 2021 to the end of 2022. The exercise evaluated different screening methods and procedures to identify radioxenon detections not consistent with the radioxenon background from civil nuclear facilities. It was based on a subset of a radioxenon test data set produced for the whole year of 2014 with hypothetical nuclear underground and underwater explosion signals added to IMS observations.
The exercise considered three levels of participation requiring different levels of expertise: 1) Level 1 (basic, ATM expertise only), for which participants provided simulated radioxenon background time series at the 23 IMS stations defined in the test data set to be used as input for screening based on a set of predefined metrics covering detection, screening, and timing powers; 2) Level 2 (ATM and/or radionuclide expertise), for which, in addition to Level 1, participants provided their own screening methods and results for detection, screening, and timing powers; and 3) Level 3 (higher-level ATM and statistical expertise), for which, in addition to Level 2, results were provided for location and magnitude estimates for a few selected test cases.
Final results of the exercise will be shown.
The CTBTO operates a worldwide network based on four complementary verification methods to detect any sign of a nuclear explosion conducted anywhere – underground, underwater or in the atmosphere. The radionuclide technology is the only one that can confirm whether an event is indicative of a potential nuclear test.
The four CTBT-relevant radioxenon isotopes are fission products. They are measured by stations equipped with noble gas capabilities and play a crucial role when determining whether an event is of CTBT-relevance. Specific measurement systems have been designed in the past decades to match the needs of the network using state-of-the-art technologies. In parallel, tailored analysis methods have been developed and implemented by the International Data Centre to make radioxenon measurement data more and more relevant for CTBT-purposes.
This is a complex task, as a highly variable radioxenon background produced by civil nuclear facilities is likely to interfere with the potential signal of a nuclear explosion, making the identification of CTBT-specific events a challenging task.
This work intends to provide an overview of 15 years of radioxenon monitoring by the CTBTO. Current challenges are also presented, together with the potential for further advancing knowledge and understanding of the radioxenon background.
The CTBTO is setting up and operating a worldwide network of stations to monitor the globe for evidence of potential nuclear tests. Radioxenon plays a crucial role when determining whether an event is of CTBT-relevance. Specific measurement systems have been developed in the past decades and recently enhanced with a new generation of noble gas systems. Tailored analysis methods have been developed and implemented by the International Data Centre (IDC) to make radioxenon measurement data more and more relevant for CTBT-purposes. This is, however, a complex task for the IDC. A highly variable radioxenon background produced by the civil nuclear industry is likely to interfere with the potential signal of a nuclear explosion, making the identification of CTBT-specific events challenging. New technologies such as molten salt reactors may complicate the situation further. There is still huge potential for refining the radionuclide and atmospheric transport methods to better understand the radioxenon background, to make screening more discriminative and to enhance location capabilities. The challenges of radioxenon monitoring will be reviewed and the potential for further advancing knowledge will be discussed.
Meteoroids larger than about 10 cm in diameter produce shockwaves upon entering dense regions of the Earth’s atmosphere, where the local Knudsen number corresponds to the continuum flow regime. Shockwaves generated by the hypersonic flight and fragmentation of meteoroids decay to low frequency sound (infrasound). Given desirable propagation conditions and sensor availability, meteoroid generated infrasound can be detected by microbarometers at large distances. As such, meteoroids serve as a natural laboratory for high altitude sources of infrasound and provide valuable ground truth information that would otherwise not be available. Earth grazers are a rare class of meteoroids; these objects enter at a very shallow angle and traverse the upper regions of the atmosphere, producing a luminous path spanning as much as several hundred kilometers. Unless they completely ablate or slow down enough to fall to Earth, Earth grazers exit back into space at a slower velocity and on an altered orbit. While direct observations of Earth grazers are extremely uncommon, detections of infrasound generated by such objects are even scarcer. We report the infrasound detection and analysis of a rare horizon-to-horizon Earth grazer event that occurred over northern Europe on 22 September 2020. SNL is managed and operated by NTESS under DOE NNSA contract DE-NA0003525.
The 15 January 2022 Hunga, Tonga, volcano's explosive eruption produced the most powerful blast recorded in the last century, with an estimated equivalent TNT yield of 100–200 megatons. The blast energy was propagated through the atmosphere as various wave types. The most prominent atmospheric wave was a long period (>2000 s) surface guided Lamb wave with energy comparable to that of the 1883 Krakatoa Lamb wave; both were clearly observed by pressure sensors (barometers) worldwide. Internal gravity, acoustic gravity, and infrasound waves were captured in great detail by the entire infrasound component of the International Monitoring System network with periods from ~1 h to a few Hz. Such atmospheric waves and selected barometers near the source provide insight on Earth's impulse response at planetary scales. Infrasound waves (<300 s period) were seen to circumnavigate the Earth up to eight times. Seismic, hydroacoustic and tsunami waves were also clearly captured by the seismic and hydroacoustic components of the International Monitoring System network.
The explosive eruption of the Hunga Volcano on 15 January 2022 provided an ideal test case for reviewing established methods to analyse source processes, especially because the key task of discriminating different kinds of explosive sources such as a nuclear test and a volcano eruption can be challenging. Standard techniques were applied to analyse critical events in the frame of the CTBT, i.e. all three waveform technologies (seismology, infrasound, and hydroacoustic) and atmospheric transport modelling of radionuclides. The potential of standard analysis methods to discriminate a source were assessed. This paper shows that the methods applied work well to identify, investigate, and discriminate a critical event. During discrimination it was not only possible to exclude a shear-source (i.e. earthquake) but also distinguish the volcanic explosion in contrast to a human-made explosion. However, some tasks remain difficult with the available methods. These tasks include the estimation of the strength of a non-shear event and thereupon a yield estimation of a possibly critical event. In addition to evaluating these methods, the results were related with specific phases of the eruption process providing a more detailed insight into what happened.
A propagation model for the atmospheric pressure signal generated by the eruption of the Hunga-Tonga-Hunga-Haʻapai volcano (hereafter abbreviated as Tonga) is proposed. The model is used to explain the changes in the waveform of the observed signal with increasing distance from the volcano. It is based on the solution of the linearized Korteweg-de Vries (KdV) equation, which describes the change in the waveform of the Lamb wave as a function of distance from the source. The model signals are obtained as a superposition of the Lamb wave and the acoustic modes calculated by the parabolic equation method for three infrasound stations (IS22, IS24, and IS30). These signals are compared with the observed signals. The energy of the volcanic eruption is estimated from the pressure amplitude and characteristic duration of the signal. This work is supported by RSF grant N 21-17-00021.
One talk withdrawn
More than ten years after its development, the inspection team functionality (ITF) has unquestionably become the conceptual framework adopted by the inspection team during on-site inspection (OSI) field exercises. The expectation is that ITF will successfully guide the inspection team when conducting an OSI once the Treaty enters into force. The concept forms the backbone of the inspector training programme and relevant tools such as the Geospatial Information Management System for OSI (GIMO) deliver the ITF logic, decision and operational cycles. But the origins of ITF were far from easy. How was ITF born? What were the issues that the concept was trying to address? How did ITF move from just providing clarity on mission objectives, to creating a framework to think and work, to encapsulating many other aspects such as a robust search logic, a transparent information-flow mechanism, a controversial distributive leadership concept, and a companion field team functionality framework? This presentation provides a historical retrospective of the development of the ITF concept and explains the pillars that converted ITF into the concept of operations for an OSI.
The Chernobyl and Fukushima accidents have shown that the so-called “co-expertise” process is an effective lever for empowering the people concerned in order to give them the means to make informed decisions concerning their own protection. In the event of an on-site inspection (OSI) under the CTBT, inspectors and support staff of the inspected State Party are likely to encounter radioactive contamination in the environment of the inspection area. This situation has similarities to the experience of residents of an area with radiation hazards due to a past radiation emergency. For this reason, the co-expertise process can be a model for training on-site inspectors and inspection teams and addressing their concerns about the consequences of radiological contamination in the OSI area. After a reminder of the constituent elements of the co-expertise approach, this presentation describes how the latter could be adapted to serve as a support for the preparation for inspection interventions.
The success of an on-site inspection depends largely on two factors: identifying the locations of interest within an area of approximately 1000 km2 and evaluating time critical signatures enabling the pinpointing of ground zero. Visual observation is the technique that has been developed to identify locations of interest in a rapid assessment manner. This process allows the inspection to be concentrated to a manageable extent, where detailed studies are possible within the constraints. However, the tropics pose a grave challenge to this maneuver, as ground visibility becomes almost zero in areas with thick tree canopies. Torrential rain is commonplace in the tropics and acts as a natural “eraser” for many on-site inspection (OSI) related observables. Time critical signatures, such as seismic aftershocks, tend to decay rapidly in thick, wet, soft overburdens where energy absorption levels are very high. One of the most critical aspects of an OSI is the presence of radionuclide material in the samples. Rain coupled with erratic wind patterns tends to displace the potential source(s) and redeposit material elsewhere within a very narrow time window. Hands-on experience of an Integrated Field Exercise in the tropics would provide critical and decisive experience towards adapting conventional procedures and enhancing the inspection team functionality to cope with diverse environments.
The GIMO platform facilitates the implementation of on-site inspection (OSI) search logic by providing the framework and tools to meet the requirements of inspection team and field team functionalities, and data flow. This paper reports on the various instances of GIMO developed to cater for requirements at the operational support centre, the base of operations and the inspection area – including the laboratory – as well as for data classification status. The architecture used as a basis for GIMO development, deployment and operation is summarized. Data security considerations are highlighted, with particular emphasis placed on the transition to the use of zero clients in the ‘working area’ for the processing of data and the development of ‘kiosk’ applications on tablets that restrict access to relevant applications only. Recent functionality developments are also presented, including updates to spatial viewing tools; the ability to view technique specific metadata in the ‘working area’; workflows and tools to support the revision of on-going mission proposals; and tools to support the preparation and dissemination of technical mission reports and search zone summary reports within the inspection team.
Recent efforts have explored the use of high resolution beta detectors for improved metastable isomer discrimination. These detectors have so far been aimed at implementation within IMS stations for detection of all four radioxenon isotopes. For radioxenon laboratories, the duration between collection and measurement is longer. Depending on the shipping duration and sample type (spike versus environmental sample), only Xe-133 and Xe-131m may be present in samples sent from an IMS station. Depending on the isotopes present, the choice of nuclear detector may therefore favour one type over another. In this presentation, we evaluate the implementation of plastic scintillator beta cells and silicon beta cells for radionuclide laboratory operation. As part of this, we evaluate the impact of potential sample activities and isotopic ratios.
The beta-gamma coincidence method is used to detect and measure radioxenon isotopes in the field, greatly reducing the background and increasing sensitivity. Research has been performed to explore further background reduction using the spatial information provided by pixelated, voxelated, or segmented detectors. This includes segmented and pixelated silicon detectors for beta or electron detection and voxelated cadmium-zinc-telluride (CZT) detectors for gamma or X ray detection. Segmentation or pixelation, in addition to providing spatial information, reduces the detector capacitance, allowing for lower thresholds at room temperature and providing access to low energy beta, electron, and X rays. With higher levels of pixelation, such as in a charge coupled device, the spatial information can provide particle identification, allowing discrimination of alphas and muons from the beta and gamma signals of interest. The potential for increased sensitivity will be presented along with measurements from prototypical detectors.
This study describes a new 4πβ-γ detection system designed to realize primary methods for the measurement of the activity of radionuclides. The characterization of the detectors, plastic scintillators for the β-detector and a NaI(Tl) detector for the γ-detector, has been investigated using Monte Carlo GEANT4 simulation and experimental measurement. In addition, an optimum configuration of the plastic scintillator detector was examined to maximize the electron detection probability while minimizing the photon interference probability. A hardware configuration of the 4πβ-γ instrument which utilizes a CAEN N6751 digitizer as the digital acquisition device has been implemented. The digitizer applies a pulse shape discrimination procedure to process detector signals and generates binary list-mode files that represent the measurement data of the detectors. An offline analysis technique for coincidence counting, based on a Python program, was written to analyse the list-mode data. This technique applies time settings (such as dead time and resolving time), the coincidence counting algorithm and its corrections, background and decay corrections, activity concentration calculation, and efficiency extrapolation with a weighted linear fit. Using this technique, accurate absolute radioactivity measurements can be performed repeatedly offline, since the raw detector signals are measured experimentally only once, reducing time and unwanted fluctuations in the experimental measurement.
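As an illustration of the offline coincidence step described above, the following is a minimal Python sketch, not the authors' program: it assumes the list-mode files have already been reduced to arrays of event timestamps (in seconds) for the β and γ channels, and uses simplified, hypothetical values for the resolving time and a non-extending dead time. In the idealized coincidence method, the product Nβ·Nγ/Nc then approximates the number of source disintegrations in the live time, before background and decay corrections.

import numpy as np

def coincidence_count(t_beta, t_gamma, resolving_time=1e-6, dead_time=5e-6):
    """Count beta-gamma coincidences from list-mode timestamps (seconds).

    resolving_time : maximum |t_beta - t_gamma| accepted as a coincidence.
    dead_time      : non-extending dead time applied per channel.
    Assumes both channels recorded at least two events.
    """
    def apply_dead_time(times, tau):
        kept, last = [], -np.inf
        for t in np.sort(times):
            if t - last >= tau:          # keep only events outside the dead time
                kept.append(t)
                last = t
        return np.asarray(kept)

    tb = apply_dead_time(t_beta, dead_time)
    tg = apply_dead_time(t_gamma, dead_time)

    # For each beta event, find the nearest gamma event and test the coincidence window.
    idx = np.clip(np.searchsorted(tg, tb), 1, len(tg) - 1)
    nearest = np.where(np.abs(tg[idx] - tb) < np.abs(tg[idx - 1] - tb),
                       tg[idx], tg[idx - 1])
    n_c = int(np.sum(np.abs(nearest - tb) <= resolving_time))
    return len(tb), len(tg), n_c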
Many different uncertainty sources can contribute to errors in the estimation of infrasound wave parameters (azimuth, trace velocity, amplitude). It is important both to understand how these sources can affect infrasound measurements and to minimize their effects. One such source is the detector, which comprises the wind noise reduction system (WNRS) and the microbarometer. A comprehensive uncertainty analysis was performed to better understand how the detector uncertainty affects the wave parameter estimation using the time delay of arrival (TDOA) algorithm, such as that used by the Progressive Multi-Channel Correlation (PMCC) algorithm. An in situ calibration of the detector was performed using a co-located reference sensor and the ambient signals observed at the site. This in situ calibration can be used to monitor the status of the detectors and to provide feedback to the station operators. In addition, the calibration results were used to correct the raw signal data for retrieval of the corrected wave parameters. Experiments were performed at the IS26 site using a temporary WNRS to provide quantitative measurements of the effects of the WNRS and its calibration on the wave parameter estimation. These were compared to the IS26 measurements, demonstrating that accurate measurements can be retrieved using the calibration.
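As a companion to the discussion of wave parameter uncertainty, the following minimal Python sketch shows the least-squares plane-wave step that underlies TDOA-based estimation of back-azimuth and trace velocity; the four-element geometry and noise-free delays in the example are hypothetical, and this is not the PMCC implementation.

import numpy as np

def plane_wave_fit(coords, delays):
    """Least-squares slowness fit for a plane wave crossing an array.

    coords : (n, 2) sensor positions in metres (east, north), relative to a reference sensor.
    delays : (n,) arrival-time delays in seconds relative to the same reference sensor.
    Returns back-azimuth (degrees, clockwise from north) and trace velocity (m/s).
    """
    # Plane-wave model: delay_i = s . r_i, with s the horizontal slowness vector.
    s, *_ = np.linalg.lstsq(coords, delays, rcond=None)
    trace_velocity = 1.0 / np.linalg.norm(s)
    # s points in the propagation direction; the back-azimuth points back towards the source.
    backazimuth = np.degrees(np.arctan2(-s[0], -s[1])) % 360.0
    return backazimuth, trace_velocity

# Hypothetical example: four-element array, wave arriving from the north at ~340 m/s.
coords = np.array([[0.0, 0.0], [1000.0, 0.0], [0.0, 1000.0], [1000.0, 1000.0]])
s_true = np.array([0.0, -1.0 / 340.0])        # propagating southwards
delays = coords @ s_true
print(plane_wave_fit(coords, delays))         # approximately (0.0, 340.0)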
Analysis of seismic data quality provides a vital tool for identifying seismic station problems. Timing accuracy, completeness, and ambient noise levels are considered key data quality parameters. Variations in seismic station noise levels and their deviation from a global noise model affect the capability of seismic event detection. Microseismic noise at a station is expressed by the power spectral density (PSD) and the ambient noise probability density function (PDF). Seismic network operators review data quality by visual inspection of the PSDs periodically, since the data quality might change over time. This process is subjective and demands a significant amount of time and considerable experience. A reliable automatic evaluation makes quality control faster and more objective. This study aims to develop a fuzzy rule based expert interpretation system that can imitate human reasoning and incorporate the operator’s knowledge of seismic data quality. Using features extracted from PSDs and PDFs, the categorization system was built based on fuzzy interpretability rules. Results on real seismic station data showed the robustness of the interpretation and its capability to become part of routine seismic network operation.
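To make the fuzzy rule idea concrete, the following toy Python sketch evaluates two hypothetical rules on PSD/PDF-derived features; the membership functions, feature names and thresholds are illustrative assumptions, not those of the study.

import numpy as np

def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    return float(np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

def quality_score(noise_dev_db, completeness):
    """Toy fuzzy inference for station data quality.

    noise_dev_db : deviation of the median PSD from a global noise model (dB).
    completeness : fraction of available data in the review window (0-1).
    """
    # Fuzzification (hypothetical membership functions).
    noise_ok   = tri(noise_dev_db, -10.0, 0.0, 10.0)
    noise_high = tri(noise_dev_db, 5.0, 20.0, 60.0)
    data_full  = tri(completeness, 0.8, 1.0, 1.2)

    # Rules (min = AND, max = OR), then a simple weighted defuzzification.
    good = min(noise_ok, data_full)
    poor = max(noise_high, 1.0 - data_full)
    return (1.0 * good + 0.0 * poor) / max(good + poor, 1e-9)

print(quality_score(noise_dev_db=3.0, completeness=0.95))   # ~0.74: acceptable quality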
In 1957, the International Geophysical Year, the very first seismic station was installed in Mongolia and seismic monitoring started from then on. The number of seismic stations of the IAG of MAS has been increasing year by year, especially in the last 10 years, following the transition from the old analogue stations to digital stations. In 2013, new stations were installed in Mongolian territory: eight broadband stations, five short period stations and 12 accelerometer stations. The detection capability and location accuracy of the Mongolian seismic network have increased noticeably. The seismicity of Mongolia is recorded by the Mongolian Seismic Network. The network is still sparse, and determining and improving its detection capability and location accuracy is important for seismic event detection in Mongolia. In this poster we will present the improvement in detection capability of the MNDC over the last 10 years.
The primary seismic network of the International Monitoring System (IMS) forms the backbone of the CTBT verification regime. Consequently, the performance of the network needs to be documented. A key parameter in this respect is the event detection threshold, which can vary significantly with time in situations such as high station noise levels, large earthquakes or outages of key stations. NORSAR has, in cooperation with International Data Centre (IDC) staff, developed a web based tool for enhanced analysis and presentation of detectability maps based on the threshold monitoring methodology. Key elements of the application are: display of absolute and relative threshold maps for time intervals selected by the user; options for setting colour scales, contour levels and magnitude scales; and analysis of detection thresholds for any user selected geographical area. The primary users of the web tool are National Data Centres, the CTBTO Preparatory Commission and other authorized users having access to IMS data and IDC products. We will demonstrate the functionality of the system and show examples of absolute and relative detection thresholds at global, regional and local scales.
A devastating earthquake doublet occurred on 6 February 2023, with moment magnitudes of Mw 7.9 and Mw 7.7, along the East Anatolian Fault (EAF) and Sürgü-Çardak Fault (SCF), respectively. The 2023 earthquake sequence resulted in catastrophic loss of human life and economic losses, and caused major impacts to infrastructure throughout south-east Türkiye and north-west Syria. The kinematics of the ruptures of the doublet were complex, involving multi-scale cascading rupture growth across the hybrid fault segments. We find that the first earthquake (Mw 7.9) nucleated on a previously unmapped fault, the Nurdağı-Pazarcık segment, before transitioning to the EAF, leading to supershear bilateral ruptures on the initial branch and the Pazarcık and Erkenek segments, and subshear rupture on the Amanos segment. The dynamic stress of the leading branch rupture impulsively triggered the EAF segments, accelerating the subsequent bilateral supershear rupture of the second earthquake (Mw 7.7) along the curved fragments of the SCF with dominant westward rupture directivity, which stopped abruptly at geometric barriers at both ends of the fault. Hence, the geometry and pre-stress level of the multiple segments shaped the diverse rupture characteristics of the 2023 south-east Türkiye earthquake doublet, amplifying the ground shaking intensity and contributing to the associated devastation.
The 15 January 2022 volcanic eruption of Hunga, Tonga produced atmospheric waves that astonished both scientists and the general public. These waves were observed globally by a multitude of instruments and technologies and were nearly ubiquitous across the IMS infrasound network. The most notable atmospheric wave was the Lamb-wave, which is an acoustic-gravity wave that is associated with extremely large atmospheric explosions. The Lamb wave was detected on barometers, infrasound sensors, seismometers, and satellites. This Lamb wave propagated around the globe numerous times and contributed to fast-arriving, hazardous tsunamis that were not forecasted. The Hunga Lamb wave resembled the Lamb wave produced by the 1883 Krakatau eruption, but it was observed by a much denser instrument network. Notably, infrasound waves also propagated around the globe numerous times, and audible acoustic waves were heard out to an unprecedented 9000 km. Current wave propagation models do not sufficiently explain these observations. Here we present some notable observations of the atmospheric waves from the Hunga eruption. We focus on the Lamb, infrasound, and acoustic waves, including those on the IMS and dense geophysical network in Alaska. The atmospheric waves from this eruption provide a landmark dataset for scientists to study for many years.
The 15 January 2022 eruption of Hunga Tonga-Hunga Ha’apai in the Tonga Islands was unprecedented in modern times. It was one of the largest volcanic explosions of the instrumented era and ranks as the most energetic volcanic explosion on Earth since the 1883 eruption of Krakatau (Indonesia). With its ash plume reaching high altitude, an intense volcanic lightning storm, atmospheric waves circumnavigating the globe several times and associated “meteo-tsunami”, and a gravity wave tsunami that travelled throughout the Pacific and was observed also in the Indian Ocean, the Atlantic Ocean and the Mediterranean Sea, it captured the attention of the global scientific community and the public. The cataclysmic eruption of Hunga Tonga Hunga Ha’apai presents a rare opportunity for researchers to explore new multi-disciplinary problems covering diverse aspects on water-magma eruption dynamics, remote monitoring of volcanoes, seismology, hydroacoustics, infrasound, satellite observations, volcanic lightning analysis, tsunami-genesis, and atmospheric impacts. It also leads the scientific community to review associated volcanic hazards including threat assessment and communication.
This panel will discuss which potential additional technologies would be useful and what progress has been made in demonstrating them.
This panel will discuss the eruption sequence, the use of IMS and non-IMS technologies and data, potential additional technologies that have been demonstrated with measurements relating to this event, lessons learned of interest for the IMS, the potential consequences on volcanic ash monitoring & tsunami detection, and the possible interest for collaboration of the CTBTO with international organizations such as World Meteorological Organization (WMO), International Civil Aviation Organization (ICAO), United Nations Office for Disaster Risk Reduction (UNDRR) and Intergovernmental Oceanographic Commission of UNESCO.
The worldwide geophysical instrumentation of the CTBT is important for characterizing nuclear and chemical tests. These observatories have also strengthened the geophysical monitoring capabilities for solid-earth processes such as large earthquakes and volcanic eruptions. However, ensuring the correct functioning of this instrumentation is hard under the extreme conditions of temperature, humidity, vegetation coverage and geomorphology of the tropical forest. The site Juntas de Abangares, Auxiliary Station 25 (JTS), is an example of the needs and solutions that must be addressed under climatic conditions such as those of the Costa Rican jungle. We found that developing a redundant infrastructure for both seismic recording and telecommunications (VSAT and cellular modem) allows real time transmission while providing a robust backup system. Under extreme conditions the use of UTP cables for data transmission can generate long period (120 s) electronic noise that dramatically affects the quality of the data. Replacing the UTP cables with fibre optics reduced the effects of the high temperatures and humidity of the tropical climate on the data. Furthermore, the use of external antennas helped to stabilize data transmission, since the current bunker infrastructure decreased the power of cellular communication and the transmission of seismological data.
In Indonesia, the CTBTO auxiliary seismic station network consists of one data centre (NDC-BMKG) and six auxiliary seismic stations. Each station is in a different environmental setting and faces a unique set of troubleshooting challenges. According to the incident history over the last five years, power and on-site communication issues were the most prevalent. The troubleshooting implemented so far has been inadequate, and the situation remains challenging because the issues persist or recur. To achieve a long term solution, the stations need to be redesigned and the existing equipment optimized while considering all existing constraints.
Active seismic survey, together with aftershock seismic survey, magnetic and gravitational field mapping and electrical conductivity measurements (Protocol to the CTBT, Part II), is a geophysical detection technology that can contribute jointly to the search logic for the detection of on-site inspection (OSI) anomalies or artefacts underground. This work presents a system solution for active seismic survey based on optical fibre geophone arrays, which meets the technical requirements of OSI ACT. The system consists of 800 channels (up to 1000 channels) of three component seismic geophones working in the frequency band 1-500 Hz (up to 1000 Hz). The vehicle engine driven power system can support several days of operation, and the storage capacity can support several days of measurement. Deep learning based denoising using a Fast and Flexible Convolutional Neural Network (FFCNN) has been applied to weak signal recognition. A user friendly 2-D/3-D human-machine interactive data interpretation software platform has been developed for data visualization and analysis. The system is adaptable to any seismic source, such as explosives, vibroseis, weight drops and sledgehammers. The system has been tested and verified in the field and is suitable for a future OSI Integrated Field Exercise.
The article explores the possibility of applying a mathematical framework to map the radiation field transmission zone in the area of nuclear tests, as well as visualizing the resulting model using virtual reality technology. In turn, the possibility of applying this calculus to study the accumulation of fissile materials in the zone of a supposed nuclear explosion is considered. The authors describe this technology as a basis for analysing the probability of environmental contamination by nuclear products in the area of nuclear tests, as well as the impact of radiation fields on organisms in this environment. While nuclear testing was banned by the 1996 Comprehensive Nuclear-Test-Ban Treaty, the possibility of some countries conducting tests on their territories cannot be neglected. Therefore, there is an increasing demand for digital technology to accurately calculate the physical parameters of a nuclear test and to visually simulate its consequences in order to verify that nuclear explosions have occurred.
Spot check is the final level of the interactive review hierarchy aimed at the improvement of Reviewed Event Bulletin (REB) quality. At the International Data Centre (IDC) this task is conducted by an independent reviewer whose responsibility is to investigate event location anomalies, missing or bogus large events representing invalid data associations or incorrect detections. To facilitate the work of the reviewer, SCT software was developed as an implementation of the master event approach, which allows the reviewer to see how this event has been processed compared to similar events in the historical archive. The information backbone of the SCT is a historical IDC database allowing for the testing of event hypotheses with the archived events. The automatic SCT software generates daily cross-correlation event bulletins and conducts comparison with SELs and REB bulletins. The interactive web based front end of the SCT makes available visualization of the results via maps and tables. SCT also provides a reviewer with an interactive mode of operation, allowing event selection and intuitive processing configuration. Additionally, the SCT tool assists expert technical analysis or special study. The tool is designed to provide various access levels, so internal reviewers, IDC analysts and external users could securely utilize specific SCT features.
9 video rooms in parallel for topics: P2.1, P2.3, P2.5, P3.2, P3.6, P4.1, P4.2, P4.3, P4.4
We present an improvement to the multichannel maximum-likelihood (MCML) method. This approach is based on the likelihood function derived from a multi-sensor stochastic model expressed in different frequency channels. Using the likelihood function, we determine, for the detection problem, the generalized likelihood ratio with a p-value threshold to discriminate the signal of interest from noise. For the estimation of the slowness vector, we determine the maximum likelihood estimate. Comparisons with synthetic and real datasets show that MCML, when implemented in the time frequency domain, outperforms state of the art detection algorithms in terms of detection probability and false alarm rate in poor signal to noise ratio scenarios. However, MCML cannot account for interfering coherent signals in the same time frequency band. Different deep learning architectures are being explored to predict the number of sources in a given time frequency window. We illustrate the potential of such an approach on synthetic and real data from the International Monitoring System to estimate multiple wave parameters associated with overlapping coherent signals from different sources.
In this paper we conduct a study of the on-site estimation of the time of arrival (ToA) of strongly distorted acoustic signals. The propagation of acoustic waves from sources is controlled by a complex interplay between source location, winds, temperature, humidity, atmospheric attenuation, and topography, which are the main factors that lead to signal distortion. An algorithm was developed (usable on-site) that uses several ways of filtering signals and can estimate ToA without relying on cross-correlation of signals or their pseudo sequences. Wavelet signal analysis and the fourth order cumulant are used in order to better extract the useful signal. The ToA values obtained in this way were successfully evaluated in algorithms for acoustic localization at large distances.
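As an illustration of a higher-order-statistics picker of the kind described above, the following minimal Python sketch builds a sliding fourth-order cumulant (excess kurtosis) characteristic function and picks the ToA at its steepest rise; the window length and picking rule are illustrative assumptions, not the authors' algorithm.

import numpy as np

def kurtosis_cf(x, win):
    """Sliding-window excess kurtosis (a fourth-order cumulant for zero-mean data)."""
    cf = np.zeros(len(x))
    for i in range(win, len(x)):
        w = x[i - win:i] - np.mean(x[i - win:i])
        m2 = np.mean(w ** 2)
        m4 = np.mean(w ** 4)
        cf[i] = m4 / (m2 ** 2 + 1e-20) - 3.0     # excess kurtosis of the window
    return cf

def pick_toa(x, fs, win_sec=1.0):
    """Pick the time of arrival as the steepest rise of the kurtosis function."""
    cf = kurtosis_cf(np.asarray(x, dtype=float), int(win_sec * fs))
    return np.argmax(np.gradient(cf)) / fs       # seconds from trace start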
This study aims to explore novel physics-guided neural network (PGNN) algorithms as applied to time-domain source functions (TDSF) of explosions for automated emplacement material classification as well as the estimation of yield (W) and depth of burial (DOB) in a regression formulation. We assume that explosions are detonated at the center of cavities embedded in an infinite homogeneous medium. TDSFs were constructed for different source emplacement conditions using analytical expressions at the elastic radius of each explosion source, where this elastic radius includes both the cavity radius and the non-linear zone created by each explosion. In addition, a tabular dataset for many combinations of W and DOBs was generated containing corner frequency (fc), halfwidth of the displacement wavefield in seconds, peak amplitudes of both broadband and filtered waveforms around 1 Hz, periodicity, and area under the peak pulse for each TDSFs. Note that these parameters follow a non-linear dependence on W and DOB. These TDSFs and tabular datasets are being used to explore the accuracy in the classification of emplacement conditions and post-processing validation of the physics-informed neural network (PINN) with and without governing laws and regularization of the PINN model. We are further evaluating the out of distribution network performance.
This study uses 48 000 synthetically generated time-domain source functions (TDSF) to illustrate the performance of newly developed physics-guided neural network algorithms for classification of the emplacement conditions of the materials in which explosions are detonated. TDSFs were constructed at the elastic radii of many explosions for material properties representing granite, shale, tuff-rhyolite, wet granite and wet tuff emplacement conditions at the source. For each material type, we allowed the yield (W) to vary between 10 tonnes and 2 kt and the depth of burial (DOB) to vary between 100 m and 900 m. The effect of one dimensional wave-propagation path models, including attenuation, was further investigated by convolving the TDSFs with synthetic path seismograms for stations located 100 km to 1200 km away. Different levels of real-time broadband noise and source complexity, comprising both isotropic and non-isotropic sources, were added, and the performance of the algorithm in material classification and in the estimation of W and DOB was compared to that of a neural network algorithm without the governing laws and regularization of the physics-guided method. Results will be presented showing the performance of event classification when applying the algorithm individually to the Pg, Pn, Sn and Lg amplitudes, including the Pg/Lg and Pn/Sn amplitude ratios.
The Comprehensive Nuclear-Test-Ban Treaty (Treaty) of 1996 obliges its signatories not to undertake nuclear weapons tests. Kenya is a signatory to the Treaty and operates an infrasound station and a primary seismic station in Nairobi. The city has a 2.8% urban growth rate and in 2019 its population was 4.3 million. The increase in anthropogenic activities directly impacts the International Monitoring System stations in Nairobi, causing more noise in the data. Artificial intelligence algorithms exist which can separate noise from seismic signals. Machine learning algorithms can automatically learn from such data, identify patterns and make decisions while retaining the ability to analyse previously unseen patterns. Deep learning algorithms can apply multiple layers to extract higher level features from the raw noise data. Stanford University developed CRED for seismic event detection, DeepDenoiser for signal denoising, EQTransformer for phase picking and MagNet for magnitude estimation. Thibaut Perol developed ConvNetQuake, which identified 17 more earthquakes than the Oklahoma Geological Survey. Intelligent algorithms can be a viable strategy to reduce anthropogenic noise and hasten the analysis of data. The costs and technical challenges of implementing these algorithms could be mitigated through suitable partnerships between the Comprehensive Nuclear-Test-Ban Treaty Organization and the relevant research institutions, researchers and the governments of States Signatories.
The classification of low magnitude seismic events is an important task in regional earthquake monitoring. This study focuses on the classification of earthquakes, explosions, and mining-induced earthquakes. A 36-dimensional feature extraction dataset was established through eight types of feature quantization methods, and two-class and three-class models were constructed with the Extreme Gradient Boosting (XGBoost) algorithm. The P/S amplitude ratio has long played an important role in earthquake/explosion classification; the importance scores of the 36 features calculated by XGBoost and Random Forest differ slightly, but the high frequency P/S amplitude ratios all ranked highly. By comparing the performance of classifiers based on the feature extraction dataset and on the waveform spectrum dataset, it was found that the feature extraction method can effectively highlight the differences between different types of seismic events. The accuracies of the classifiers constructed on the feature extraction dataset reach 90%, and the accuracies of the earthquake/explosion and earthquake/mining-induced earthquake classifiers were as high as 97%. Finally, the generalizability of the classifiers was verified using data both within and outside the study area, which showed that the classifiers constructed on the feature extraction dataset had high test accuracies and strong generalizability.
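A minimal Python sketch of the modelling step, not the study's code: training a two-class classifier on a pre-extracted feature table with XGBoost and ranking feature importances (for example, to check whether high frequency P/S ratios dominate). The feature matrix, labels and hyperparameters below are placeholders.

import numpy as np
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# X: (n_events, 36) feature matrix (e.g. P/S amplitude ratios in several bands);
# y: labels, 0 = earthquake, 1 = explosion.  Random placeholders here.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 36))
y = rng.integers(0, 2, size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = XGBClassifier(n_estimators=300, max_depth=4, learning_rate=0.05)
clf.fit(X_tr, y_tr)

print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
# Rank the features by importance score.
print("top features:", np.argsort(clf.feature_importances_)[::-1][:5])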
Detection, localization, and characterization of energetic events in the atmosphere and at shallow depth of burial using infrasonic signals is often performed via an automated pipeline framework with refinement using interactive tools for an identified event-of-interest. Recent updates to the InfraPy signal analysis software suite authored and maintained by infrasound experts at Los Alamos National Laboratory include expanded command line and graphical user interfaces that enable such analysis using an adaptive detection algorithm as well as Bayesian event identification, localization, and characterization methods. Further, recent investigations of a machine learning based approach for infrasound signal detection and classification have demonstrated highly accurate identification of transient and persistent signals. An overview of the InfraPy software suite, in-development machine learning based methods, and example analysis of recent events-of-interest using the software will be presented.
In explosion monitoring, seismic arrays offer significant signal to noise improvements and an enhanced ability to detect and locate important events. This advantage can be significantly degraded, however, in the presence of noise on the array components. For automated processing methods, this may result in a failure to identify an important signal, since many methods use thresholds. A channel with a chronic issue can be eliminated a priori, but intermittent noise is more difficult to isolate.
We are building a tool to perform quality control on seismic arrays over long periods of time, to identify when a channel experiences an intermittent problem and which channel is problematic. The tool exploits a jackknifing method with running singular value decomposition (SVD) for time-windowed array data, followed by SVD clustering to isolate an anomalous channel. The underlying tool has a GUI that extracts array waveforms directly from a relational database. The GUI can apply high-pass or low-pass filtering to the waveforms before performing the SVD/cluster analysis. Times of channel anomalies are reported in a file written to disk and can be viewed on screen. Our goal is to enable users to incorporate the list of times and channels into their automated analysis.
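A minimal Python sketch of the jackknife-plus-SVD idea, under the assumption that coherence is summarized by the energy fraction of the first singular value; the scoring rule and threshold are illustrative choices, not necessarily those of the tool described above.

import numpy as np

def anomalous_channel(window, min_jump=0.05):
    """Flag the channel whose removal most increases waveform coherence.

    window : (n_channels, n_samples) array of band-passed waveforms for one time window.
    Returns the index of the suspect channel, or None if no clear outlier is found.
    """
    def coherence(data):
        # Fraction of total energy captured by the first singular vector.
        s = np.linalg.svd(data, compute_uv=False)
        return s[0] ** 2 / np.sum(s ** 2)

    base = coherence(window)
    gains = np.array([
        coherence(np.delete(window, ch, axis=0)) - base   # jackknife: leave one channel out
        for ch in range(window.shape[0])
    ])
    best = int(np.argmax(gains))
    return best if gains[best] > min_jump else None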
Seismic noise from a variety of nuisance sources frequently contaminates signals of interest. Effectively suppressing this noise is a crucial step in the processing pipeline. In a previous work, Tibi et al. (2021) developed a seismic signal denoising approach that uses a deep convolutional neural network (CNN) model to decompose an input waveform into a signal of interest and noise. While effective, this model, however, was limited in that it was trained on regional data from Utah, using recordings from vertical components only. In this study, we evaluate the transferability of the CNN denoising approach to global regions using data from the International Monitoring System (IMS) seismic station network. To train and test the denoiser, we curate high quality signal and noise datasets of seismograms recorded by stations of the IMS network and use them to construct >100 000 noisy waveforms. Furthermore, we extend the current methodology, which only utilizes the vertical component, to three components, resulting in a denoising model across all three components. In doing so, we demonstrate the validity of using machine learning derived datasets of noise in place of manually curated datasets which greatly reduces analyst time and effort.
Seismic waveform data are generally contaminated by noise from various sources, which interfere with the signals of interest. In this study, we implemented and applied several noise suppression methods. The denoising methods, consisting of approaches based on nonlinear thresholding of continuous wavelet transforms (CWTs), convolutional neural network (CNN) denoising and frequency filtering, were all subjected to the same analyses and level of scrutiny. We found that for frequency filtering, the output SNR decreases significantly faster with decreasing input SNR. For most of the input SNR range, the quality of the output waveform for CNN denoising in terms of output SNR and amplitudes is superior to other approaches. Our results suggest that in terms of degree of fidelity for the denoised waveforms with respect to the ground truth seismograms, CNN denoising outperforms both CWT denoising and frequency filtering. Depending on the purpose of the analyses for which the denoising task is performed, these findings have important implications. For instance, if the purpose of the analysis is to exploit the amplitude information of the seismograms for magnitude, yield, or moment tensor estimation, among the methods evaluated, CNN denoising would be the most suitable approach.
Automatic detection of seismic events in processing pipelines at the International Data Centre and many National Data Centres is mostly done using beamforming on arrays; however, extensive use of single stations can improve the detection capability and accuracy of event location. Advances in deep learning methods enable faster and more accurate processing of large quantities of single station data not seen previously. We use event catalogues including phase picks on a range of arrays in Scandinavia at regional distances (200-2000 km) i.e. up to 3 min separation between P and S arrivals, to train several deep learning models (PhaseNet and EQTransformer variants) using single stations within the arrays. The models are trained on clips of 324 s to capture the multiple arrivals. The models are then applied to various single stations in Norway to assess their generalization. We can detect events at a variety of back-azimuths and distances.
Furthermore, we expand the existing deep learning models to provide predictions for back-azimuth and distance. This imposes physical restrictions on the models, leading to increased picking accuracy for the predicted phase arrivals. Moreover, this enables us to use the vast number of single stations available to efficiently detect and locate distant events.
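Whatever the network architecture, discrete picks are usually extracted from the model's per-sample phase probability output by thresholding and local-maximum search. The following minimal Python sketch illustrates that post-processing step with assumed threshold and separation values; it is not the code of the trained models described above.

import numpy as np

def extract_picks(prob, fs, threshold=0.5, min_separation=1.0):
    """Turn a per-sample phase probability curve into discrete picks.

    prob           : (n_samples,) model output in [0, 1] for one phase (e.g. P).
    fs             : sampling rate in Hz.
    min_separation : minimum time between consecutive picks in seconds.
    Returns pick times (s) and their probabilities.
    """
    above = prob >= threshold
    picks, probs = [], []
    i, gap = 0, int(min_separation * fs)
    while i < len(prob):
        if above[i]:
            j = i
            while j < len(prob) and above[j]:
                j += 1
            k = i + int(np.argmax(prob[i:j]))   # local maximum within the exceedance
            picks.append(k / fs)
            probs.append(float(prob[k]))
            i = max(j, k + gap)
        else:
            i += 1
    return np.array(picks), np.array(probs)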
The International Data Centre (IDC) estimates several types of seismic magnitudes. Two of them are the body wave magnitude mb and the surface wave magnitude Ms. Both measures are significant to the CTBT verification regime as input to discrimination methods between earthquakes and explosions, while mb is also used for yield estimation for a presumed explosion. The IDC, like other institutes, estimates event magnitudes in two steps. First, the event magnitude is estimated for each IMS station detecting the event. The network magnitude is then computed as the average of the station magnitudes excluding outliers. This approach rests on the assumptions that station magnitudes are unbiased and have the same noise level (namely, random estimation errors).
We show that these two assumptions do not hold for mb and Ms as published in the Reviewed Event Bulletin of the IDC. We suggest a different approach, whereby individual stations each have different and unknown biases and noise levels. We present algorithms to estimate these stations' biases and noise levels. We use our estimated station biases as station correction terms, and the estimated noise levels as weights for event magnitude estimation. We show that our approach yields more consistent station and event magnitudes, using Reviewed Event Bulletin data.
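A minimal Python sketch of one way such a scheme could be iterated, assuming a matrix of station magnitudes with gaps: event magnitudes are inverse-variance weighted means of bias-corrected station values, and biases and noise levels are re-estimated from the residuals. This is a simplified illustration, not the algorithm applied to the Reviewed Event Bulletin.

import numpy as np

def estimate_magnitudes(m, n_iter=10):
    """Joint estimation of event magnitudes, station biases and station noise levels.

    m : (n_events, n_stations) matrix of station magnitudes, NaN where not observed.
    """
    obs = ~np.isnan(m)
    bias = np.zeros(m.shape[1])
    var = np.ones(m.shape[1])
    for _ in range(n_iter):
        w = obs / var                               # inverse-variance weights, 0 where unobserved
        ev = np.nansum((m - bias) * w, axis=1) / w.sum(axis=1)
        resid = m - ev[:, None]                     # station residuals, NaN where unobserved
        bias = np.nanmean(resid, axis=0)
        var = np.nanvar(resid - bias, axis=0) + 1e-6
        bias -= bias.mean()                         # biases are only defined up to a common constant
    return ev, bias, np.sqrt(var)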
Array processing is routinely used to measure the apparent velocity and back-azimuth of seismic arrivals at the International Data Centre and many National Data Centres. Both quantities are measured under the plane wave assumption and are used to classify the phase type and to determine the direction towards the event epicentre. However, structural inhomogeneities can lead to deviations from the plane wave character. We suggest a combined classification and regression neural network to determine the phase type and back-azimuth directly from the arrival time differences between all combinations of stations of a given array, without assuming particular wavefield properties. It is trained using P and S arrivals of over 30 000 seismic events from the reviewed regional bulletins in Scandinavia of the past three decades. Models for the ARCES, FINES and SPITS arrays are trained. Very good performance for seismic phase type classification (up to 99% accuracy) and low source back-azimuth misfits are obtained. The SPITS array in Svalbard exhibits particular issues when it comes to array processing, and we show how our new approach better handles these obstacles. Finally, a systematic test of the performance compared to the results of the existing array processing pipeline at NORSAR was conducted for the ARCES array.
The Sliding Information Distance (SLID) metric is a compression based metric that can identify signal arrivals in seismic data. SLID has advantages over existing algorithms that are used for detecting arrival times because it can be used to calculate the certainty of each automated detection and to denote multiple possible arrival times in cases where the detections have low certainty. This information can be used to prioritize detections for analyst review. SLID can also be applied at the event level, using high confidence detections to refine the timing of lower confidence detections at other seismic stations. We present background information about how SLID is applied to seismic data, both at the station and at the event level. We also discuss how the uncertainty information produced by SLID can assist analysts by providing transparency into the algorithm’s arrival time picks, refining the timing of automated picks made by SLID or other algorithms, and directing analysts’ attention to likely arrival times within a particular waveform. All of these features can assist analysts with triaging and analysing seismic data.
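The exact SLID formulation is not reproduced here; the general idea of compression-based change detection can, however, be illustrated by the following Python sketch, which compares compressed sizes of adjacent windows with a normalized compression distance. The quantization and window choices are illustrative assumptions, not the SLID metric itself.

import zlib
import numpy as np

def compressed_size(x):
    """Compressed length (bytes) of a quantized waveform segment."""
    q = np.clip(np.round(x / (np.std(x) + 1e-12) * 32), -127, 127).astype(np.int8)
    return len(zlib.compress(q.tobytes(), 6))

def sliding_information_curve(trace, win, step):
    """Information-distance-like curve: how poorly the past window describes the next one."""
    out = []
    for i in range(win, len(trace) - win, step):
        a, b = trace[i - win:i], trace[i:i + win]
        cab = compressed_size(np.concatenate([a, b]))
        ca, cb = compressed_size(a), compressed_size(b)
        # Normalized compression distance: small for similar windows, rises at an arrival.
        out.append((cab - min(ca, cb)) / max(ca, cb))
    return np.array(out)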
This work expands on a new method, called event-based training (EBT), which is primarily a tool to leverage large datasets with little or no ground truth, to build event discrimination models across the continental United States. We include data from the transportable array and other regional catalogues, as well as a null criterion that enables a model to abstain from decision making in the absence of sufficient evidence. This work also extends Bayesian deep learning to assess the interplay between decision abstention and uncertainty assignments for use by analysts. This research benchmarks how appropriate EBT is for local to regional scale event characterization in the broad absence of abundant ground truth.
The infrasound array in Hungary at Piszkés-tető (PSZI) has been collecting data since 2017. For signal processing, the Progressive Multichannel Cross-Correlation (PMCC) method is used, which has resulted in about a million detections so far. Among these detections there are about 10 000 categorized, hand labelled events from quarry blasts, storms and power plant noise, which constitute the dataset for training and testing. We extracted both time and frequency domain features from the raw waveforms and also calculated PMCC specific features. For event discrimination purposes we tested two machine learning algorithms, the Random Forest and Support Vector Machine methods. These classifiers were trained to separate quarry blasts from storms and from coherent noise from the nearby power plant. We measure the performance of the classifiers with the F1 score and analyse the confusion matrices. Both classifiers reach an F1 score of 0.9.
T wave signals at International Monitoring System hydrophone stations can present complex arrival characteristics caused by bathymetric features along their long range propagation paths, through horizontal reflection, refraction, and diffraction. Thus, the interpretation of recorded T waves can be challenging due to differences between observed and expected arrival times, back-azimuths, or energy intensity. This work presents and discusses the use of high performance computing to simulate T wave propagation using a 3-D broadband parabolic equation (PE) model. A GPU version, a single core embarrassingly parallel version, and a multi-core version of the 3-D PE model are applied to simulate a T wave spectrum represented by up to 1200 frequencies ranging from 1 to 30 Hz. Single frequency model results are then synthesized to construct time series solutions. Finally, modelled results are compared with data recorded at the CTBT IMS hydrophone station HA10 at Ascension Island (Atlantic Ocean) for some specific events. Good agreement between model results and observations is found for the arrival time and back-azimuth of the direct and reflected acoustic paths. Overall, the results highlight the importance of accelerated computing for understanding 3-D effects on T waves propagating in the ocean, which cannot be computed in reasonable times using traditional computing techniques.
Seismic event source identification using recorded signals can be a complex task to solve using classical mathematical methods. Alternatively, many recent research studies have opted for artificial intelligence techniques to deal with this classification problem. Indeed, artificial neural networks, particularly the multilayer perceptron (MLP), are among the techniques that have achieved good classification results. However, the most critical step in using an MLP is feature extraction: the employed features can significantly affect classifier performance. Recent advances in graphics processing units have enabled the implementation of deep learning based classifiers, which remove the need for hand-crafted feature extraction. Consequently, deep learning approaches can be more objective and efficient, as signal features are not specified by the user. More research should be devoted to this field in order to develop more reliable classifiers. The aim of this study is to investigate the performance of a deep neural network for seismic signal classification. To this end, several experiments have been performed on a seismic database of four classes. The results obtained show the ability of this classifier to achieve high accuracy without requiring any subjective signal pre-processing.
Large earthquakes and aftershock sequences substantially increase the burden of those who monitor seismic data for anthropogenic events. Fortunately, most aftershocks exhibit high degrees of similarity between their waveforms, making them well suited for detection and identification through waveform cross-correlation techniques. Such techniques pose some challenges, however, such as building and maintaining template libraries. To mitigate this, we test the effectiveness of using deep learning to detect and characterize aftershocks by training several paired neural networks (PNN) using different training datasets. The first PNN models we test are built with seismic data constructed by adding high SNR, real signals with real background noise at various amplitudes; we also include some overlapping event signals and apply different filters to the constructed waveforms to make our training datasets more realistic. These models, based on constructed data, are then tested with data from two real aftershock sequences in Chile and Nepal that were built in a prior cross-correlation study and then validated by an expert analyst. We will investigate the viability and limitations of this method by exploring model transportability to other geographic regions, further tuning existing models, and training new models using real events.
We train deep learning models for seismic detection on 3-component stations of the International Monitoring System (IMS) based on the PhaseNet architecture and evaluate the results using the Unconstrained Global Event Bulletin (UGEB). Using 14 years of associated signals from the Late Event Bulletin (LEB), we auto-curate a training dataset consisting of signal windows containing associated arrivals and noise windows that contain no LEB associated signals. We construct five training datasets by varying the ratio of noise windows to signal windows and find that increasing the number of noise windows increases the precision from 0.15 to 0.4 while reducing the recall from 0.6 to 0.5. Using the SeisBench toolbox, we compare eight PhaseNet models trained on non-IMS data on the UGEB and show that the best SeisBench model achieved an F1 score of 0.24 versus 0.49 for our best IMS models. We qualitatively compare the PhaseNet response curves to the STA/LTA response for true positive detections, false positive detections generated from windows with associated signals, and false positive detections generated from windows with no associated signals. Finally, we find that the primary benefit of training with LEB data is not in detecting more signals, but rather in the suppression of noise detections.
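For reference, the STA/LTA characteristic function used as the comparison baseline above can be sketched in a few lines of Python; the window lengths below are illustrative, not those of any IDC configuration.

import numpy as np

def sta_lta(x, fs, sta_sec=1.0, lta_sec=30.0):
    """Classic STA/LTA characteristic function on a single trace."""
    e = np.asarray(x, dtype=float) ** 2
    sta_n, lta_n = int(sta_sec * fs), int(lta_sec * fs)
    csum = np.cumsum(np.insert(e, 0, 0.0))
    sta = (csum[sta_n:] - csum[:-sta_n]) / sta_n     # short-term average energy
    lta = (csum[lta_n:] - csum[:-lta_n]) / lta_n     # long-term average energy
    n = min(len(sta), len(lta))
    # Align both averages so their windows end on the same (latest) samples of the trace.
    return sta[-n:] / (lta[-n:] + 1e-20)

# A detection is typically declared where the ratio exceeds a chosen trigger threshold.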
We present results from our use and analysis of the most recent release of NDC in a Box (NIAB), which contains the NET-VISA associator integrated into SeisComp3 (SC3). This version allows for the configuration of non-International Monitoring System (IMS) stations using both the International Data Centre (IDC) DFX detector and NET-VISA. Non-IMS stations from the Australian network and from other regions of interest were integrated into the system and results are presented. We compared the Australian earthquake catalogue with the automatic bulletin produced by the NIAB system. Further, as NIAB now allows for the calculation of IDC magnitudes (mb_ave and mppln) in both automatic and interactive modes, these results are also compared with magnitudes in the IDC bulletin. A detailed review of some notable events detected automatically by the NIAB software within Australia and nearby regions is made, including the January 2022 eruption of the Hunga Tonga submarine volcano. Finally, this version of NIAB allows for the automatic creation of mixed events, i.e. events using seismic, infrasound and/or hydroacoustic detections. We present and review examples of these events. Overall it is shown that the current suite of software performs well and can be a reliable verification tool able to satisfy the requirements of both regional and local monitoring.
The Range-dependent Acoustic Model (RAM) is a widely used underwater acoustics modelling program that employs a 2-D parabolic equation method. The parabolic equation method is known to be accurate and reliable, and it has been used extensively in the underwater acoustics community. Even though the parabolic equation method requires fewer computational resources than many other methods, an efficient computational framework is still needed, especially for broadband simulations. Broadband parabolic equation simulations essentially implement the Fourier synthesis method to compute waveform time series from frequency spectra. They therefore require simulating a single environmental model for thousands of frequencies of a sound source. The first option to tackle this is the “embarrassingly parallel” methodology, which distributes small instances over many processors. However, the latest processing units, such as SIMD and GPGPU architectures, may not necessarily suit the embarrassingly parallel methodology for RAM-broadband. In the present study, we implement several methods to accelerate RAM-broadband simulations on the latest processing units and discuss their performance.
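A minimal Python sketch of the embarrassingly parallel baseline described above: one single-frequency PE solve per process, followed by Fourier synthesis to a time series. Here solve_pe_single_frequency is a hypothetical stub standing in for the RAM kernel, and the frequency band and sampling values are illustrative.

import numpy as np
from multiprocessing import Pool

def solve_pe_single_frequency(freq):
    """Placeholder for one range-dependent PE solve; returns the complex
    pressure at the receiver for this frequency (hypothetical stub)."""
    return np.exp(-2j * np.pi * freq * 0.1) / (1.0 + freq)

def broadband_synthesis(freqs, source_spectrum, duration, fs, workers=8):
    """Fourier synthesis of a broadband waveform from single-frequency PE runs."""
    with Pool(workers) as pool:
        transfer = np.array(pool.map(solve_pe_single_frequency, freqs))
    t = np.arange(0.0, duration, 1.0 / fs)
    # Inverse Fourier synthesis: sum of source-weighted single-frequency solutions.
    phases = np.exp(2j * np.pi * np.outer(t, freqs))
    return np.real(phases @ (source_spectrum * transfer))

if __name__ == "__main__":
    # Hypothetical usage: 1-30 Hz in 0.025 Hz steps (~1200 frequencies), flat source spectrum.
    freqs = np.arange(1.0, 30.0, 0.025)
    waveform = broadband_synthesis(freqs, np.ones_like(freqs), duration=20.0, fs=50.0)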
CTBTO observation and processing systems are required to be sensitive to low magnitude events. A promising way to increase system sensitivity and improve station tuning is to refine the receiver velocity models underneath International Monitoring System (IMS) stations by incorporating a number of ambient noise processing techniques into International Data Centre (IDC) practice. In particular, this approach should lead to a reduction of arrival-time residuals between predicted and observed onset times of seismic waves. A basis for this is the vast amount of seismic noise data acquired at the IDC over more than 20 years. We conducted a case study for the ARCES IMS array in northern Norway, which consists of 4 rings of 3C broadband shallow vault seismometers. In addition to building an averaged uppermost velocity model for ARCES, we demonstrate a trial application of ambient noise tomography methods for retrieving individual models at different flanks of the spatially distributed sensors comprising seismic arrays, as a generalized way to aggregate block velocity models. Examples for other geometries and regions are also provided. To enhance the CTBTO on-site inspection seismic aftershock monitoring system, the same approach can be applied by retrofitting velocity models produced from noise data collected by the temporary on-site inspection array.
This paper summarizes advances in the application of machine learning to seismic monitoring data processing. It then focuses on our work, including local event detection based on a multi-task convolutional neural network (CNN), a joint Generative Adversarial Network - Long Short-Term Memory (GAN-LSTM) network applied to seismic noise signal recognition, seismic phase sequence detection based on transformers, and seismic event association based on probabilistic models. Finally, development trends and potential challenges of machine learning applications are discussed.
Discriminating the event type from seismic waveforms is a vital part of verification work. In this research, 500 seismic events with magnitudes below 3.0 ML around Beijing were collected in three categories: natural earthquakes, blasts and mining collapses. More than 25 features and their ratio parameters were used to characterize the seismic waveforms; the features were extracted from the P and S waves in each trace, and the mean value of each characteristic over the event was calculated. SVM, Fisher and Bayesian probability methods were used for event type identification. It is found that the SVM method exhibits the highest identification accuracy, followed by the Fisher method, with the Bayesian method ranked third. In addition, the identification accuracy can be improved by adding the ratio parameters of P and S wave features. However, for the same feature data, the identification accuracy based on per-trace P and S wave features is lower than that based on event-mean features.
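The following sketch, based on synthetic features, illustrates the type of classifier comparison described above using scikit-learn; LinearDiscriminantAnalysis stands in for the Fisher method and GaussianNB for the Bayesian probability approach, and the feature construction (including the ratio-style features) is purely illustrative rather than the authors' feature set.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
n = 500
# Synthetic stand-ins for event-averaged waveform features for three classes:
# 0 = earthquake, 1 = blast, 2 = mining collapse
y = rng.integers(0, 3, n)
X = rng.normal(loc=y[:, None] * 0.8, scale=1.0, size=(n, 6))
# Append ratio-style features (ratios of paired columns), mimicking the
# P/S ratio parameters mentioned in the abstract
ratios = X[:, :3] / (np.abs(X[:, 3:]) + 1e-6)
X = np.hstack([X, ratios])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
models = {
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
    "Fisher (LDA)": make_pipeline(StandardScaler(), LinearDiscriminantAnalysis()),
    "Bayesian (GaussianNB)": make_pipeline(StandardScaler(), GaussianNB()),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: accuracy = {model.score(X_te, y_te):.2f}")
```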
This study reviews the performance of NET-VISA in operations (VSEL3) and the results of a full pipeline test of NET-VISA from 2021 in comparison with the SELs. NET-VISA performed better than the standard global associator, GA; however, several differences in how these algorithms form events were identified. These differences mean that NET-VISA applies fewer constraints in event formation than GA, creating some events that do not comply with the existing event definition criteria (EDC). Applying the EDC to the NET-VISA output reduces its performance by 5%. Application of the EDC affects the number of events created in the automatic bulletins, which are reviewed for inclusion in the Late Event Bulletin (LEB). LEB events must then comply with the EDC before entering the Reviewed Event Bulletin (REB). The proportion of LEB events not propagating to the REB increased from 4% in 2002 to 26% in 2022 and is correlated with the increase in the number of stations contributing to the bulletins. The EDC date from GSETT-3 and were designed to limit the load on analysts and automatic processing. Accumulated experience in processing and analysis over the last ten years provides sufficient data to revise the existing EDC and propose alternative approaches to producing products for verification purposes.
The objective of this study is to improve the modelling of underwater ambient noise below 100 Hz from local and distant wind. Historically, shipping is assumed to dominate ambient noise in this frequency band; however, the CTBTO hydroacoustic array off Crozet Island provides unique wind noise observations with minimal shipping interference. First, ambient noise is correlated with overhead wind speed through a simple frequency dependent power relation, and second, the distant wind contributions are modelled through a source density and propagation model. Wind-related noise is modelled as a layer of monopole sources located a quarter wavelength below the surface. The ambient noise (AN) and source level (SL) are related to wind speed (U) through power relations such that SL = A(f) + 10 n(f) log(U) and AN = B(f) + 10 n(f) log(U), where A, B and n are frequency dependent coefficients estimated from the acoustic data. An important observation is that the n parameter increases as frequency decreases and reaches a value above 7 at 10 Hz, which is much larger than the 3.5 often measured at 300 Hz. The source layer model accurately predicts the ambient noise within a standard deviation of 2.5 dB and is necessary when overhead wind speeds are low.
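As a worked illustration of the power relation above, the sketch below estimates the frequency dependent coefficients B(f) and n(f) of AN = B(f) + 10 n(f) log(U) by least squares at a single frequency, using synthetic wind speeds and noise levels; the numbers and variable names are assumptions chosen only to mimic the reported behaviour near 10 Hz.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic example: wind speeds (m/s) and ambient noise levels (dB) at one frequency
U = rng.uniform(3.0, 25.0, 500)
B_true, n_true = 40.0, 7.0                       # e.g. behaviour reported near 10 Hz
AN = B_true + 10 * n_true * np.log10(U) + rng.normal(0, 2.5, U.size)

# Least-squares fit of AN = B + 10 n log10(U): regress AN on 10*log10(U)
G = np.column_stack([np.ones_like(U), 10 * np.log10(U)])
coef, *_ = np.linalg.lstsq(G, AN, rcond=None)
B_hat, n_hat = coef
print(f"B = {B_hat:.1f} dB, n = {n_hat:.2f}")
```

In practice the same regression would be repeated frequency band by frequency band to obtain the frequency dependence of B and n.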
Seismic waves are emitted by events such as earthquakes, explosions, or other movements of material within the Earth. The International Monitoring System comprises a global network of seismic stations intended to identify and locate such events. Accurate estimation of the back-azimuth, or angle of arrival, of these seismic signals remains a key challenge. Over the last decades, many azimuth estimation algorithms have been proposed. Additionally, recent years have witnessed dramatic successes of machine learning algorithms. Hybrid model-based/data-driven methods, such as the newly proposed deep augmented MUSIC (DA-MUSIC) algorithm, combine the advantages of both worlds. In this work, we implement and test the performance of the DA-MUSIC method on seismic data. The estimator's performance is compared to that of: 1) broadband MUSIC; 2) classic MUSIC; 3) a maximum likelihood estimator; 4) a beamformer; and 5) a random decision. We use data recorded by the GERES array located in the Bavarian Forest, Germany, and data obtained by the MMAI array in Israel. We study different challenging scenarios corresponding to array mismatch, broad frequency ranges, and limited measurement windows. The results show that DA-MUSIC outperforms the existing methods and demonstrates versatility and robustness, allowing application to real seismic data.
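For context, the sketch below implements the classic (narrowband) MUSIC baseline mentioned above for a single plane wave recorded on a small array; the array geometry, frequency, slowness and noise level are illustrative assumptions, and the code is not the DA-MUSIC method itself.

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical array geometry (east, north offsets in km) and signal parameters
pos = np.array([[0.0, 0.0], [1.2, 0.3], [-0.8, 0.9], [0.4, -1.1], [-1.0, -0.6]])
f = 2.0          # narrowband frequency (Hz)
p = 0.08         # horizontal slowness (s/km), assumed known for this sketch
az_true = 140.0  # back-azimuth (degrees, clockwise from north)

def steering(az_deg):
    """Plane-wave steering vector for a given back-azimuth."""
    az = np.radians(az_deg)
    delay = -p * (pos[:, 0] * np.sin(az) + pos[:, 1] * np.cos(az))
    return np.exp(-2j * np.pi * f * delay)

# Simulate narrowband snapshots: one plane wave plus complex sensor noise
n_snap = 200
s = np.exp(2j * np.pi * rng.uniform(size=n_snap))          # random-phase source
X = np.outer(steering(az_true), s) + 0.2 * (
    rng.normal(size=(len(pos), n_snap)) + 1j * rng.normal(size=(len(pos), n_snap)))

# Classic MUSIC: noise subspace from the sample covariance matrix
R = X @ X.conj().T / n_snap
eigvals, eigvecs = np.linalg.eigh(R)        # eigenvalues in ascending order
En = eigvecs[:, :-1]                        # noise subspace (one source assumed)

az_grid = np.arange(0.0, 360.0, 0.5)
spectrum = np.array([1.0 / np.linalg.norm(En.conj().T @ steering(a)) ** 2
                     for a in az_grid])
print("Estimated back-azimuth:", az_grid[spectrum.argmax()], "deg")
```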
In seismic signal analysis it is crucial to distinguish between earthquakes and underground nuclear explosions, an important component of the work of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). Many methods have been used, such as complexity, spectral ratio, body wave and surface wave magnitudes (mb-Ms), and P and S corner frequencies. A data set of nuclear explosions and earthquakes with body wave magnitudes (mb) ranging between 4.5 and 6.5 in different regions around the world, such as China, India, Pakistan, North Korea and the USA, has been collected using broadband seismic stations from different networks, including the International Monitoring System network and the Incorporated Research Institutions for Seismology (IRIS). The purpose of this study is to apply various machine learning approaches based on the output features of the above methods. These approaches, including logistic regression, the K-neighbors classifier, the decision tree classifier, the random forest classifier, the voting classifier, the XGB classifier and Naive Bayes, are used to automatically discriminate between underground nuclear explosions and large earthquakes. The performance of the proposed discrimination algorithms was evaluated using receiver operating characteristic (ROC) curves and area under the ROC curve results.
All States Parties have convenient access to International Monitoring System (IMS) data, International Data Center (IDC) products, and the applications and scientific study programmes used in the IDC of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). As the Iraqi National Data Center, we took advantage of this access. In this study, research related to the analysis of seismic waves using digital signal processing methods and artificial intelligence was reviewed. Seismic wave data are digitally analysed using modern analysis and processing methods, together with classification algorithms, in order to speed up analysis once the event waveforms are received. We studied the Arabian Sea earthquake of 26 October 2022 at 23:00:07, which was detected by two IMS monitoring technologies as well as by non-IMS stations; the analysis carried out in this work integrates data from the IMS hydroacoustic station HA1 into the event location together with seismic data from stations near that event. A seismic event in Turkey on 23 November 2022 at 01:08:15, detected by IMS stations and by non-IMS stations available through IRIS, was also studied and evaluated.
The first step in most seismological studies is detecting seismic events. Because of attenuation, the interstation spacing of a seismic network plays a fundamental role in phase detection capability, and an insufficient density of seismic stations reduces that capability. The immediate solution to this problem is to increase the density of stations, but this is expensive to build and maintain. The cheaper solution is to improve the processing techniques to get more from the stations we already have. The waveform cross-correlation (matched-filter) technique is a powerful signal processing tool for detecting signals with low signal-to-noise ratio when the sources are known. In this study, matched filtering was performed on 95 days of continuous data from 13 temporary seismic stations installed by the IIEES to monitor the aftershock sequence of the August 2014 MORMORI-ILAM earthquake with a local magnitude of 6. The primary catalog consists of 1105 aftershocks identified by visual and conventional methods, of which 838 were selected as templates. The matched-filter technique detected 3575 aftershocks (4.27 times the reference catalog). The detections were classified into groups according to their reliability and correctness.
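A minimal sketch of the matched-filter idea described above, assuming a single-channel template and continuous data as NumPy arrays: the template is slid along the data, the normalized cross-correlation is computed, and detections are declared above a multiple of the median absolute deviation. The synthetic data, threshold and helper name are illustrative assumptions, not the study's actual workflow.

```python
import numpy as np

def normalized_xcorr(template, data):
    """Sliding normalized cross-correlation of a template against continuous data."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(data) - n + 1)
    for i in range(len(cc)):
        w = data[i:i + n]
        cc[i] = np.sum(t * (w - w.mean())) / (w.std() + 1e-12)
    return cc

rng = np.random.default_rng(3)
fs = 50.0
# 2 s tapered 5 Hz wavelet as a stand-in for an aftershock template
template = np.sin(2 * np.pi * 5 * np.arange(0, 2, 1 / fs)) * np.hanning(int(2 * fs))
data = rng.normal(0, 1.0, int(600 * fs))
for onset in (120.0, 407.5):                     # bury two scaled copies in the noise
    i = int(onset * fs)
    data[i:i + len(template)] += 3.0 * template

cc = normalized_xcorr(template, data)
# Declare detections where |CC| exceeds 8 times the median absolute deviation;
# samples clustered around each match would be grouped into one detection in practice
mad = np.median(np.abs(cc - np.median(cc)))
detections = np.where(np.abs(cc) > 8 * mad)[0] / fs
print("Candidate detection times (s):", detections)
```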
From 2000, when official bulletin production at the International Data Centre (IDC) of the Provisional Technical Secretariat (PTS) started, until 2012, comparisons of the Reviewed Event Bulletin (REB) with the published bulletin of the International Seismological Centre (ISC) were carried out. As numerous efforts have taken place since then, such as enhancing the automatic as well as the interactive bulletin production system, a fresh look at the performance of the REB with respect to the ISC bulletin for a recent year seems to be in order. Therefore, a comparison has been carried out for the most recent year of ISC bulletin availability, namely 2020. Preliminary results indicate that the performance has not changed significantly with respect to accuracy, again showing that 95 percent of common events are located within 1 degree of each other and 90 percent within half a degree. Furthermore, a significant drop in common IDC-ISC events occurred, from around 30 000 events per year to about 25 000 events per year. IDC-only events, not updated by the ISC, are now relatively less frequent than before.
This study presents a new method for moment tensor inversion based on the wavelet transform. The method consists of semi-automatic data preparation and source inversion implemented in Python. The procedure is similar to the point source technique in the time domain but is adapted to use wavelet coefficients. The advantage of our algorithm is that it tests the centroid moment tensor over the full frequency content using fewer samples than in the time domain. We show the accuracy and reliability of the method in recovering the complexity of earthquake sources using synthetic tests and real observed seismograms.
IS42 was certified in December 2010 and is one of the infrasound stations of the International Monitoring System (IMS) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). The station is installed on Graciosa Island (Azores archipelago, Portugal), in the North Atlantic Ocean. The layout of its eight array components, together with its geographical position, allows the detection of a wide range of infrasound signals produced by a variety of local, regional and far-field sources. Following a first study in which we aimed to compile major near- and far-field detections of explosive volcanic activity, we present a first compilation, currently in progress, of the IS42 infrasound detections from 2011 to 2020. This catalogue project aims to locate, characterize and categorize coherent infrasound sources, distinguishing environmental noise from other events of interest such as volcanic explosions, earthquakes, bolides, microbaroms or anthropogenic sources. We also present the detections obtained from events listed in the Standard Event Lists and in the Reviewed Event Bulletin of the International Data Center for the same period.
Seismic events are produced by different types of sources (e.g. local earthquakes, distant earthquakes, quarry blasts, nuclear explosions, volcanic activity, etc.). These events are detected by seismic network stations and stored. The first important task in seismic signal processing is to identify the source of each detected event. This task should be performed automatically because of the large amount of data recorded daily; in some cases, it needs to be achieved almost in real time in order to trigger an alarm. Various approaches have been proposed in the literature. Most of these approaches require extraction of event signal features, such as shape, length, frequency content and statistical moments (covariance, skewness and kurtosis). In this work, we propose a simple and straightforward classification method that does not require any feature extraction. The method is based on the cross-correlation function. Its application to a real seismic database of different classes shows that it can achieve good classification results.
PhaseNet and EQTransformer are two of the deep learning methods most commonly used for automatic earthquake detection and phase picking. This work presents the performance of these algorithms using two weeks of data retrieved from the Botswana Seismological Network stations. Additional data are obtained from International Monitoring System stations within a distance of 250 km from Botswana (e.g. Lobatse and Boshof). The results of PhaseNet and EQTransformer are recorded and compared with the results of the STA/LTA method. The accuracy, precision, recall and F1 score are calculated for each method. Preliminary results show that the deep learning algorithms outperform conventional phase-picking methods on noisy data and where events overlap. Events that may be ambiguous for traditional methods are often clear outliers within the DNN model domain, as observed in this study.
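A minimal sketch of this kind of comparison, assuming SeisBench and ObsPy are installed: a pretrained PhaseNet model (the "original" weight set is assumed to be available and requires a download) is run alongside ObsPy's classic STA/LTA on ObsPy's bundled example stream. The thresholds and window lengths are illustrative, and a real evaluation would use longer continuous data from the networks named above.

```python
import obspy
from obspy.signal.trigger import classic_sta_lta, trigger_onset
import seisbench.models as sbm

st = obspy.read()                       # ObsPy's bundled 3-component example stream
fs = st[0].stats.sampling_rate

# Deep learning picker: pretrained PhaseNet probability traces via SeisBench
model = sbm.PhaseNet.from_pretrained("original")   # assumed weight set; downloads on first use
annotations = model.annotate(st)                   # Stream of P/S/noise probability traces
p_prob = annotations.select(channel="*_P")
if p_prob:
    # Turn the P probability curve into triggers with an on/off threshold
    print("PhaseNet P triggers:", trigger_onset(p_prob[0].data, 0.3, 0.1))

# Conventional picker: classic STA/LTA (1 s / 10 s windows) on the vertical component
z = st.select(component="Z")[0]
cft = classic_sta_lta(z.data.astype(float), int(1 * fs), int(10 * fs))
print("STA/LTA triggers:", trigger_onset(cft, 3.5, 1.5))
```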
The International Monitoring System includes waveform sensor stations connected to a centralized processing system at the International Data Center. Recent tools at the IDC, such as NET-VISA, use Bayesian analysis to detect and localize seismic events. While the Bayesian approach to seismic monitoring can improve significantly on the performance of classical systems, it requires prior probability distributions and still relies on expert judgment, especially to incorporate environmental knowledge or attenuation. Inspired by recent work applying graph neural networks (GNNs) to graph-structured data, we developed an adaptive GNN for estimating earthquake locations and magnitudes. Our GNN uses a combination of autoencoders, convolutional neural networks, and an adaptive graph learning method. The proposed GNN is evaluated using CEA's seismic data sets, where the goal is to predict the epicentral latitude/longitude, hypocentral depth and magnitude of each event. Our findings demonstrate that our algorithm not only improves the accuracy of yield estimation, with an average MSE reduction of 10% compared to graph-agnostic methods, but also effectively exploits the hidden correlations between nodes.
The CTBTO releases the Standard Event Lists (SELs) with information about the location, magnitude, time and depth of events identified by the automatic analysis of waveform data (seismic, hydroacoustic and infrasound). This automatically generated list of events can be evaluated using machine learning methods. Machine learning models work in two phases: training and testing. The training phase uses a data set (released by the CTBTO) with events already classified into two categories, "TRUE" and "FALSE", and exploits the features and patterns present in the data set. In the testing phase, the trained machine learning models are used to predict/classify newly released SEL events. Several machine learning models were created using different algorithms. The accuracy and efficiency of classifying SEL events were evaluated using k-nearest neighbours, support vector machines, decision tree learning, Naïve Bayes, random forests and linear regression.
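The sketch below illustrates the kind of model comparison described, using scikit-learn with five of the listed algorithms and a synthetic stand-in for the labelled SEL attributes (the linear regression baseline is omitted for brevity); the features, class balance and scoring choices are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for SEL event attributes (e.g. magnitude, depth, number of
# defining phases) labelled TRUE/FALSE; the real training set is the labelled SEL data
X, y = make_classification(n_samples=2000, n_features=8, n_informative=5,
                           weights=[0.7, 0.3], random_state=0)

models = {
    "k-nearest neighbours": KNeighborsClassifier(),
    "support vector machine": SVC(),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "random forest": RandomForestClassifier(random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name}: {acc.mean():.3f} +/- {acc.std():.3f}")
```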
Seismic signal detection and phase arrival picking were initially carried out manually by qualified analysts. The introduction of large digital seismic monitoring networks has made automatic detection and picking necessary. These tasks are extremely important, not only because an earthquake or a nuclear explosion must be detected and located automatically, but also to optimize the required storage. Moreover, automatic picking can considerably reduce analyst effort and make picking faster and more objective. The need for automatic seismic signal detection and picking algorithms has led many researchers to investigate various techniques, ranging from simple to sophisticated procedures. Each procedure has advantages and disadvantages, and the choice of an appropriate algorithm depends on the performance required and the type of expected signal (repeating sources, low/high SNR, emergent, impulsive). The aim of this study is to discuss the most popular and frequently employed automatic detection and picking algorithms.
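As an example of one of the most widely used of these algorithms, the sketch below runs a recursive STA/LTA trigger with ObsPy on its bundled example recording; the window lengths and trigger thresholds are illustrative assumptions rather than recommended settings.

```python
import obspy
from obspy.signal.trigger import recursive_sta_lta, trigger_onset

# ObsPy's bundled example recording of a local event (100 Hz, ~30 s)
tr = obspy.read()[0]
fs = tr.stats.sampling_rate

# Recursive STA/LTA: ratio of a 0.5 s short-term to a 10 s long-term average
cft = recursive_sta_lta(tr.data.astype(float), int(0.5 * fs), int(10 * fs))

# Declare a trigger when the ratio exceeds 3.5 and close it when it drops below 1.0
for on, off in trigger_onset(cft, 3.5, 1.0):
    print("trigger on:", tr.stats.starttime + on / fs,
          "off:", tr.stats.starttime + off / fs)
```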
NET-VISA is a physics-based generative model of global-scale seismology. The model describes the generation of events, including underwater and atmospheric events; the propagation of waveform energy from the events in multiple phases; the detection or mis-detection of these phases at the network of stations maintained by the International Monitoring System; and the noise processes at these stations. The model and its associated inference algorithm have been deployed by the International Data Center to generate a bulletin of events known as VSEL3. We recently introduced the ability in NET-VISA to estimate the origin error by using the model directly rather than by relying on other software such as Libloc (a derivative of LocSat). In this work we show that by injecting noise into the model training we obtain error ellipses that are much more representative of the location uncertainty. We have evaluated our work on a ground truth data set to confirm that the uncertainty estimates are much better calibrated than those of the existing Libloc software as well as of the NET-VISA model without noise injection.
NET-VISA is a physics-based generative model of global-scale seismology. The model describes the generation of events, including underwater and atmospheric events; the propagation of waveform energy from the events in multiple phases; the detection or mis-detection of these phases at the network of stations maintained by the International Monitoring System; and the noise processes at these stations. The model and its associated inference algorithm have been deployed by the International Data Center to generate a bulletin of events known as VSEL3. In this work we study the effect of introducing the Event Definition Criteria (EDC) and a uniform location prior in the NET-VISA model. The introduction of the EDC causes the bulletin to contain far fewer events, even fewer than the current SEL3 bulletin, while having higher overlap with the LEB and REB; however, the overlap is lower than that of the current NET-VISA model. Introducing a uniform location prior causes the NET-VISA bulletin to contain many more events and to have higher overlap with the LEB and REB, but at the cost of a large number of spurious events. Introducing both the EDC and the uniform location prior produces results that are similar to, but worse than, those of the current NET-VISA model.
An effective response to the threats and challenges of the 21st century can only be achieved through the coordinated efforts of the international community, through the UN and international law. Although the role and significance of the CTBT cannot be overestimated, this role has been hampered by the Treaty's inability to enter into force, because ratification by eight Annex 2 States is still required; of these, three States have yet to sign the Treaty. Current conflicts could jeopardize all historical gains, resulting in another nuclear arms race. Several efforts have been made to get these Annex 2 States to sign and/or ratify the Treaty. Some efforts proceed through diplomatic means and lobbying, but these seem not to be very effective. This work reviews some of these efforts and challenges and proposes effective solutions through engagement of all stakeholders in the affected states, including think tanks, NGOs and government officials. I propose that a "Conference on the Entry into Force of the CTBT" be organized bi-annually until the CTBT enters into force, with the venue rotating among the remaining eight Annex 2 States in order to raise awareness and directly engage all stakeholders.
Nuclear justice associated with the humanitarian consequences of nuclear explosions has become a prominent issue in the broader nuclear weapons discussion since the 2010 NPT Review Conference when States Parties expressed their “deep concern at the catastrophic humanitarian consequences of any use of nuclear weapons." Notably, the term “nuclear justice” within this work exclusively refers to the various forms of redress of communities affected by nuclear testing as well as environmental remediation. Non-nuclear weapon States have formed the Humanitarian Initiative, the latest joint statement of which has been supported by 159 States. Moreover, built upon the Humanitarian Initiative, the Treaty on the Prohibition of Nuclear Weapons (TPNW) entered into force in 2021. Nevertheless, other mechanisms exist in international and national law and practice forming a broader architecture of response to such humanitarian crises, and CTBTO is part of this architecture. In the research, I will explore how effective the existing instruments are and which have been successful in the cases of nuclear testing in the Marshall Islands and Semipalatinsk. The goals of this research are to identify how these cases have been addressed; what can be learned from the previous practices; what else can be done considering new emerging legal norms and what role CTBT(O) can play in this process.
Studies on CTBT universalization may benefit from understanding the social dimension of treaty ratification beyond techno-scientific endeavors. Within the emerging academic debate, the role of the epistemic community, understood as a network of socially recognized experts in a highly technical issue with the authority to translate scientific knowledge and social practice into certain policy outcomes, is worth considering thoroughly. This study aims at: (1) mapping knowledge production relationships involving CTBTO-affiliated epistemic communities and other domestic constituents (wider publics, groups and organizations) in ratifying countries; and (2) examining the context in which such relationships foster social learning and strengthen interest in ratification by the Annex II governments. Bringing in knowledge co-production as an analytical framework aligned with the above objectives, this study examines how epistemic communities in selected States Parties condition the wider cognitive and social bases of policy knowledge conducive to treaty ratification. A qualitative analysis of knowledge co-production identifies the epistemic community's roles in knowledge framing, practice, accumulation and dissemination, as well as in policy and public engagement. This study argues that as the CTBT becomes a global knowledge undertaking, the role of the epistemic community has extended from merely providing techno-scientific bases of policy certainty to navigating the broader sociological interactions through science-policy-public interfaces.
Indonesia's active involvement in nuclear politics can be traced back to the preparatory committee for the establishment of the International Atomic Energy Agency (IAEA). During its first decade of independence, the Government of Indonesia (GoI) elaborated its non-aligned international politics by becoming a member of the IAEA notwithstanding the high cost of membership. Supporting the IAEA mission of accelerating the contribution of nuclear technology to peace, health and prosperity in the world, the GoI also established a national atomic energy body in 1958, a daring decision at the time. Despite the slow pace of nuclear energy development at the national level, the GoI has shown its commitment to creating a world without nuclear weapons by ratifying the Comprehensive Nuclear-Test-Ban Treaty (CTBT) in 2011, although the Treaty has not yet entered into force. As the energy crisis has become a problem of global scale, the slow uptake of nuclear power plants has created a dilemma for energy resilience and sustainability in Indonesia. This paper aims to analyse the legal rationale behind Indonesia's ratification of the CTBT and the socioeconomic challenges involved. Further, the paper explores the geopolitical perspective of Indonesia's commitment to the CTBT and the best available strategy to harmonize this commitment with expediency.
In this continuously globalizing world, with its vulnerability to the deployment of nuclear weapons in conflicts, the testing of nuclear devices appears to have assumed a redefined aim far beyond the traditional conceptualization. This new function can be seen in the context of recent nuclear device tests by the Democratic People's Republic of Korea (DPRK). Nuclear device test scholarship has been a major source of critique, theoretical intervention and analytical insight for disciplinary and trans-disciplinary knowledge. These accomplishments have relied on activist research. The paper proposes to explore the knowledge production and dissemination/awareness initiatives of transnational/national research institutes, the media and civil society with respect to nuclear device testing, as well as diverse 'global' perceptions of such crises internationally. Central is the question of how "testers" view their respective publics and, in turn, how such publics view, make sense of and consume what is conveyed to them through nuclear testing.
E-poster session with display of each e-poster on an assigned touchscreen
The role of metrology in improving confidence in IMS measurements
Description: The purpose of this workshop event is to disseminate outcomes from the Infra-AUV project relating to the value of robust calibration practices in sensor networks and the associated practical aspects for maintaining sensor systems. The project will be producing a recommended good practice guide for establishing traceability through the on-site calibration of sensor systems as one of its outputs. The contents of this guide will be featured in this Workshop.
Engage with users of SHI NDC-in-a-Box to discuss recent developments and the future direction of development of all major components: Geotool, SeisComP and DTK.
Side event open to all participants.
We model the atmospheric transport of cosmogenic Beryllium-7 within the time period 1950-2100, to shed light on the complex interplay of atmospheric dynamics, changing atmospheric background conditions, deposition mechanisms and solar activity. Due to the short half-life of Beryllium-7 (around 53 days), the ground level concentration of this cosmogenic isotope is particularly suitable to be used as a proxy for vertical atmospheric transport, e.g. the Beryllium-7 transport in the tropospheric Hadley-Ferrel convergence zone has been suggested to be useful for the forecast of large weather phenomena, such as a monsoon (Terzi et al., 2019). Our modelling approach with the chemistry-climate model (CCM) EMAC (ECHAM/MESSy Atmospheric Chemistry) incorporates the production of Beryllium-7 from galactic cosmic rays (GCR), full three-dimensional atmospheric dynamics from the mesosphere down to the Earth’s surface, as well as different deposition mechanisms such as dry deposition, wet deposition and sedimentation. The results of our simulations will be compared to data on the ground level concentration of Beryllium-7 from the Comprehensive Nuclear-Test-Ban Treaty Organization radionuclide monitoring network to evaluate our simulations for different latitudes and longitudes and give deeper insight into the atmospheric dynamics preceding the data signals near the ground.
Simulations of atmospheric transport and dispersion have been demonstrated to benefit from a multiscale modeling approach that resolves both mesoscale meteorology, such as frontal passages, and microscale meteorology near the plume source, which can be heavily influenced by complex (i.e. mountainous) terrain. The atmospheric modeling community has yet to settle on recommended best practices for configuring multiscale models, because of the inherent need for a comprehensive and varied suite of observations associated with accompanying modeling studies. The recent METEX21 observational field campaign included the controlled generation and monitoring of plumes in a region of complex terrain, including spatiotemporally dense observations of meteorology within the atmospheric boundary layer. Analysis of multiscale simulations of transport and dispersion during METEX21 is an important step towards developing best practices for future multiscale modeling studies, evaluating recent model developments, and improving the accuracy of transport and dispersion simulations over complex terrain.
There are two CTBT infrasound stations in Chilean territory, I13CL and I14CL, located on Easter Island and the Juan Fernandez Archipelago, respectively. During the COVID-19 pandemic, a health alert was declared by the government to reduce the movement of people in Chile and thus avoid the spread of the virus to the islands. As a result, a reduction in the local noise levels detected at these stations during this period can be observed. One of the main causes of this difference is a significant reduction in airport activity, particularly on Easter Island. The island was closed to tourism and entry was therefore restricted; supply flights were authorized for the delivery of basic goods. The situation on Juan Fernandez Island was different: flights to the island were more frequent than to Easter Island, so many more signals and more noise are detected at I14CL than at I13CL. By obtaining some of the flight records from the airlines, it is possible to compare the dates of these flights with the noise indices of the years before and during the pandemic and to visualize the difference in the impact of tourism on the data of this technology in both cases.
Observational data on atmospheric pressure variations at the land surface have been obtained from a network of four microbarographs located in the Moscow region and processed. The analysis of these data has made it possible to determine the characteristics (coherence, azimuths, and propagation velocities) of the main arrivals of acoustic gravity waves from atmospheric storms within a wavelength range of a few to hundreds of kilometers. A clear tendency is seen for the amplitudes of pressure jumps before atmospheric storms to increase with the amplitudes of the pressure variations in the wave precursors of the front arrival.
The Earth is continuously under the influence of solar radiation, which interacts with the Earth's magnetic field, causing various effects. One of them is the warming of the planet, keeping it suitable for the development and maintenance of life. Another is the bombardment of the Earth by plasma from solar storms, against which the magnetic field acts as a protective shield that deflects this radiation. However, in the polar regions the penetration of storm products generates the aurora borealis in the northern hemisphere and the aurora australis in the southern hemisphere, which can be recorded by infrasound stations of the International Monitoring System (IMS) for nuclear tests. From this perspective, the South American region stands out because of the presence of the South Atlantic Magnetic Anomaly, where the magnetic field is less intense, making this region more susceptible to the effects of geomagnetic storms. In this research, we study the relationship between geomagnetic storm observation data and infrasound signals detected by the IMS stations located in South America.
A series of atmospheric transport experiments is being conducted to collect tracer data that will allow the refinement of meteorological models in complex terrain at short distances. Radiotracers were used to measure complex terrain flow features that influence diffusion and transport. Data were collected using an array of 22 real time radiation sensors dispersed over a 5 km region. Each sensor consists of a 5x10x40 cm NaI(Tl) crystal attached to a photomultiplier tube, with pulse height information collected by a digital tube base. The sensors are protected from the environment by a case and mounted vertically on a tripod approximately one meter off the ground. Each sensor is co-located with a separate environmental enclosure containing a data acquisition computer, a cellular modem for communications, and a battery for power. The sensors are synchronized to UTC via a time server, and gamma ray spectra are collected for 20 seconds and then transmitted via the cellular network for cloud-based storage. A real time display has also been written that shows the total counts and the counts in a region of interest for each sensor. The system has recently been field tested and preliminary results will be shown.
Wet scavenging is a vital process in atmospheric transport modelling for determining the distribution of masses. Therefore, using precipitation fields from European Centre for Medium-Range Weather Forecasts (ECMWF) data sets, which represent a temporal integral rather than a point value in time like all other parameters in the Lagrangian dispersion model FLEXPART, distorts the results by smoothing or shifting precipitation into dry periods. A new disaggregation scheme preparing precipitation rates as point values has already been implemented in the pre-processor tool flex_extract v7.1.2, which prepares ECMWF data for use in FLEXPART. Consistency, continuity and mass conservation of precipitation within each time interval are now secured. In combination with the newly added temporal interpolation of all scavenging-related meteorological fields in FLEXPART, the results show a substantial improvement in multiple case studies. The first case, with a high resolution output grid, demonstrates that the artificial checkerboard and banded structures present in the output of previous algorithms have disappeared. Further evaluations show the improvement of wet deposition results by comparison against measurements and previous case studies, such as the lifetime analysis of aerosol particles and the transport of mineral dust and black carbon.
Atmospheric transport modelling (ATM) tracks the movement of substances released into the atmosphere. The dispersion of freshly emitted releases takes time to evolve, and the initial transport pathway depends strongly on surrounding conditions, such as wind patterns. Global ATM with reasonably coarse resolution can usually reflect synoptic patterns, such as frontal systems. However, smaller meteorological scales become even more important over complex topography, which has its own local wind patterns superimposed on the large scale winds. Small scale features like convective vertical transport or transport by land-sea breezes are not fully resolved, and mountains are smoothed on coarser grids, causing pathways to be distorted. Transport pathways of emissions from source locations in complex terrain are, among other things, strongly influenced by such small scale features. Therefore, global ATM simulations might not always be able to estimate sufficiently accurate activity concentrations at International Monitoring System (IMS) stations in the proximity of these source locations for confirming whether a radioisotope is above or below the detection limit. This presentation uses high resolution ATM to investigate the consistency between ATM simulations and IMS measurements for such cases. The results demonstrate the higher quality of high resolution ATM and show how the accuracy of transport pathways increases with higher resolution meteorological input.
In recent years, high altitude floating platforms with a microbarometer payload have been utilized for infrasound detection and source characterization. The stratospheric locale is presumed to be less noisy, thus facilitating better signal detection compared to ground based sensors. High altitude sensing platforms are also considered the future of space exploration for extraterrestrial worlds with harsh atmospheres and no solid surfaces. A high altitude balloon carrying a sensor payload was launched in the early morning of 10 July 2020, with the aim of capturing infrasound generated by a series of three controlled ground explosion experiments carried out in New Mexico, USA. All three events were detected. During the first two events, the balloon was in close proximity to the explosion epicenter (<50 km), well within the acoustic zone. However, at the onset of the airwave arrival from the third event, the balloon was at the edge of the acoustic shadow zone, where no signal detection is predicted. Gravity wave induced small scale structures in the atmosphere are known to have a notable influence on infrasound propagation. We discuss this effect in the context of high altitude infrasound sensing and event characterization at regional distances. SNL is managed and operated by NTESS under DOE NNSA contract DE-NA0003525.
The Lagrangian atmospheric transport model FLEXPART is employed to investigate a broad spectrum of applications, including the transport of radioactive material emitted by nuclear events. Since its inception in 1998, FLEXPART has undergone many changes, with its last official release (version 10.4) published in 2019. At the same time, numerous versions have been developed across institutions to cater for specific needs. To make it easier to modify FLEXPART, while not having to diverge from the main version and its updates, we introduce a more modular organization of the source code. Instead of the distributed-memory parallelization with MPI used in version 10, OpenMP parallelization is now applied where possible, resulting in reasonable scaling behaviour; for example, a run with a single noble gas release of one million computational particles scales almost perfectly up to 32 cores. Accuracy is improved by advecting particles with the mean wind on the original model levels, thus avoiding additional interpolations. This results in a reduction of absolute transport conservation errors of potential vorticity in the stratosphere by ~0.3% per time step. Finally, a range of new user features has been added, e.g. writing selected particle properties to NetCDF files.
Beryllium-10, as measured in polar ice cores with annual resolution, is a proxy for long term cosmic ray variability. Beryllium-10 concentrations cannot be used directly as a proxy for solar behavior, since the signal can be distorted by atmospheric transport and deposition processes. Data on atmospheric Be-10 concentrations are rather scarce because of the laborious nature of Be-10 measurements. Beryllium-7 is also a cosmogenic isotope, with a half-life of 54 days, and is commonly observed in particulate airborne radioactivity monitoring. The transport of Be-7 can be modelled with high accuracy using known meteorological fields. This study used the chemistry-climate model Solar-Climate Ozone Links, coupled with the second versions of the aerosol transport model and the atmospheric beryllium production model (CCM SOCOL-AERv2-Be), to model the formation, transport and removal of beryllium isotopes in the atmosphere. The model results were compared with concentrations measured in Finland, Canada, Chile and on Kerguelen Island in the Indian Ocean from 2002 to 2008. The data for Kerguelen Island and Chile were downloaded from the CTBTO vDEC database. The modelled and measured concentrations were in good agreement at all four locations, demonstrating the validity of the CCM SOCOL-AERv2-Be model.
Atmospheric transport models (ATMs) are used to model the transport of radionuclides, both to determine the origins of unknown releases and to model background concentrations from known sources. To do this, ATMs rely on meteorological information from four-dimensional numerical weather prediction (NWP) models. However, the chaotic nature of the atmosphere means that the meteorological information provided by these models is uncertain. Therefore, meteorological experts are increasingly running ensemble NWPs to produce probabilistic weather forecasts. A number of studies have been carried out coupling ensemble NWPs with ATMs, but to date few centres produce ensemble ATM output on an operational basis (i.e. on demand in response to incidents). This work will present an approach to using ensemble NWP data, together with stack measurements and observations from Qb sensors of the Xenon Environmental Monitoring at Hartlepool (XENAH) collaboration, to demonstrate both the benefits and the challenges of coupling ensemble NWPs with ATMs in an operational setting.
Infrasound technology has been employed to investigate acoustic signatures of severe weather events in the past and this study aims at characterizing, for the first time, the infrasound detections that can be related to Mediterranean hurricanes (Medicanes). These mesocyclones pose a serious threat to coastal infrastructures and lives because of their strong winds and intense rainfalls. This work contributes to infrasound source discrimination efforts in the context of the Comprehensive Nuclear-Test-Ban Treaty. We use data from the infrasound station IS48 of the International Monitoring System, in Tunisia, to investigate infrasound signatures of medicanes using a multichannel correlation algorithm. We corroborate the detections by considering satellite observations, a surface lightning detection network, and products mapping the simulated intensity of the swell. Detections are evidenced at distances ranging between 250 and 1100 km, between 0.1 Hz and 8 Hz. Deep convective systems, and mostly lightning within those, seem to be the main source of detections above 1 Hz. Hotspots of swell (microbarom) related to the medicanes are evidenced between 0.1 and 0.5 Hz. Multisource situations are highlighted, stressing the need for more resilient detection estimation algorithms.
Infrasound propagation is mainly driven by the seasonal changes in stratospheric winds. However, small scale perturbations like gravity waves also affect the detection capability of the infrasound station network of the International Monitoring System. Atmospheric model simulations explicitly resolving gravity waves are used to investigate the effect of these perturbations on infrasound transmission losses by means of parabolic equation simulations. We use high resolution atmospheric specification fields obtained in the framework of the Dynamics of the Atmospheric General Circulation Modeled on Nonhydrostatic Domains (DYAMOND) initiative. DYAMOND is an international project, initiated by the Max Planck Institute for Meteorology and the University of Tokyo, that provides a framework for the intercomparison of high resolution global models. It mainly focuses on the troposphere, but some models extend well into the stratosphere. Lidar observations at the Observatoire de Haute Provence (France) are used to validate the model. By filtering out small scale perturbations (gravity waves) in the atmospheric specifications and comparing parabolic equation simulations with and without gravity waves, we quantify the impact of gravity waves and discuss how it is related to the gravity wave activity (energy) and the mean atmospheric waveguide across the International Monitoring System.
The shape of radioactive particles varies significantly, 'from compact small-sized crystalline single particles to large amorphous aggregates' (Salbu and Lind, 2005). However, many atmospheric transport models assume perfect spheres. Since non-spherical particles experience a larger drag in the atmosphere, i.e. reduced gravitational settling, atmospheric residence times and transport distances will be underestimated by these models. In this study, we present a new settling scheme in the Lagrangian dispersion model FLEXPART, which considers particles of different non-spherical shapes and orientations. The scheme is based on the drag coefficient prediction model of Bagheri and Bonadonna (2016) and was tested experimentally by printing particles of various shapes in the size range 50-300 µm (volume equivalent diameter), which were then released in a settling column to determine their settling velocities. We show that the shape correction can substantially extend the atmospheric lifetime of non-spherical particles compared to spheres. The new version of FLEXPART makes it possible to prescribe the particle geometry and the falling orientation. These options reduce the uncertainty regarding particle shape and allow more accurate model simulations of atmospheric concentrations and deposition patterns after potential weapons tests and/or nuclear accidents.
The use of aerosol optical depth (AOD) has been proven as an alternative to traditional ground level air quality monitoring in many countries across the world. This study, based on MERRA-2 data, therefore aims: (i) to characterize the spatiotemporal and component variations of aerosols in the atmosphere over the capital cities (Luanda, Sumbe, Benguela, Huambo and Lubango) of the five most densely populated provinces of Angola from 2010 to 2020; and (ii) to assess the influence of emissions from the Nyamuragira volcano (Democratic Republic of the Congo) on air quality in the five cities. The most significant contribution to the total AOD came from organic carbon in all the cities, with the highest values (0.19 - 0.23) in Luanda. Sulphate ranges were higher in the coastal cities than in the interior cities, in line with the emissions inventory data. The HYSPLIT model showed that air masses from Nyamuragira at various heights reached Luanda and Sumbe in November 2011, and CALIPSO data confirmed the presence of volcanic aerosols in the same period. This study concludes that the variability of AOD loading depends on season and region, thereby providing additional information on the matter.
Vertical profiles of particulate aerosols and meteorological fields were obtained before, during, and after the invasion of regional pollutants over the mid-hill Trishuli and Sunkoshi River valleys that encroach deep into the Central Nepal Himalayas. An unmanned aerial vehicle integrating particulate aerosol and meteorological sensors was deployed for this purpose. Probing up to 500 m above the ground, more than 350 such vertical profiles were obtained continuously at 45 minute intervals from 23 March to 6 April 2022. During this period, the Kathmandu Valley, lying between the two valleys, witnessed an air pollution episode. In this paper, we present the observed meteorological and particulate profiles over the valleys and discuss the transport and source reconstruction of the regionally invading particulates over the mid-hills of the Central Nepal Himalayan region.
Microbarometer networks are in place for the detection of atmospheric infrasound waves and the verification of the CTBT. The presence of turbulence and other wind-induced effects is considered a nuisance, and wind noise filters are therefore typically in place for suppression. In this study, we establish a relation between microbarometer observations and in situ turbulence measurements at the Cabauw Atmospheric Research Site in The Netherlands. Using this relation, we compare noise levels from the Dutch microbarometer network to turbulent pressure predictions from the high resolution HARMONIE weather model. This approach has two foreseen applications: (1) modelled turbulence fields could possibly help in identifying regions that are most appropriate for infrasound monitoring, and (2) microbarometer observations could possibly be of use in further refining sub-grid scale turbulence schemes in weather models.
Explosive volcanic eruptions produce powerful infrasound that can propagate thousands of kilometers in atmospheric waveguides. The International Monitoring System (IMS) infrasound network has now captured numerous acoustic signals from explosive volcanic eruptions. Remote infrasound has proven useful for locating and characterizing subaerial volcanic source parameters (e.g. eruption chronology and timing), but signals are subject to spatiotemporal atmospheric variability and detectability. We developed a methodology that aims to improve rapid automated global source detection, location, and characterization with a first-order approach based on empirical climatologies (HWM14/NRLMSIS 2.0) and 3-D ray tracing (infraGA). Using a brute-force computational approach, we tabulate corrections that represent predictions of the atmospheric effects (e.g. azimuth deviation) on the propagation ray paths of volcanic signals to the IMS infrasound stations. We test our methodology on the energetic eruptions of Puyehue-Cordón Caulle in 2011 (Chile) and Calbuco in 2015 (Chile), as well as the smaller, nearly continuous eruptions of Mount Michael (South Sandwich Islands) and the most active volcanoes of the Vanuatu Archipelago. We obtain source location improvements for individual events and enable improved signal identification of repetitive volcanic infrasound with year-long azimuth deviation predictions. Our methodology could easily be extended to all IMS infrasound stations for near real time monitoring.
In February 2021, the Ukrainian small aperture infrasound array began recording in the Antarctic Peninsula area. This is a good step towards ensuring the implementation of the CTBT with national resources. However, only at the beginning of 2022 was it possible to organize continuous satellite transmission of the data in real time. This made it possible to quickly track both regional events that took place in the area of Vernadsky station and global ones, such as the eruption of the Hunga volcano, Tonga. The infrasound station has become an excellent addition to the set of multi-profile geophysical equipment available at the Vernadsky Antarctic station, expanding the range of use of the entire complex. It should be noted that the planned installation of the International Monitoring System infrasound array near the Antarctic Peninsula (Palmer station) has not yet taken place. Therefore, a year of observations from the infrasound array at Vernadsky station can also be useful for understanding the conditions for recording infrasound in the region.
This paper showcases the design and construction of a bespoke readout circuit that reads out a unique imaging system used to assay nuclear materials and hidden neutron- and gamma-emitting radionuclides. The system uses neutron and Compton scattering simultaneously within three layers of detectors, each backed with an 8x8 silicon photomultiplier (SiPM) array. Each pixel in the SiPM array is read individually, giving a total of 192 channels that require a dedicated circuit to read out and produce a meaningful signal. The work on the readout circuit has successfully produced an imaging system that offers a fast scan time of 60 seconds and a data processing time of less than 60 seconds. The device has been tested with 300 kBq Cs-137 gamma sources and a 1 MBq Cf-252 neutron source in close proximity. Experiments conducted in the Faculty of Science and Technology at Lancaster University (UK) indicated that the system can detect and localize both gamma ray and neutron sources with intrinsic efficiencies on the order of 10^-4. Further upgrading of the readout circuit is being carried out as joint research with the Physics Department at Sultan Qaboos University.
Based on lessons learned during On-site Inspection (OSI) workshops, training courses and exercises, especially the Integrated Field Exercises (IFE), remotely controlled ground-platform-based geophysical detection systems would be of significance both for future real OSI activities and for current training courses, owing to their practical applicability under hazardous conditions. This work puts forward a remotely controlled ground penetrating radar (GPR) robot based on the commercial off-the-shelf GPR system demonstrated during SnT2021. The remotely controlled platform applies integrated inertial/GNSS/high-precision RTK positioning and navigation to locate the GPR in real time. At the same time, advanced motion control algorithms have been developed to achieve automatic movement of the platform once the scope and spacing of detection have been input into the control system. Meanwhile, 2-D and 3-D GPR data acquisition can be achieved and integrated with the RTK positioning data. The 2-D, 3-D and horizontal scanning data display modes can increase target recognition accuracy and reduce the false alarm and missed target rates. This system has been tested and verified in the field and can be suitable for the development of concepts of operation during future IFEs and OSI training courses.
The Transient ElectroMagnetic Method (TEM) technique is one of the geophysical techniques used in on-site inspection. This work focuses on the mathematical and numerical modeling of this technique.
The numerical simulation makes it possible to predict the response of the sensor. By treating several underground configurations, the expert can acquire valuable interpretation experience, which is very difficult to obtain in a real situation.
This work models the TEM system, which consists of a single dual-function (transmitting and receiving) sensor. The method supplies a conductive loop with direct electric current; once the current is established, it is abruptly switched off. Currents are thereby induced in the stratified underground medium, and the voltage produced by these induced currents provides information on that medium. To model the system, we used the finite element method. The stratified media were characterized by measuring the response of the sensor after cancellation of the current pulse, and an inversion model using an optimization method based on genetic algorithms is then presented.
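To illustrate the inversion step, the sketch below fits two layer resistivities by minimizing the misfit between an observed decay curve and a forward model; the forward model here is a crude placeholder (a sum of resistivity-dependent exponential decays), not the finite element solution described above, and SciPy's differential evolution is used as a readily available evolutionary optimizer standing in for the genetic algorithm.

```python
import numpy as np
from scipy.optimize import differential_evolution

t = np.logspace(-5, -2, 40)                      # decay times (s)

def forward(params, t):
    """Placeholder forward model: decay voltage as a sum of exponentials whose
    amplitudes and time constants depend on two layer resistivities (ohm.m)."""
    rho1, rho2 = params
    return 1.0 / rho1 * np.exp(-t * rho1 * 50.0) + 0.5 / rho2 * np.exp(-t * rho2 * 5.0)

# Synthetic "observed" decay curve for a known model, with 2% measurement noise
rng = np.random.default_rng(4)
true_model = (30.0, 120.0)
v_obs = forward(true_model, t) * (1 + 0.02 * rng.normal(size=t.size))

def misfit(params):
    # Log-domain misfit so that early and late decay times contribute comparably
    v_mod = forward(params, t)
    return np.sum((np.log(v_mod + 1e-30) - np.log(v_obs)) ** 2)

result = differential_evolution(misfit, bounds=[(1.0, 500.0), (1.0, 500.0)], seed=0)
print("Best-fitting resistivities (ohm.m):", result.x)
```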
Satellite imagery is a powerful tool to monitor compliance of States Parties with the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Satellite imagery can be used by States Parties as one of the national technical means of CTBT verification. Moderate to high spatial resolution satellite imagery is useful for investigating areas where a nuclear test may have taken place, as primarily indicated by data from the seismic network. In this study, recent Sentinel-2 data and historic Landsat images of a number of sites of underground nuclear tests carried out by the USA, the former USSR, France, China, India, Pakistan and the Democratic People's Republic of Korea are analysed. A multivariate change detection algorithm, imagery animation and other visualization techniques were used to analyse the data. Moderate spatial resolution imagery can accurately detect areas of surface disturbance due to the nuclear tests, especially for tests carried out in desert regions. However, low yield tests may not produce a surface expression detectable by the satellite imagery. In these cases, high spatial resolution imagery is useful for monitoring activities that may be related to a nuclear testing event.
The on-site inspection (OSI) is the final component of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) verification regime that will help determine whether a nuclear explosion occurred; facts may also be gathered to determine who was responsible for the Treaty violation. The techniques include ground-based, airborne and laboratory elements. Because the technologies underlying some of the methods addressed in CTBT/PTS/INF.1573, which covers all OSI techniques identified under Part II of the Protocol to the CTBT except drilling, have evolved over time, the goal of this work is to assist the Provisional Technical Secretariat in its efforts to refine the permitted equipment for OSI and to quickly adjust items on the list that have become obsolete. The review was conducted using all available documents, including the Treaty, the draft OSI Operational Manual, online discussions with field experts, OSI workshops, etc. As a result, some geophysical techniques, such as electrical conductivity measurements, passive and active seismic surveys and over-flight, have been reviewed and recommendations made. It is hoped that this effort will benefit the Provisional Technical Secretariat and OSI activities for effective verification.
With the development of commercial space technology, remote sensing has become widely used in daily life. The availability of remote sensing images as commercial off-the-shelf products makes them applicable to on-site inspection (OSI), especially for training courses and exercises. This work develops a commercial satellite imagery based information integration system for OSI, providing customized products for OSI training courses and exercises. According to the requirements of the OSI inspection area, openly available commercial satellite images are acquired and processed. Based on the imagery and DEM data, an integrated information system is set up to provide a high resolution representation and three-dimensional modelling and simulation of the inspection area. On top of this system, most IT functionalities, such as mission planning and decision making, route optimization, data fusion and visualization, managed access arrangements, over-flight planning, and BoO and H&S arrangements, can be supported. With communication equipment, the services of the integrated information system can be shared with individual inspectors in the field in real time. The working team looks forward to providing a test trial at the future IFE2025.
The objective of this paper is to demonstrate the use of the magnetic method in the on-site inspection (OSI) regime and to identify the most suitable magnetometer for OSI applications, mainly the location of underground infrastructure. Theoretical and practical applications are used to give insight into the qualitative and quantitative information obtained from magnetic measurements. The advantages and limitations of the different types of magnetometers are shown by comparing instruments based on different working principles in terms of application and survey results, addressing general limitations such as the effect of object shape and orientation and interpretation challenges such as merged anomalies. Survey technique advantages, such as gradient and vertical component gradient measurements using the proton magnetometer versus the fluxgate (vector) magnetometer, are presented along with measurements using Overhauser, caesium and potassium magnetometers. Several survey configurations are also discussed, including total field, gradient, backpack and towed configurations. Practical limitations, such as survey speed, maximum observed intensity, physical effort and processing time, are addressed. The method's capability to resolve buried targets with various amounts of magnetic material at various depths is demonstrated using several case studies over buried pipes.
The local active faults in Bali have small magnitudes (M<5) but destructive potential. This study used gravity data from GGMplus, topographic data from DEMNAS, and lineaments extracted from ALOS-2 PALSAR-2 data. To identify fault movement, we interpreted the subsurface using the gravity derivative method. Fault locations were identified through lineament extraction from the SAR data using directional filters. A red-green-blue composite image of the HH, HV and VV polarizations was used for automatic lineament extraction, which was then manually corrected. The gravity method identified 29 of the 30 faults on the geological map of the Bali sheet as well as a new fault from PALSAR-2. Bali has 12 thrust faults, 11 strike-slip faults and six normal faults. The PALSAR-2 imagery was used to produce a fault lineament map for the Bali region. The lineament extraction from PALSAR-2 yielded four new faults (Pesanggaran, Sepang, Tegal Badeng and Banyuwedang), while four faults were not identified (the Tampaksiring, Plaga, Mambal and Munduk-Rajasa Faults). We propose 30 faults in Bali, comprising 26 faults from geological maps, with changes in length and location, and four new faults obtained from automatic lineament extraction.
The use of remotely controlled platforms (RCP) during an on-site inspection (OSI) is an open topic, especially regarding aerial platforms. Such devices did not exist, or at least were not commercially available, when the Treaty was opened for signature and ratification, and therefore there is an ongoing debate regarding the conduct of inspection activities and health and safety related activities using instruments carried by a RCP. Our approach is to consider such activities as an expansion of ground surveys. Conducting a survey using OSI approved equipment deployed on a RCP operating at two meters ground clearance is no different from a ground survey conducted by an inspector carrying the same sensor on a pole at the same height. In addition, surveys involving RCP are more efficient and effective, and represent a sound alternative to walking surveys in case of safety concerns. This contribution presents the conclusions drawn from a desk study on the use of RCP for OSI purposes, as well as the lessons identified during the conduct of OSI-relevant near surface surveys as part of a recent field campaign in Italy.
The importance of characterizing the amplitude and spatial variation of the geophysical anomalies created by on-site inspection (OSI) relevant observables is well known. By understanding such characteristics, it is possible to design the different geophysical surveys and to realistically assess the capability of a certain OSI technique to detect a potential observable. As part of the Action Plan 2016-2019, the Provisional Technical Secretariat conducted a multi-component project that included the development of forward models to characterize the magnetic anomalies created by complex geometric bodies simulating different OSI-relevant observables. The models were generated using the GamField software package developed by the Italian National Institute of Geophysics and Volcanology (INGV). As a follow-up to this project, a series of multilevel surveys (ground, near surface and airborne) were conducted in 2022 over a relevant area in central Italy. This contribution presents the results from both the theoretical forward modeling effort and the field surveys, and the conclusions reached regarding the potential use of magnetic field mapping in an OSI context.
A gas sampler system has been designed, built, and deployed that may contribute to the near-field detection of underground nuclear explosions. The system is inexpensive, portable, and autonomously collects samples for subsequent lab analysis. Collection duration and interval parameters can be adjusted using Wi-Fi or other communications network based on the intended collection site. Radioactive gas samples can be analysed in- or near-field with 2’’x4’’x16’’ NaI(Tl) detectors, or analysed off-site with a xenon separation and analysis system. To facilitate near-field analysis and improve detection limits for radioxenon, samples are filtered through charcoal traps which are then counted on a NaI(Tl) detector. The minimum detectable activity for 127Xe, 133mXe and 133Xe in charcoal traps is 4.5 Bq, resulting in a minimum detectable concentration of 150 Bq/m3. Future iterations of the gas sampler will contain one sample bottle that autonomously filters samples through charcoal traps for real-time measurements. This design provides the possibility for the gas samplers to aid in radionuclide measurements in an on-site inspection. Deployment of these samplers is modeled in a companion presentation titled “Modeling the use of mobile modular gas samplers in near-field detection using HYSPLIT”.
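As a consistency check on the quoted figures, the relation MDC = MDA / V links the minimum detectable activity per trap and the minimum detectable concentration through the effective air volume V represented by the charcoal trap; the implied volume of roughly 30 L below is an inference from the quoted numbers, not a specification of the actual sampler.

```python
# Back-of-the-envelope check: MDC = MDA / V, so the quoted 4.5 Bq and
# 150 Bq/m3 imply an effective sample volume of about 0.03 m3 (~30 L).
# The implied volume is an inference, not a stated sampler parameter.
mda_bq = 4.5          # Bq per charcoal trap
mdc_bq_m3 = 150.0     # Bq/m3

implied_volume_m3 = mda_bq / mdc_bq_m3
print(f"implied effective sample volume: {implied_volume_m3 * 1000:.0f} L")
```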
The Wireless Independent Noble Gas Sampler (WINGS) is a mobile, modular gas sampling system designed for use in low infrastructure environments. WINGS units operate on battery power and communicate by radio connection. In the case of a suspected underground nuclear explosion (UNE), WINGS units could be deployed to detect and identify noble gases emanating from the explosion site or identify gases entering the local area from offsite. This work uses the atmospheric transport modeling tool inline WRF-HYSPLIT to determine the ideal deployment configuration for WINGS units around a local area, using a test case with a hypothetical upwind medical isotope production facility providing a nuisance background. HYSPLIT is a computer model for atmospheric transport and dispersion. Inline WRF-HYSPLIT integrates data from the WRF-ARW meteorological model, generating dispersion and deposition models with finer spatial resolution than achievable with an offline approach. These more accurate models enable determination of sampler network effectiveness as a function of sampler density, sample collection duration and interval, and distance from the emission point of a UNE.
Results will be presented from a geothermal research project aimed at modelling the structural controls of a geothermal reservoir located in the Kenyan Rift Valley. The reservoir model, developed with several modelling software packages using reservoir parameters such as rock type, permeability, porosity and pressure distribution obtained during exploration and drilling, is based on a wide range of geophysical, geological, geochemical and geospatial data. The results of the geothermal project will be presented, drawing parallels to the planning and execution of a drilling proposal as part of an on-site inspection (OSI). Based on the findings of an OSI, the inspection team may propose drilling. To reach the drilling targets in a time-sensitive manner and with safe drilling operations, the drilling plan is expected to be based on extensive surface and subsurface data acquisition and interpretation. Based on the presented reservoir modelling approach, recommendations will be made on what should be taken into account in preparing an OSI drilling proposal.
A Field Test of on-site inspection (OSI) geophysical techniques for deep applications was conducted by the Provisional Technical Secretariat in September 2022 in the Austrian Ybbstaler Alps, with the support of external experts. The scope of the Field Test was to assess the current OSI geophysical imaging capabilities for deep applications in an integrated manner in a mountainous environment with a number of deep geophysical observables of OSI interest. This was the first OSI field test in a mountainous environment and therefore a number of operational, logistical and technical challenges had to be addressed. The implemented OSI geophysical techniques – permitted by paragraphs 69(f) and 69(g) of Part II of the Protocol to the Comprehensive Nuclear-Test-Ban Treaty – included resonance seismometry, active seismic, magnetic and gravitational field mapping, as well as electrical conductivity surveys along three 2-D profiles over a cave system at 40-350 m depth. During the Field Test, the full OSI data workflow for geophysical techniques was tested using current functionalities within the Geospatial Information Management system for OSI (GIMO). This presentation describes the technical planning and execution of the Field Test as well as the data processing and interpretation results, with recommendations for similar challenging mountainous environments.
In preparation for the next Integrated Field Exercise, and in line with the capabilities developed through past field exercises, action plan projects and expert meetings, the Equipment and Instrumentation Section of the On-Site Inspection (OSI) Division of the Provisional Technical Secretariat has entered a new stage of development of its telemetry solution. In 2022, the existing data transmission system was subject to a series of maintenance, upgrade and training activities that confirmed its potential use for OSI techniques and OSI deployments. It is important to keep in mind the objectives and limitations of such a data transmission solution when presenting its extended functionalities, operational capabilities and development perspectives. In 2023, a dedicated field test will be organized to validate the configuration and functionality of the OSI telemetry system.
Knowledge about the natural background of xenon isotopes and Ar-37 in the top few meters of the soil column is crucial for assessing measurements of those isotopes in the course of an on-site inspection. Many factors control the Ar-37 concentration in soil air. Production and gas transport mechanisms have previously been investigated, but less attention has been paid to the depth dependency of Ar-37 emanation. Irradiation experiments on natural soil samples have revealed changes in this important parameter of almost an order of magnitude over a depth range of 5 meters.
In the case of an underground nuclear explosion, Ar-37 and Ar-39 are produced in rocks by neutron activation of Ca and K, respectively. Because of the very different half-lives of Ar-37 and Ar-39, the Ar-37/Ar-39 ratio in the subsurface is a function of the timing of production (pulsed vs. continuous), depth-dependent production mechanisms, and the Ca/K ratio of the rocks. The Ar-37/Ar-39 ratio can be used as an indicator of whether a measured radio-argon signature from the underground originates from a recent or ancient underground nuclear explosion or is of natural origin. In this poster, we compare calculated natural Ar-37/Ar-39 production depth profiles with data obtained from several groundwater studies in different geological settings.
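The following short calculation illustrates the dating argument made above: for a single pulsed production event, the Ar-37/Ar-39 activity ratio decays with an effective half-life close to that of Ar-37 (half-life about 35.0 days versus about 269 years for Ar-39). The initial ratio R0 is an arbitrary illustrative value; real values depend on the Ca/K ratio of the rock and the neutron spectrum.

```python
# Evolution of the Ar-37/Ar-39 activity ratio after a single pulsed
# production event, using the half-lives of the two isotopes.
# R0 is an assumed illustrative starting ratio.
import numpy as np

T12_AR37_D = 35.0
T12_AR39_D = 269.0 * 365.25
lam37 = np.log(2) / T12_AR37_D
lam39 = np.log(2) / T12_AR39_D

R0 = 1.0e4                          # assumed activity ratio at t = 0
t_days = np.array([0, 30, 90, 180, 365, 730])
ratio = R0 * np.exp(-(lam37 - lam39) * t_days)

for t, r in zip(t_days, ratio):
    print(f"t = {t:4d} d   Ar-37/Ar-39 = {r:10.3e}")
```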
Forensic seismology describes the detection and interpretation of exceptional seismic events, e.g. during an on-site inspection (OSI), or of signals from plane crashes, submarine explosions and bomb attacks. For basic research, forensic seismology means the discovery of unknown or unexpected signals, e.g. precursors of rockslides, deep earthquakes below induced seismicity, or nano-earthquakes on active faults. In any case, the challenge lies in the search for a priori unknown signal signatures, often in unknown underground conditions at near-local scale. Nanoseismic monitoring is a unique approach for measuring, detecting and analysing such seismic signals. It combines a specific small-array layout of seismic stations with several software tools for detecting and analysing seismic events.
Nanoseismic monitoring works with minimum requirements for field operations, displays continuous data streams, supports processing of weak events close to ambient noise by array analysis, and allows for interactive, real time modification of underground velocity models. It is perfectly suited for seismologists operating in a new region of unknown tectonic or induced seismic activity. Nanoseismic monitoring is the standard approach for the PSM tool of OSI investigations. We will present the software with its recent update to handle topography for application in mountainous regions.
This paper presents the features of an innovative portable radioactive isotope identifier for the detection and identification of gamma and neutron emitting radionuclides, including multiple simultaneous sources. The described portable isotope identifier is suited to on-site inspection because of its capability to detect and characterize nuclear or radioactive materials in the presence of gamma shielding material, neutron moderators and masking gamma sources. Its distinguishing features are the capability to identify sources through neutron detection, discriminating spontaneous fission sources (Cf-252), (α,n) sources (Am/Be, Am/Li) and nuclear material containing mixed isotopes of plutonium or uranium. These features are combined with the capability to cross-correlate gamma and neutron measurements to achieve a higher level of accuracy in identifying special nuclear material (SNM) that emits both neutrons and characteristic gammas. This paper presents the results of measurements performed in realistic scenario conditions at gradually increasing levels of difficulty, adding shields, moderators, masking gamma sources and combinations of the above. The test results are also compared with international standards: the device exceeds the standard performance by triggering a neutron alarm for a Cf-252 source at five times the distance required by ANSI N42.34.
One of the most conclusive indications of a violation of the CTBT is the presence in subsoil air of elevated concentrations of the radionuclide Ar-37, which is formed in large quantities by the interaction of neutrons with calcium in rocks. Traditionally, the activity of Ar-37 is measured with proportional gas counters filled with a counting gas prepared from argon samples with the addition of methane. Further reduction of the Ar-37 detection limit is constrained by the difficulty of significantly increasing the volume of the argon sample placed in a proportional counter. An installation for the detection of low Ar-37 activities based on the liquid scintillation principle was developed at the Khlopin Radium Institute under contract with the CTBTO. In this installation, the liquefied argon extracted from soil air itself acts as the scintillator. The use of liquefied argon samples multiplies the volume of the measured samples without increasing the size of the measuring cell and shield elements, allowing a significant reduction of the Ar-37 detection limits. This presentation describes the improvements to the installation and the results obtained during its testing.
In 2021 the first layout and design of the next generation on-site inspection (OSI) field laboratory was presented (Poster 3.2 -691). It highlighted the requirements, the status and future improvements of the OSI field laboratory for the development of its capability to be rapidly deployed during a CTBT OSI. Here we present some of the improvements for the OSI field laboratory, notably for the installation of the joined two-pod laboratory component setup, insulation of the containers and results of the operation of a SAUNA system inside to process and analyse gas samples for their xenon content. Furthermore, we present the modular design of the setup of laboratory components, combining the two-pod setup with tents to streamline the various processing tasks, measurement, analysis of and reporting on particulate samples, which leads to our first overall design of the new generation OSI field laboratory proposed for future exercises.
In this study we present a method for comparing objects identified in images acquired by synthetic aperture radar (SAR). The method gives a quantitative value for the similarity of different objects detected on the sea surface in SAR images, with the aim of tracking their changes and identifying the most similar pairs. As a case study we examine four rectangle-like objects north of the Bulgarian city of Varna, close to the shoreline. The structures are located on the sea bottom but very close to the sea surface, so they become visible under favourable conditions. The aim is to determine whether they change shape over time and whether they appear differently in images from ascending and descending orbits of the Sentinel-1 satellite. This is done by calculating image moments: we compute the seven Hu moments (invariants) of each object and compare them to quantify similarity.
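A minimal sketch of the Hu-moment comparison is shown below. The binary masks are synthetic placeholders for objects segmented from Sentinel-1 scenes, and the log-scaling and distance metric are common choices for comparing Hu invariants, not necessarily the authors' exact procedure.

```python
# Compare two segmented objects by their seven Hu moment invariants.
# Synthetic masks stand in for objects segmented from SAR imagery.
import cv2
import numpy as np

def hu_signature(mask):
    """Seven Hu invariants of a binary mask, log-scaled for comparability."""
    hu = cv2.HuMoments(cv2.moments(mask.astype(np.uint8), binaryImage=True))
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)

# Two synthetic rectangle-like objects (the second shifted)
mask_a = np.zeros((200, 200), np.uint8)
mask_b = np.zeros((200, 200), np.uint8)
cv2.rectangle(mask_a, (60, 80), (140, 120), 1, -1)
cv2.rectangle(mask_b, (70, 60), (150, 100), 1, -1)

sig_a, sig_b = hu_signature(mask_a), hu_signature(mask_b)
distance = float(np.sum(np.abs(sig_a - sig_b)))
print("Hu-moment distance (smaller = more similar):", distance)
```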
Electro-optical satellite imagery provides analysts with opportunities to monitor suspicious nuclear activities in restricted access areas. As remote sensing technology advances, more resources become available in terms of both spatial and temporal resolution: spatially, the areas of interest that analysts must monitor have become more detailed and accordingly larger; temporally, satellite imagery can be acquired on a daily basis. Inevitably, automated or at least semi-automated support for analysts is required in this field. This paper presents the characteristics of dealing with small scale objects in high spatial resolution satellite imagery, drawing on conventional and cutting edge change detection applications in remote sensing. Given that the purpose of the analysis is to counter nuclear proliferation by outlaw states, historical records of algorithm based approaches are reviewed, showing where results have been attained. The effects of image-to-image registration, shadow, and side aspects due to the acquisition angle are then discussed, considering the size of the target objects derived from the feasibility study. The challenges of each factor are addressed with a way forward.
In the search logic, the technique of environmental sampling can play a role in narrowing down areas of interest and prioritizing the field mission. One methodology for this purpose is to collect several samples from the same search zone and screen the gamma spectra of the samples for anomalies. To ensure a high throughput screening process, it can be useful to combine the samples collected in the same search zone and perform a single long measurement. In order to have a fixed geometry, we fabricated a multi-sample holder with two separate layers for analysing simultaneously up to 13 samples with a defined IFE geometry. The geometry of the system was simulated with the Geant4 code, and an experimental setup was implemented using spiked samples placed in different positions of the holder (upper and lower layers) in order to characterize the geometry and compare the efficiency of each position. Two different scenarios were simulated: one with several sample containers of identical content, and one with the same number of containers but only one containing a spiked sample.
The International Atomic Energy Agency Department of Safeguards deployed the Geo-based Data Integration (GDI) platform for information integration, analysis, and activity planning involving geospatially-related information used for nuclear safeguards verification. GDI provides interactive, layered maps in a secure, user-friendly collaborative environment for IAEA inspectors, analysts and managers to access, utilize and share geospatially-attributable information regarding nuclear facilities, sites and other locations and activities relevant to the implementation of States’ safeguards agreements. GDI operates in the secure Integrated Safeguards Environment, and access to information in GDI is limited and controlled via the Safeguards Authorization Management system. As the IAEA is an on-site inspection agency that verifies nuclear materials and activities in physical locations in States in accordance with their safeguards agreements, nearly all safeguards-relevant information has geospatial attributes. This includes State-declared information; information collected by inspectors and instruments; and open-source information, including commercial satellite imagery.
This paper reports progress achieved with GDI since 2018. It describes enhanced functionalities; automated data integration with the Additional Protocol System; planning and reporting integration with the Integrated Scheduler and Planner and Safeguards Field Reporting and Evaluation; and effectiveness and efficiency gains through applications of GDI analytical methodology to facilities and complex nuclear fuel cycle locations.
CTBTO is hiring professionals in STEM fields – Learn about CTBTO’s Recruitment Process
Come and find out more about how to join the CTBTO! Visit us at the HR booth and speak with the CTBTO Human Resources team and learn more about CTBTO careers. You can also join us in our HR presentation where we will walk you through our recruitment process. CTBTO is hiring top talent across a wide variety of scientific and technical fields in seismic, radionuclide, hydroacoustic and infrasound technologies. There will also be a presentation focusing on students and young professionals interested in pursuing a career in these fields. We look forward to meeting you soon!
NPE: independent performance assessment using multi-technology scenarios of potential CTBT violations.
Presentations and discussion coordinated by German NDC.
Anthropogenic radionuclides have been injected into the atmosphere by nuclear weapon programmes, nuclear weapons testing, nuclear power plants and uranium mining. In this work, we consider the activity concentrations of the anthropogenic radionuclides Cs-137 and Cs-134 detected by the International Monitoring System (IMS) from January 2021 to December 2021. The main objectives are to determine the activity concentrations of these anthropogenic radionuclides, classify the IMS radionuclide stations according to radioelement mapping, establish radioactivity levels across the IMS radionuclide network, and carry out a radiological risk assessment using Monte Carlo simulation.
The Xenon Environmental Nuclide Analysis at Hartlepool (XENAH) collaboration involves scientists from the UK, US and Sweden who are performing measurements of routine emissions from Hartlepool Power Station with the cooperation of the reactor operator, EDF Energy. Three diverse and complementary radionuclide monitoring techniques are being deployed, aiming to characterize the radionuclide emissions of an operating nuclear reactor and understand how emissions from such facilities may affect the International Monitoring System. Direct measurements of radioxenon emissions at source are being collected using a stack monitoring system. Remote, stand-off measurements of radioxenon after atmospheric transport over several kilometres have been obtained using an array of stand-alone air samplers and analysers. Ultralow background measurements of environmental samples collected at and near the reactor have also been analysed. The results from these measurements will provide a representative fingerprint of an operating civil nuclear facility and provide knowledge of radioxenon backgrounds close to operating facilities. Isotopic ratios of radioxenon will be calculated and compared to the discrimination line proposed by Kalinowski. The measurement effort and techniques will be described, along with the scientific questions that the work aims to address. Preliminary results from measurements undertaken using the three complementary techniques will also be presented.
Medical Isotope Production (MIP) by fission releases radioactive noble gases into the environment.
These emissions increase the environmental background and could hinder the mission of the International Monitoring System (IMS) to detect a nuclear explosion at an early stage.
Argentina has been producing radioisotopes by fission since 1985 from the irradiation of Highly Enriched Uranium (HEU) targets. In 2002, in line with the country's commitment to nuclear non-proliferation, the targets were changed to Low Enriched Uranium (LEU), making Argentina the first country in the world to achieve this goal.
Fission radioisotope production processes that dissolve targets in a basic medium generate two emission streams, air and hydrogen. Both must be treated separately.
There are a variety of devices to reduce radioactive noble gas emissions.
The poster describes the history of molybdenum production in Argentina and the conversion from HEU targets to LEU targets. The origin of the different emission streams of radioactive noble gases into the environment and their relative abundance are examined. Finally, different devices to reduce their emissions are presented, analyzing their advantages and disadvantages.
Radiocaesium (Cs-137) is one of the by-products of nuclear fission in nuclear reactors and nuclear weapons testing. Cs-137 deposited on the ground is resuspended into the atmosphere by mechanisms such as strong winds and forest fires. The major sources are nuclear reactor accidents, such as those at Chernobyl and Fukushima, as well as the historic atmospheric nuclear weapons tests conducted mainly between 1952 and 1962. This study aims to assess the global occurrence of Cs-137 in more than two decades of IMS samples and to identify the major sources behind repeated observations at the same station. Frequency comparisons before and after Fukushima will be made, and isotopic ratios of Cs-137 and Cs-134 will be examined. Moreover, atmospheric transport modelling (ATM) with the web-connected graphics engine (WEB-GRAPE software) will be used for source location: backtracking the detected Cs-137, identifying the potential source region of the release, and discriminating whether the release originates from nuclear testing or from other nuclear facilities.
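The short calculation below illustrates how the Cs-134/Cs-137 activity ratio constrains the age of a release, using the isotopes' half-lives (Cs-134 about 2.065 years, Cs-137 about 30.08 years). The assumed ratio at release time and the hypothetical measured ratio are illustrative values only; actual ratios depend on fuel burnup and the source.

```python
# Age of a release from the Cs-134/Cs-137 activity ratio.
# R0 (ratio at release) and R_measured are illustrative assumptions.
import numpy as np

T12_CS134_Y = 2.065
T12_CS137_Y = 30.08
lam134 = np.log(2) / T12_CS134_Y
lam137 = np.log(2) / T12_CS137_Y

R0 = 1.0                      # assumed Cs-134/Cs-137 activity ratio at release
R_measured = 0.05             # hypothetical measured ratio in an IMS sample

# R(t) = R0 * exp(-(lam134 - lam137) * t)  ->  solve for t
age_years = np.log(R0 / R_measured) / (lam134 - lam137)
print(f"estimated time since release: {age_years:.1f} years")
```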
A radiological emergency preparedness system has been developed in Korea to predict the behavior of radioactive material released into the environment and to estimate doses to humans in case of a nuclear accident. The system is composed of atmospheric dispersion, marine dispersion and dose assessment models, along with a graphical user interface module. It can evaluate the dispersion patterns of radionuclides in the air and ocean, and the short-term and long-term radiological effects of a nuclear accident on humans. It has been built as a web application so that users can access it easily through a dedicated IP address, username and password. The atmospheric dispersion, marine dispersion and dose assessment models have already been validated by model-to-model comparisons and by measurements from the Chernobyl and Fukushima accidents. In particular, the atmospheric dispersion model is connected in real time to numerical weather forecast data produced by the Korea Meteorological Administration, and air concentrations are rapidly calculated in the system.
The International Monitoring System (IMS) monitors compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT) using information collected by sensors (radionuclide, seismic, infrasound, and hydroacoustic) placed around the world. Sensors, however, give limited information; hence scientists at the International Data Centre (IDC) rely on data fusion and atmospheric transport modeling (ATM) for producing reviews and bulletins. ATM codes often do not resolve local, sub-grid scale information. However, local terrain features can cause the plumes from two releases that are tens of meters away from each other to travel in completely different directions under the same weather conditions. Accurate characterization of the transport of gases and particles at early times after their release may play a crucial role in making an informative prediction at later times on the global scale. Large eddy simulation (LES) transport codes, such as Aeolus, resolve turbulence using physics-based equations, but they are computationally expensive, and running them in an operational situation becomes intractable if many simulations are required. Therefore, we developed a three-dimensional temporal deep autoencoder-based model in complex terrain that can characterize the transport and dispersion of a gas release at early times. This information can then be passed to global-scale ATMs, potentially improving their predictability.
Radionuclides produced by underground nuclear explosions can slowly migrate through the subsurface and vent to the atmosphere, where they are transported to detectors downwind. Underground test sites are often located in remote areas surrounded by complex terrain features, which can greatly affect the detectability of radionuclides once they reach the atmosphere. We describe efforts to validate a modeling system that transports radionuclides originating below the surface through the atmosphere using high-resolution large eddy simulations (LES) in complex terrain. A computational LES grid has been created containing nearly 100 million grid cells over an area of about 2.5 square kilometres, focused above a tunnel used to support mining activities near the Roselend reservoir in the French Alps. This grid captures the substantial elevation changes in the area, including the interface between the land surface and the water reservoir. Given meteorological inflow data as a driving boundary condition, the model predicts where gases leaking from the subsurface will be transported and how local terrain features influence detection as a function of distance. The system will be validated using data from a set of planned field experiments that will inject controlled amounts of tracer gases into the subsurface and measure the atmospheric concentrations downwind.
BAPETEN (Nuclear Energy Regulatory Agency of Indonesia) is currently developing the Indonesian Radiation Detector Monitoring System (I-RDMS) as an early warning tool to detect increases in environmental radiation exposure on the territory of Indonesia. The system aims to provide representative and reliable real-time online data on environmental radioactivity as a nuclear early warning system within the framework of national nuclear surveillance and preparedness. Detector placement is based on a suitability analysis that identifies areas with fallout potential from transboundary radioactive releases. I-RDMS detectors have been installed in several locations in Indonesia, five of them at the Jayapura, Kappang, Lembang, NTT and Sorong CTBT stations. By installing I-RDMS detectors in various locations, it is expected that the system can detect radiological impacts on Indonesia from transboundary radioactive releases caused by nuclear weapon tests or nuclear accidents.
The radioxenon isotopes Xe-135, Xe-133, Xe-133m and Xe-131m are the expected and consistently observed radioxenons in the atmosphere. Testing of the Xenon International system at Knoxville, Tennessee resulted in the detection of several previously unobserved radioxenon isotopes in the atmosphere: Xe-125, Xe-127, Xe-129m and Xe-122 (via the decay of I-122). These isotopes were periodically detected from December 2019 until May 2021, when Xenon International concluded its testing. Atmospheric transport modeling suggested that the radioxenons emanated from Oak Ridge National Laboratory's High Flux Isotope Reactor, its Spallation Neutron Source, or both. The signatures produced by these atypical radioxenons can mask the signatures of the typical four radioxenons and lead to unreliable activity concentration calculations. There are only a few spallation sources and research reactors worldwide, but it may be possible for some of these isotopes to be observed in nuclear monitoring systems.
To categorize radionuclide data from the International Monitoring System of the Comprehensive Nuclear-Test-Ban Treaty, the French National Data Center has developed, and has been using in operation for several years, an automated simulation of the industrial background. For all IMS noble gas stations, it calculates every day the expected Xe-133 activity concentrations due to industrial contributors. This provides valuable information for source attribution in case of detection. However, a systematic quantification of simulation uncertainties remains to be addressed, and is key to further improve categorization processes. In this context, the present study focuses on the use of ensemble data as provided by weather prediction centers to factor in meteorological uncertainties in the atmospheric transport calculations. Simulations of Xe-133 released from the IRE, a medical isotope production facility in Belgium for which emissions are monitored in real time, are conducted using US National Centers for Environmental Prediction meteorological ensemble data. Predicted activity concentrations resulting from this method are compared to measurements with the focus of showing the benefits of an ensemble approach for CTBT applications.
INVAP's STAX monitor was installed at the Ezeiza Atomic Center (CAE-CNEA), Buenos Aires, Argentina, and has been measuring since November 2021. Initially, the equipment underwent aliveness tests and calibration with gas-like radioactive sources. The electronic set-up was adjusted to optimize measurement performance, enabling the monitor to track high activity concentration emissions in a low-dilution regime. The monitor is currently operating under real operational conditions, following the production runs at this Medical Isotope Production Facility (MIPF). Equipment characteristics and examples of results and performance obtained while following production runs at the Ezeiza MIPF are presented.
A numerical study of conjugate flow, heat and mass transfer by natural convection of noble gases within an underground cavity partially filled with molten rock is presented. The molten rock is initially considered at rest at an initial temperature and concentration. The molten rock is viscous and possesses a strength that is temperature- and crystal-fraction-dependent. Under natural conditions, convection cells develop within the molten rock, leading to circulation, mixing and degassing of the initially trapped gases. Furthermore, the molten rock as well as the degassing enhances the conjugate convection flow in the air gap within the cavity. We illustrate the onset of the different regimes and their combined effect on the flow, heat and mass transport of the different gas species and on the fraction of molten rock, and their impact on noble gas fractionation. We also present a sensitivity analysis of the effect of the outer cavity boundary condition on the heat loss and cooling to the adjacent rock formation and its eventual release to the atmosphere. We demonstrate several scenarios of prompt underground releases to the atmosphere using a first-ever fully coupled prompt subsurface-to-atmosphere transport model without ad hoc boundary conditions between physics-based domains or handshakes between different numerical codes.
The radiological station located in Havana, Cuba will perform measurements for the International Monitoring System. The Turkey Point and Laguna Verde Nuclear Power Plants are potential sources of noble gases on a regional scale. The objective of this work is to identify the synoptic situations that would contribute to the transit over Cuba of air masses coming from these facilities. For that purpose, forward trajectories of the air masses were estimated with the HYSPLIT trajectory model. The simulations were run at an altitude of 500 meters for periods when Cuba was under the influence of a classic cold front, the nearby influence of the oceanic anticyclone, a broad low pressure centre over the western region, or a weak barometric gradient. The results obtained with the HYSPLIT trajectory model showed that air masses coming from the Turkey Point Nuclear Power Plant would transit over Cuba only if the island were under the influence of a weak barometric gradient. In the case of the Laguna Verde Nuclear Power Plant, it was found that for none of the synoptic situations analysed would the air masses cross Cuban territory.
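For readers unfamiliar with trajectory calculations, the toy sketch below shows the basic principle: an air parcel is advanced in time through a wind field. HYSPLIT itself uses gridded, time-varying meteorological fields and far more sophisticated numerics; the constant wind, starting point and time step here are arbitrary assumptions for illustration only.

```python
# Toy forward trajectory: Euler integration of an air parcel in a constant
# horizontal wind. All parameters are illustrative assumptions.
import math

lat, lon = 25.43, -80.33        # hypothetical start near Turkey Point, at 500 m
u_ms, v_ms = 3.0, 5.0           # assumed constant eastward / northward wind [m/s]
dt_s = 3600.0                   # 1 h time step
R_EARTH = 6_371_000.0

trajectory = [(lat, lon)]
for _ in range(48):             # 48 h forward trajectory
    lat += math.degrees(v_ms * dt_s / R_EARTH)
    lon += math.degrees(u_ms * dt_s / (R_EARTH * math.cos(math.radians(lat))))
    trajectory.append((lat, lon))

print("end point after 48 h: lat=%.2f, lon=%.2f" % trajectory[-1])
```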
Atmospheric transport modelling (ATM) supports the radionuclide verification technology by providing a link between radionuclide detections and the regions of their possible source. In the case of an anomalous detection registered by the International Monitoring System (IMS) particulate network classified as Level 5, CTBTO sends a request for support to Regional Specialized Meteorological Centres (RSMCs). In response, they produce and upload their own backward simulations. Taking advantage of additional models delivered by the CTBTO-WMO response system, a Multiple Model Possible Source Region (MMPSR) can be calculated. For this study, we will look at the radioactivity measurements registered in the aftermath of the Fukushima incident by the IMS network. Due to the scale of this incident, anomalous anthropogenic detections (classified as Level 5 or Level 4) and anomalous radioxenon detections (classified as Level C) were frequently observed. For the selected samples, the enhanced version of the MMPSR algorithm will be used. The added value of using not only the ensemble of ATM models, but also multiple radionuclides for the purpose of PSR calculation, will be demonstrated.
In the International Data Centre radionuclide pipeline, noble gas samples from the International Monitoring System (IMS) radionuclide stations are processed and categorized. The combination of multiple isotope observations of Level C plus prior/post Level B samples close enough in time, at one or more IMS radionuclide stations, forms decisive input for isotopic ratio analysis as these episodes may show consistency in the isotopic ratio evolution of at least two isotopes in a pair. For atmospheric transport modelling, these multi-sample episodes form the starting point to deterministically link the air masses to a certain release through correlation computations yielding the so-called possible source region (PSR). By combining PSR products with isotopic ratio analyses, a better association of radioxenon detections with their release location and release time is aimed for. We present case studies that investigate the comparison of two approaches to compute the PSR. The standard PSR approach utilizes all samples (Level B and C) in a fixed timeframe relevant for a specific scenario, whereas an alternate approach might use a curated group of samples selected for consistent isotopic ratio evolution only.
Radionuclide emissions from worldwide nuclear facilities are frequently observed by the CTBT noble gas network. These ever-present and highly variable emissions of the four radioxenon isotopes relevant for CTBT monitoring weaken the abilities of global monitoring of nuclear explosions. This multifaceted problem requires a substantive approach to determining the key steps for distinguishing for each International Monitoring System sample whether the observation can be explained by known sources, or whether it possibly contains a contribution from a nuclear explosion. For this purpose, the Xenon Background Estimation Tool (XeBET) is being developed. XeBET aims to deliver an aggregation of scientifically-developed ideas into a software prototype that subsequently may be used for expert technical analysis once demonstrated and agreed upon. Ideas considered in XeBET are built on atmospheric transport modelling and radionuclide statistical expertise from assessments in previous multilevel and multidisciplinary scientific investigations. This presentation discusses XeBET’s current status of ideas and prototyping and assesses the challenges ahead for 2023 and beyond.
The backgrounds of radionuclides in the environment from natural and human-made phenomena are well known to interfere with the detection of nuclear explosions. International Monitoring System (IMS) stations perform measurements at ground level; however, little has been published regarding the vertical distributions of radionuclides expected from nearby and distant sources, and whether the expected differences are large enough to be exploited to increase signal-to-noise ratios. For example, local and regional topography, energy of injection, height of release, vertical shear, ground resuspension, building wake effects, convective mixing and other effects can have a significant impact on the vertical distributions of radionuclides detected at an IMS station. In this work, we present a set of calculations and experiments that could be conducted near to and distant from known emission sources to test whether it is viable to decrease the effect of backgrounds at IMS stations. We also characterize the trade-off between the vertical dilution of radionuclides and the increased spatial representativeness of higher elevation samples, and estimate the potential to use vertical gradients to better constrain background emissions.
The International Monitoring System (IMS) detects background radioxenon from civilian sources every day. This background can interfere with the measurement of radioxenon from nuclear tests. There are methods to discriminate civilian sources from nuclear tests. These include the use of atmospheric transport modelling to determine whether the air associated with an IMS measurement is linked to a known civilian source, and the use of multiple radioxenon isotopic ratios to discriminate between nuclear power reactor releases and nuclear tests. The STAX project has been working with both medical isotope production facilities and nuclear power plants, both of which have voluntarily shared radioxenon emission data. This work will compare real world radioxenon releases and isotopic ratios from two types of nuclear power reactors with the expected ratios from nuclear tests and commercial fission based medical isotope production facilities.
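The sketch below illustrates the isotopic-ratio screening idea referred to above and in related contributions: a sample is placed in the log-log plane of the Xe-133m/Xe-131m and Xe-135/Xe-133 activity ratios and compared with a linear discrimination line separating reactor-dominated emissions from explosion-like signatures. The slope and intercept used here are placeholders, not the operational screening values.

```python
# Isotopic-ratio screening sketch: compare a sample's position in the
# (Xe-133m/Xe-131m, Xe-135/Xe-133) log-log plane with an assumed linear
# discrimination line. Line coefficients and sample ratios are placeholders.
import math

def above_screening_line(r_133m_131m, r_135_133, slope=1.0, intercept=0.0):
    """True if the sample plots on the explosion-like side of the assumed
    line  log10(135/133) = slope * log10(133m/131m) + intercept."""
    x = math.log10(r_133m_131m)
    y = math.log10(r_135_133)
    return y > slope * x + intercept

# Hypothetical sample ratios
sample = {"Xe-133m/Xe-131m": 2.5, "Xe-135/Xe-133": 8.0}
flag = above_screening_line(sample["Xe-133m/Xe-131m"], sample["Xe-135/Xe-133"])
print("explosion-like side of the assumed line:", flag)
```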
Measuring radioxenon isotopes is one of the tools used for underground nuclear explosion monitoring. Medical isotope production and reactors also release radioxenon isotopes and discrimination of these benign sources is a key research area. Four isotope ratios have been shown to be a powerful method for discrimination if the isotopes are detected. This research has examined the expected radioxenon activity concentrations at various distances for a variety of release scenarios and detecting all four isotopes was observed to be very challenging and potentially only viable for large yields or release fractions. However, three radioxenon isotopes are more detectable over a wider range and may be key to discrimination. There has also been recent interest in other isotopes, such as Argon-37, that when combined with radioxenons, can provide more robust discrimination. The scenarios that were explored will be described, and results presented on potential radioxenon detections, using current and next generation capabilities, as a function of time and distance after a simulated event.
Exploding wires have been embedded in geologic matrices (concrete, basalt, sandstone and granodiorite) to study explosively driven elemental diffusion and to elucidate the utility of elemental signatures. The exploding wire detonations have an input energy of up to 900 J, equivalent to 0.22 g of TNT. Simultaneously with the detonation, the air concentration of elements was monitored using inductively coupled plasma mass spectrometry (ICP-MS) in a parallel hole drilled into the rock sample at various spacings from the exploding wire. A wide range of elements (Cu, Zn, Br, Rb, Y, Sn, Sb, Te, I, Nd, Hg and Pb) was detected propagating through the rock with a range of delays. The Hg pulse usually arrived earliest at the detection location, followed by most other elements. Iodine always displayed the longest arrival delay and remained detectable at the greatest distance, long after any explosion related thermal and pressure gradients should have dissipated, suggesting it has the highest mobility.
The radioisotope Argon-37 is produced in underground nuclear explosions (UNE) through the neutron activation of Calcium-40 in rock and soil. A sensitivity study was conducted using Monte Carlo N-Particle Code (MCNP) and SCALE to model the predicted production rate of Argon-37 per kiloton explosive equivalent in various rocks following a UNE. The detonation was modelled in MCNP using a simple geometry to estimate the neutron flux further away from the detonation. This neutron flux from MCNP was the input to SCALE to model the yield and decay of Argon-37 in each rock. The reaction cross section of Calcium-40(n,α)Argon-37 is not well known, so both threshold and 1/v cross sections were modelled. The sensitivity study revealed the importance of characterizing the thermal neutron cross section to improve our understanding of the predicted production rate of Argon-37 from UNEs. It also showed the importance of radioargon as a signature from UNEs since it can be detected up to 700 days after a detonation. An experiment was designed to measure the thermal neutron cross section using alpha spectroscopy at the University of Texas at Austin.
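The decay arithmetic behind the quoted ~700-day detection window is simple: surviving 700 days corresponds to roughly 20 half-lives of Ar-37 (half-life about 35 days), so the initial subsurface concentration must exceed the minimum detectable concentration by roughly a factor of one million. The initial concentration and detection limit below are arbitrary illustrative values, not results from the study.

```python
# Detection window from exponential decay: t = T12 * log2(C0 / MDC).
# C0 and MDC are arbitrary illustrative values.
import math

T12_D = 35.0                     # Ar-37 half-life [days]
mdc = 100.0                      # assumed minimum detectable concentration [mBq/m3]
c0 = 1.0e8                       # assumed initial soil-gas concentration [mBq/m3]

days_detectable = T12_D * math.log2(c0 / mdc)
print(f"detectable for ~{days_detectable:.0f} days")   # ~700 days
```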
Monitoring radionuclide concentrations is an important component of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) verification regime and can provide direct evidence of the nuclear nature of an explosion. In the case of underground nuclear testing, the radioxenon isotopes have the highest probability of escaping into the atmosphere. Four xenon isotopes are of interest for CTBT verification: Xenon-131m, Xenon-133m, Xenon-133 and Xenon-135. The current International Monitoring System (IMS) design foresees 40 radioxenon stations around the world to provide 90% detectability of a 1 kt nuclear explosion within 14 days. In 2010, the first radioxenon station was integrated into the IMS; at the end of 2022, 26 systems were certified and five more were installed and in operation. This work provides a systematic study of the global background of the CTBT-relevant xenon isotopes over ten years (2013-2022) using all available stations. The geographical distribution and the ratios between the different xenon isotopes were analysed.
Following an underground nuclear explosion, fission products may be vented to the surface and transported through the atmosphere. Traditional requirements for nuclear explosion monitoring systems have focused on simple release scenarios. A more rigorous evaluation of radionuclide inventory releases will provide better requirements for measurement systems and improve analysis of detections. The goal of this work was to compare isotopic signatures reaching monitoring stations under different venting scenarios. First, a radionuclide inventory of fission products was developed using the SCALE code system based on the fission of U-235. The release of fission products to the surface was modeled in two components: prompt and delayed releases. Both the prompt and delayed components were varied to produce 63 total scenarios (example: 0.1% gas vent prompt release and no delayed release). Dilution factors for the resulting plume reaching a selected subset of IMS monitoring stations were found using HYSPLIT, and the simulation was repeated over a period of 366 days. Detection frequency and ratios for multi-isotope detections are presented for the various release scenarios.
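Once a released inventory and per-station dilution factors (in practice obtained from HYSPLIT runs) are available, the detection bookkeeping reduces to multiplying the release by the dilution factor and comparing the predicted concentration with each station's minimum detectable concentration. The release, dilution factors and MDC in the sketch below are synthetic values, not results from the study.

```python
# Detection counting from dilution factors: concentration = release * DF,
# a "detection" is counted when it exceeds the station MDC. Synthetic values.
import numpy as np

rng = np.random.default_rng(2)
release_bq = 1.0e13                              # assumed Xe-133 release [Bq]
mdc_bq_m3 = 2.0e-4                               # typical-order MDC [Bq/m3]

# synthetic dilution factors [1/m3] for 5 stations over 30 days
dilution = 10 ** rng.uniform(-22, -16, size=(5, 30))
concentration = release_bq * dilution             # [Bq/m3]

detections = concentration > mdc_bq_m3
print("detections per station:", detections.sum(axis=1))
print("days with at least one detection:", int(detections.any(axis=0).sum()))
```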
Current noble gas detection systems for nuclear explosion monitoring are based on the detection of four radioxenon isotopes – Xenon-131m, -133, -133m and -135. The data provided by radioxenon detection could be enhanced by other radionuclide signatures, such as Argon-37. Activation of Calcium-40 in rock by neutrons produces Argon-37, and monitoring for this additional nuclide could help distinguish detections of nuclear explosions from background sources, like medical isotope production. This work studies the capabilities of a hypothetical Argon detection network. A 10 kt explosion was modeled using MCNP and SCALE to determine the inventory of Argon-37 created in a representative granite rock layer, assuming either 0.1, 1 or 10% of the total inventory was released. The Argon-37 inventory was combined with atmospheric transport data from HYSPLIT compiled in a previous study, along with the detection limits of standard Argon-37 detection systems, to determine how many hypothetical monitoring stations would detect Argon-37 from an explosion. This method was repeated for 365 HYSPLIT data sets to create a year’s worth of hypothetical explosions, releases, and detections. The study quantified the average number of detections per release, the number of stations detecting Argon-37, and the possibility of detecting Argon-37 in coincidence with Xenon.
The International Monitoring System (IMS) noble gas network has proven to be highly reliable, with many years of routine measurements sent to the International Data Centre. With the deployment of noble gas systems, international experts began to routinely see a noble gas background. This background has persisted over the years: every single day, a radioxenon measurement is reported in the IMS. These backgrounds come from civilian sources such as nuclear power plants and fission based medical isotope production facilities. An effort is under way to measure these releases at their source, so that experts can understand the background and its impact on the IMS. This work will highlight the STAX project effort to measure the sources and the data's use in understanding radioxenon measurements in the IMS.
As part of the Xenon Environmental Monitoring at Hartlepool (XENAH) collaboration, a team of scientists from the UK, US and Sweden have deployed three radioxenon sampling and measurement systems in the north of England, near the Hartlepool Power Station. The power station comprises two 1600 MW(th) advanced gas-cooled nuclear reactors. The array of SAUNA QB (“cube”) radioxenon measurement systems has been in operation since March 2022 and has detected Xenon-133, Xenon-131m and Xenon-133m, isotopes of xenon that can be used as key indicators of a nuclear explosion. This collaboration seeks to better understand the impact of civil nuclear reactors on the global radioxenon background. This regional array of sensors offers improved location reconstruction accuracy compared with global networks, such as the International Monitoring System of the Comprehensive Nuclear-Test-Ban Treaty. This work will present the latest information from the XENAH QB array, including an in-depth radionuclide and atmospheric transport modelling analysis of the array sensor data.
During the testing phase of the Xenon International radioxenon monitoring system in Knoxville, Tennessee, USA, there were observations of non-traditional xenon isotopes: Xenon-125, Xenon-127, Xenon-129m, and Xenon-122 (via the decay of Iodine-122). While the production mechanisms for non-traditional isotopes were hypothesized, it would be beneficial to perform a complete study on the production scenarios for the non-traditional xenon isotopes compared to the standard radioxenon isotopes of Xenon-135, Xenon-133, Xenon-133m, and Xenon-131m. One production mechanism that is of particular interest following the observations of Xenon International is a spallation neutron source. While there are several spallation neutron sources around the world, the production of non-traditional radioxenon isotopes depends on parameters like the target material, beam energies and gas abatement. We have investigated the production mechanisms of the non-traditional isotopes and developed a model for predicting the amount of non-traditional xenon isotopes compared to traditional xenon isotopes that are produced through methods like neutron spallation.
It has previously been reported at Science and Technology conferences that PNNL has measured Argon-39 at historic underground nuclear explosion (UNE) sites at the Nevada National Security Site in gas samples from the shallow (a few meters or less) subsurface soil. Considerable Argon-39 was observed at the UNE sites sampled. Thus, the detection of Argon-39 in such samples at levels sufficiently above background can help identify possible UNEs, though Argon-39's long half-life precludes constraining when the UNE occurred. While a published Argon-39 background value exists for atmospheric air (16.6 mBq/m3 whole-air equivalent), there were no published Argon-39 background values for shallow subsurface air samples. We report on such measurements at a number of locations across the western United States of America, in an attempt to characterize the range of backgrounds that might exist. The measured concentrations varied from the published atmospheric concentration to about 3.5 times that value. The measurements, analysis, locations and results will be described and compared with measurements taken at UNE locations.
Given that the on-site inspection area should not exceed 1000 km2, finding the suspected nuclear explosion site is a difficult task requiring extremely accurate methods of assessment. One of the approaches used for nuclear explosion monitoring is to consider sources of radioxenon. Currently, mainly nuclear power plants and medical isotope production facilities are considered as relevant and well-known sources of radioxenon that can impact International Monitoring System observations. The aim of this work is to investigate the possible radioxenon emission during SNF reprocessing caused by spontaneous fission of heavy elements. Curium-244 and Plutonium-240 are identified as the main sources of spontaneous fission. Their presence in SNF leads to the formation of radioxenon and I-131, which are released during head-end operations. Approximate calculations were performed to quantify the amount of radioxenon formed and released during SNF reprocessing. It is estimated that the maximum release of radioxenon may be in the order of GBq/day, depending on fuel burnup and other parameters. Assuming the absence of an effective off-gas system, which would lead to the release of radioxenon into the environment, the results of the calculations show that industrial scale reprocessing plants should be considered as a weak but not negligible source of radioxenon.
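To make the order of magnitude concrete, a minimal Python sketch is given below; it combines an assumed Cm-244 activity per tonne of spent fuel with an assumed spontaneous fission branching ratio, cumulative Xe-133 yield, daily throughput and release fraction. Every numerical value is an illustrative placeholder, not a result from the study above.

```python
# Rough order-of-magnitude estimate of Xe-133 released at the head end of a
# reprocessing plant from spontaneous fission (SF) of Cm-244 in spent fuel.
# Every numerical value below is an illustrative assumption, not data from the study.

CM244_ACTIVITY_BQ_PER_T = 1.0e14   # Cm-244 activity per tonne heavy metal (assumed)
SF_BRANCHING_RATIO      = 1.4e-6   # fraction of Cm-244 decays that are spontaneous fissions (approx.)
XE133_CUM_YIELD         = 0.05     # cumulative Xe-133 chain yield per SF (assumed)
THROUGHPUT_T_PER_DAY    = 5.0      # tonnes heavy metal sheared/dissolved per day (assumed)
RELEASE_FRACTION        = 1.0      # fraction of in-grown xenon released (no abatement, worst case)

def xe133_release_bq_per_day():
    """At secular equilibrium the Xe-133 activity in the fuel equals its SF
    production rate, so the activity released per day of processing is roughly
    the SF rate in the daily throughput times the cumulative yield."""
    sf_rate = CM244_ACTIVITY_BQ_PER_T * THROUGHPUT_T_PER_DAY * SF_BRANCHING_RATIO
    return sf_rate * XE133_CUM_YIELD * RELEASE_FRACTION

print(f"Illustrative Xe-133 source term: {xe133_release_bq_per_day():.1e} Bq per processing day")
```

Depending on burnup (i.e. the Cm-244 content), throughput and off-gas retention, estimates of this kind span several orders of magnitude, which is consistent with the order-of-magnitude character of the figure quoted above.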
The SPALAX-NG was qualified in 2021 for deployment at International Monitoring System stations and has been operated for several years by CEA/DAM. It is a major evolution compared to the first generation SPALAX, offering detection limits of 0.2 mBq/m3 in eight hours for all the relevant xenon radioisotopes, high resolution beta-gamma detection, and modularity. This presentation provides feedback on the use of the system in an operational context and a study of several relevant detections observed during the qualification periods.
Argentina is one of the world’s producers of radioisotopes, and the construction of the RA-10 reactor responds to the increase in global demand. Increased production of radioisotopes could lead to greater releases of noble gases. This modern reactor is designed as a multipurpose facility suitable for radioisotope production, material and fuel irradiation research and neutron techniques applications. It is planned to increase the weekly production of Mo-99 from 900 to 3500 six-day curies. During the Mo-99 purification process, fission gases containing Xenon-133 and Xenon-135 are released into the atmosphere. The design of the production plant includes improved engineering and the devices necessary to minimize the emission of noble gases. The International Monitoring System uses fission gases such as Xenon-133 and Xenon-135 to monitor the Earth for signs of a nuclear explosion. The production of medical isotopes is the main contributing factor to the background of radioxenon in the atmosphere, and these emissions pose a potential problem for monitoring nuclear tests if not addressed. Technical discussions are needed on the impact of radioisotopes released by civilian sources on monitoring nuclear explosions and on how to maintain the detection capability of the International Monitoring System.
This work describes a method for estimating the location, moment and the amount of an unknown atmospheric release of radioactivity, based on detections. First, the source location is estimated by a probability density function based on the time-integrated correlation coefficient between model calculations and measurements of radioactivity concentrations. Model results are calculated with the adjoint transport equation, thereby reducing the number of atmospheric transport simulations to the number of detections. Next, the moment of release is defined as the moment when the correlation coefficient reaches its highest value. Finally, the released quantity is estimated by a least-squares method in which the residual is defined as the difference between observed and modelled concentrations. The method is validated with the ETEX-1 experiment and applied to the case of Ru-106 detections in Europe in September 2017.
In addition, apart from accurate models, proper training in their use is a prerequisite for adequate radiological assessment. A methodology is therefore presented for realistic emergency response exercises that can be used for scenarios with both unknown and known release locations. The method allows real-time simulation of measurements from several operational detectors and measurement teams, thereby adding realism to training exercises.
Reference: https://doi.org/10.1016/j.jenvrad.2021.106643
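For readers who want to see the final estimation step in concrete form, the following minimal sketch (with synthetic placeholder numbers) shows the correlation-based score and the least-squares estimate of the released quantity once the unit-release (adjoint) model concentrations at the detectors are available.

```python
import numpy as np

# Synthetic placeholders: observed concentrations at N detections and the modelled
# concentrations for a unit release (1 Bq) from one candidate source location and
# release time, as produced by the adjoint transport calculations.
c_obs  = np.array([3.2e-3, 1.1e-3, 4.5e-4, 2.0e-3])       # Bq/m3 (assumed)
m_unit = np.array([2.9e-12, 1.0e-12, 5.1e-13, 1.8e-12])   # Bq/m3 per Bq released (assumed)

# Correlation coefficient between model and measurements; its maximum over
# candidate locations and release times selects the source location and moment.
corr = np.corrcoef(c_obs, m_unit)[0, 1]

# Least-squares estimate of the released quantity Q minimizing
# sum_i (c_obs_i - Q * m_unit_i)^2, i.e. Q = (m . c_obs) / (m . m)
Q = float(np.dot(m_unit, c_obs) / np.dot(m_unit, m_unit))

print(f"correlation = {corr:.2f}, estimated release = {Q:.2e} Bq")
```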
Of the existing nuclear reactors in the world, the region of the Group of Latin America and the Caribbean (GRULAC) currently has a total of six nuclear power plants, operated by Argentina, Brazil and Mexico, and 16 research reactors, operated by Argentina, Brazil, Chile, Colombia, Jamaica, Mexico and Peru. Using the RNToolkit web software, which allows quick access to data from the International Monitoring System network, this work aims to evaluate the detections of stations ARP01, ARP03, BRP11, CLP19, ECP24, RN31 and PAP50 in order to monitor the behavior of radionuclides detected in the GRULAC region, influenced by the meteorological conditions of the region, where different wind regimes are established, including the subtropical anticyclones of the South Atlantic and South Pacific, the low level jets to the west and east of the Andes, and the trade winds. The evaluation period was from 2019 to 2021, with particular reference to the month of April, when the lockdown began in the GRULAC region.
As the National Data Centre of Japan, we are responsible for data analysis and management of radionuclide monitoring stations. There are two International Monitoring System radionuclide monitoring stations in Japan: Takasaki, for particles and noble gases, and Okinawa, for particles. The detection frequency of Cs-137 is relatively high at both stations. In this study, we focus on Cs-137 detections and characterize the season and pressure pattern associated with each detection. In addition, back-trajectory analysis using an atmospheric transport model will be conducted to estimate the release source of Cs-137, by comparison with International Monitoring System data including the surrounding stations.
The Research Organization for Nuclear Energy analyses the dispersion of radioactive air releases using the GENII V.2 software. The GENII V.2 calculations had not previously been validated against direct and indirect field measurements. Validation of the GENII calculation of I-131 activity concentration was carried out by comparing the calculated I-131 activity concentrations with concentrations from direct and indirect (charcoal) measurements in the settlement. For the direct method, a portable I-131 monitor was mounted on the roof of a car to measure outdoor air. For the indirect method, filter paper and gas samples were collected using a regulated vacuum pump at a flow rate of 25 lpm, operated for periods of one hour up to 24 hours. The filter paper and charcoal were counted in situ with a NaI(Tl) detector. The I-131 activity concentrations from the GENII dispersion analysis were closer to the results of the direct measurement method than to those of the indirect method.
The global atmospheric background of anthropogenic radionuclides hinders the ability of the Treaty monitoring community to identify possible signatures of a nuclear explosion. In particular, the atmospheric radioxenon background, produced and sustained by civil nuclear facilities such as isotope production facilities (IPFs) and nuclear power plants (NPPs), causes detections on the International Monitoring System (IMS) every day. Discriminating between the signatures of the civil background and those of nuclear explosions is essential to improve the analysis and interpretation of samples collected by monitoring facilities. To do this effectively, a better understanding of the possible emission profiles from NPPs is required. Based on simulations performed at the UK National Data Centre, we discuss the potential stable xenon isotopic signatures of NPPs, as well as how this stable gas composition differs from both the natural background and from an expected nuclear explosion signature. An assessment of the effects of the operational parameters of an NPP on the stable gas composition is also discussed.
Radioxenon is a gaseous fission product produced during nuclear weapons testing. Besides nuclear testing, the largest radioxenon emissions come from medical isotope production facilities, which emit 10^9–10^13 Bq/day, while nuclear power plants emit about 10^9 Bq/reactor/day. However, recent developments in generation-IV reactors indicate that the potential radioxenon release from a single molten salt reactor (MSR) could be higher than that from a conventional nuclear power plant (NPP) and could reach approximately 10^10 Bq/day of Xe-133. The aim of this work was to provide an update on MSR development around the world and to estimate the likelihood of an increase in overall radioxenon emissions. A literature review showed that MSRs could be used for two main purposes: electricity production and minor actinide transmutation. China (TMSR), Canada (IMSR400, SSR-W), the United States (KP-FHR, Mk1 PB-FHR, LFTR, and Thorcon), the United Kingdom (SSR-U), Denmark (CMSR), and Indonesia (Thorcon) are all working on MSRs for electricity generation with capacities ranging from 40 MW(t) to 600 MW(t), while the Russian Federation plans to use an MSR for minor actinide transmutation. According to this research, the first increase in global radioxenon emissions due to MSR operation could be expected in 2028.
Nuclear event radionuclide detection is the next step towards complementing data about any abnormal event recorded by the seismic, hydroacoustic and infrasound station networks prior to requesting approval of an on-site inspection (OSI). Radionuclides travel hundreds of kilometres away from their source under favourable meteorological conditions. In order to determine and assess anthropogenic radionuclide emissions and reliably distinguish them from the global nuclear test fallout background or from previous emissions from nuclear facilities, it is advantageous to establish baseline (“zero point”) anthropogenic radionuclide isotopic compositions and activity values around operating nuclear facilities. In this work, radiochemical radionuclide separation and alpha-, gamma- and mass-spectrometry measurement techniques were combined in order to determine and assess anthropogenic radionuclide activity and isotopic composition in soil samples within a 70 km radius around the Astravets nuclear power plant (NPP), in Lithuanian territory. Alpha spectrometric measurements were performed with a state of the art Ortec alpha spectrometer, while gamma spectra were recorded by SILENA gamma-spectrometric systems with HPGe coaxial detectors. Pu isotopic ratios were measured by a sector field mass spectrometer. The Cs-137/Pu-239+240, Pu-238/Pu-239+240 and Pu-240/Pu-239 isotopic ratios revealed that global Northern hemisphere nuclear test fallout prevails at the major sampling sites within a 70 km radius around the Astravets NPP in Lithuanian territory.
In 2017, the Government of Japan decided to make a voluntary contribution to further enhance the capabilities of the CTBTO verification regime. In that framework, two transportable noble gas systems were deployed in Horonobe and Mutsu, Japan. They respectively started operating in February 2018 and March 2018. Continued operation of the two systems is now financially supported by funding from European Union Council Decisions. As of today, a few hundred samples have been collected and measured in Mutsu and Horonobe. Spectra are automatically sent to the International Data Centre (IDC) and processed in a non-operational database. They are routinely reviewed with focus on the four xenon isotopes of interest for the CTBTO (Xe-131m, Xe-133m, Xe-133 and Xe-135). Analysis results and raw data are made available to States Signatories through the Secure Web Portal. This work presents the status of the ongoing measurement campaigns, together with preliminary analysis results of observations at both transportable systems.
The radioactive xenon isotopes Xe-131m, Xe-133, Xe-133m and Xe-135 are important indicators of an underground nuclear explosion. Knowledge of the concentrations and ratios of these isotopes that can be expected from natural processes is important for discriminating against a nuclear explosion during a CTBT on-site inspection. A series of measurements was performed between 2019 and 2022 under different weather conditions within a limited area in the region of Kvarntorp (Sweden), a location with known elevated uranium content in the ground. These studies aim to understand the variation due to, for example, meteorological conditions and radon concentrations, and to set an upper limit on the expected natural concentrations of the xenon isotopes in soil gas. The processing and transfer times have been optimized to increase the detection sensitivity for short lived isotopes, and the results from these campaigns will be presented.
Argon-37 is produced via neutron activation of stable argon or calcium in nuclear reactors. This isotope is used in on-site inspection (OSI) activities. Therefore, its atmospheric background from nuclear reactors is an important parameter in interpreting OSI measurement results. Since its measurement is complicated, only a few reports exist on the amount of its production in reactors. The radioxenon isotopes or Ar-41 can be used as a proxy for Ar-37 if both are produced through neutron activation. The activity ratio of the proxy to Ar-37 can then be used to determine the Ar-37 source term. In this presentation the parameters affecting the simulation of these ratios are discussed, most importantly the cross-section. Results for different cross-sections are compared and discussed.
Monitoring compliance with the Comprehensive Nuclear-Test-Ban Treaty requires, among other things, accurate modelling of the transport of radionuclides in the atmosphere. Crucial to atmospheric transport models (ATMs) when dealing with radioactive particulates are several removal processes, such as dry and wet deposition. Wet deposition in particular plays an often dominant role in the total removal of radioactive particulates from the atmosphere, especially on CTBT-relevant timescales. Despite this, the simulation of wet deposition still remains difficult in large part due to uncertainties in the parameterization schemes used in state of the art ATMs. In this presentation, we show how better knowledge of wet deposition can improve nuclear-test-ban monitoring. For this, we perform atmospheric transport calculations with FLEXPART and also incorporate data from CTBTO’s International Monitoring System.
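To illustrate how sensitive the removal is to the parameterization, the sketch below applies a commonly used below-cloud scavenging form, Lambda = A * I^B, with illustrative coefficient values (they are not FLEXPART's operational defaults) and simple exponential removal of the particulate activity.

```python
import numpy as np

def scavenging_coefficient(rain_rate_mm_h, A=8.4e-5, B=0.79):
    """Below-cloud scavenging coefficient (1/s) of the common form lambda = A * I**B.

    A and B are illustrative values only; operational ATMs use scheme- and
    species-dependent coefficients that are a major source of uncertainty."""
    return A * rain_rate_mm_h**B

def remaining_fraction(rain_rate_mm_h, duration_s, **kwargs):
    """Fraction of particulate activity left in an air parcel after rain of the
    given intensity and duration, assuming simple exponential removal."""
    lam = scavenging_coefficient(rain_rate_mm_h, **kwargs)
    return np.exp(-lam * duration_s)

# Example: 3 hours of 2 mm/h rain, for three different choices of the coefficient A
for A in (4e-5, 8.4e-5, 2e-4):
    f = remaining_fraction(2.0, 3 * 3600.0, A=A)
    print(f"A={A:.1e}: {100 * (1 - f):.0f}% of the particulate activity removed")
```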
An activity evolution model, which goes from the release of an assumed underground nuclear explosion, through atmospheric transport modelling (ATM), to sample collection and measurement, can be used to link various quantities, including the activities released from a nuclear event, the activity concentrations in a plume over an International Monitoring System (IMS) station and the activities collected in samples. The concentration profile at an IMS station can be estimated using an assumed release scenario and ATM forward simulations. The activities collected in samples are determined by spectrum analysis, and the concentrations are then estimated based on an assumption about the concentration profile. Generally, concentrations are assumed constant during sampling. This can be a challenge for isotopes whose half-life, such as that of Xe-135 (9.14 hours), is short compared to the collection duration of 12 hours. In this work, the decay correction during sampling is investigated using two approaches: 1) the collection duration is divided into multiple intervals, within each of which a constant concentration is assumed; 2) the activity collected in the sample is obtained from an analytical solution of the ordinary differential equations describing activity decay and accumulation. The impacts on isotopic ratios are demonstrated for three cases: Ba-140/La-140, Xe-133m/Xe-133 and Xe-135/Xe-133.
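The two approaches can be compared directly for a short-lived isotope such as Xe-135. The sketch below, with assumed sampling parameters, implements the subdivided-interval approximation and the analytical solution, and also shows how far a naive no-decay assumption deviates from them.

```python
import numpy as np

HALF_LIFE_XE135_H = 9.14
LAM = np.log(2) / HALF_LIFE_XE135_H          # decay constant (1/h)

def collected_activity_analytic(c, flow, T, lam=LAM):
    """Activity (Bq) on the trap after sampling for T hours at flow (m3/h) from air
    with constant activity concentration c (Bq/m3): analytical solution of
    dN/dt = flow*c/lam - lam*N, converted back to activity."""
    return flow * c * (1.0 - np.exp(-lam * T)) / lam

def collected_activity_intervals(c, flow, T, n_intervals, lam=LAM):
    """Approach 1: split the collection into n_intervals with constant concentration
    in each, decaying each sub-collection to the end of sampling."""
    dt = T / n_intervals
    t_mid = (np.arange(n_intervals) + 0.5) * dt
    return np.sum(flow * c * dt * np.exp(-lam * (T - t_mid)))

c, flow, T = 1.0e-3, 1.0, 12.0               # assumed: 1 mBq/m3, 1 m3/h, 12 h collection
a_exact = collected_activity_analytic(c, flow, T)
a_naive = flow * c * T                       # no decay correction during sampling
for k in (1, 4, 24):
    print(f"{k:2d} intervals: ratio to analytic = {collected_activity_intervals(c, flow, T, k) / a_exact:.3f}")
print(f"naive / analytic = {a_naive / a_exact:.3f}")
```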
In this study we have used open access data from the NASA Earth Science Data System and meteorological data from the International Monitoring System (IMS) stations data set to investigate correlations with radionuclide detections at the RN43 station in Nouakchott. Data fusion between IMS data and open access data can be considered a tool for National Data Centres to understand detections and levels at IMS stations. Furthermore, it can be used to verify the seasonality of anomalous detections. Some suggestions on how to introduce new parameters and data in the civil and scientific applications section of the CTBTO secure web portal will also be presented. This work has been performed in the framework of a joint collaboration between the Italian and Mauritanian National Data Centres.
Detection of radionuclides generated from underground nuclear explosions depends, first and foremost, on the movement of nuclides from source to detector. The subsurface environment therefore has a critical role in determining how and when material signatures become detectable. Field-scale tracer experiments are essential to furthering scientific understanding of how the subsurface contains nuclear explosion signatures. However, the inaccessibility of the below ground environment remains a significant hurdle in using such experiments to validate material transport models. Scientists at the Pacific Northwest National Laboratory have collected a large geological site characterization data set using a dense multi-phenomenological sensor network with the objective of testing new methods for evaluating rock properties needed for validating subsurface transport models. By combining electrical conductivity tomography and seismic velocity measurements of this site, three-dimensional maps of the testbed porosity, permeability and water saturation were estimated. These mappings served as the framework for transport modeling at the site. Results of these simulations have been compared to experimental injections of nitrogen gas in the field to evaluate the joint inversion methodology.
Noble gas systems at International Monitoring System (IMS) radionuclide stations have historically operated with 24 or 12 hour sampling times. The next generation noble gas systems use shorter sampling periods. At station RN33 on Mount Schauinsland, Germany, a SPALAX system with 24 hour sampling is operated by BfS. A Xenon International system with a six hour sampling time was installed in parallel from July 2021 to April 2022. The main emitter contributing to elevated xenon activity concentrations at RN33 is the medical isotope production facility at Fleurus, Belgium. In a former study, backward ATM for SPALAX samples with Xe-133 activity concentrations above 2 mBq/m³ showed that the majority of source-receptor sensitivity (SRS) fields coincide in the region around Fleurus, indicating the presence of a repeating emitter. In this study we investigate how the increased time resolution in sampling and ATM changes the location capability of backward ATM. For this, the Lagrangian particle dispersion model HYSPLIT (NOAA-ARL) is applied, driven by ECMWF ERA5 meteorological data at hourly resolution. As the Xenon International system also allows additional detections, particularly of Xe-135 and the metastable isomers, the sensitivity to unknown additional sources is potentially improved and is analysed.
Radon is a naturally occurring radioactive gas produced as a decay product of uranium and thorium in geologic media. Radon is therefore a nearly ubiquitous gas background in the subsurface environment, as well as in many building materials derived from earth materials. In the context of nuclear explosion monitoring, radon as a background influences radiation detector sensitivities as an interfering signal and its presence in whole air gas samples is also an indicator of atmospheric and subsurface air mixing. In recent years, radon has been of increased interest as a signature of underground explosions with the notion that earth damaging activity is likely to release trapped geogenic gases. Pacific Northwest National Laboratory scientists have utilized a mesoscale geologic testbed in New Mexico outfitted with eight multi-interval gas sampling boreholes to study subsurface gas migration. As part of this work, multiple series of radon measurements have been made within the more than 50 sampling locations before and after small chemical explosive detonations. In conjunction with other measurement and characterization techniques employed at the site, field results were used to assess the local variability of radon gas in the testbed as well as quantify changes in radon background levels resulting from explosively generated damage.
P-to-S receiver functions of 10 broad-band seismographs installed along the geologically complicated edge of the Ethiopian plateau and the active Main Ethiopian Rift were examined to image the crustal structure beneath the region. Receiver functions were determined using the time domain iterative deconvolution method to calculate the Moho depth and Vp/Vs of the crust. Results indicate that the Moho depth beneath the Northwest plateau, the Central Main Ethiopian rift, and the Southeastern plateau is 36–44 km, 36–38 km, and 40–44 km, respectively. A very high Vp/Vs > 2.0 is observed beneath the Enewari depression at the NW plateau at the depth range of ~ 30–40 km under a high velocity material. Likewise, a similar high Vp/Vs material is also found beneath the rift axis at the depth range of ~ 30–46 km beneath a high velocity solidified material. These high Vp/Vs ratios at the top of the lower crust in the Northwest plateau and MER are inferred to be seismic signatures of a low Vs partial melt material. The high Vs and the low Vp/Vs material above these high Vp/Vs materials might be solidified magmatic material.
The Earth is a dynamic planet with abundant vibrating processes. Besides earthquakes, volcanoes and other activities, there is a special type of source called a persistent localized microseismic source, with long period, almost harmonic signals and a fixed location. The 26s (0.038 Hz) and 28s (0.036 Hz) tremors in the Gulf of Guinea are two typical persistent localized microseismic sources, but their generation mechanisms are still enigmatic. Moreover, understanding the behaviors of these two sources helps to reduce their interference in ambient noise tomography. We implemented an algorithm to detect events in the persistent localized microseismic signals over the past 30 years, and then performed statistical analysis of the magnitude-cumulative number and inter-event time distributions. We found that the magnitude distribution is similar to the Gutenberg-Richter relation (G-R relation or power law) and that the distribution of intervals between events is consistent with a Poisson process. We propose that the two sources are probably related to underground complex crack networks featuring fractal characteristics and are dominantly driven by temporally random dynamic processes. However, ocean swell might affect the 26s source occasionally, while the primary microseism seems to modulate both persistent localized microseismic sources.
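As a simple illustration of the two statistical checks mentioned above, the sketch below estimates a Gutenberg-Richter b-value with the Aki maximum-likelihood formula and tests the inter-event times for Poisson (exponential) behaviour; the catalogue and its parameters are synthetic placeholders, not the detected tremor events.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a detected-event catalogue: Poissonian occurrence times
# and Gutenberg-Richter-distributed magnitudes above a completeness magnitude.
rate_per_day, b_true, m_c = 2.0, 1.0, 2.0
n_events = 5000
inter_times = rng.exponential(1.0 / rate_per_day, n_events)          # days
mags = m_c + rng.exponential(1.0 / (b_true * np.log(10)), n_events)  # G-R magnitudes

# Maximum-likelihood b-value (Aki formula): b = log10(e) / (mean(M) - Mc)
b_est = np.log10(np.e) / (mags.mean() - m_c)

# Poisson-process check: inter-event times should be exponential, i.e. their
# coefficient of variation (std/mean) should be close to 1.
cv = inter_times.std() / inter_times.mean()

print(f"estimated b-value: {b_est:.2f} (input {b_true})")
print(f"inter-event time CV: {cv:.2f} (about 1 expected for a Poisson process)")
```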
The purpose of this research is to evaluate the performance of the RsttMndc model (an enhanced RSTT model for Mongolia) as a regional velocity model, and of the minimum 1D velocity model obtained for the Hangay region as a local model, in relocating events that occurred in the Hangay region in the central part of western Mongolia, using the iLoc4.0 location software and comparing the results with relocations based on the RSTT model. The Institute of Astronomy and Geophysics of Mongolia, in collaboration with Lehigh University in Pennsylvania, USA, operated a temporary seismic network of 72 stations in the region, known as the Hangay experiment, for two years. We identified events satisfying the GT5 criterion from among the more than 8000 events that occurred in the Hangay region, within latitudes 44°-50° and longitudes 95°-104°, between 2012 and 2014. There were GT5 events recorded at local distances. In order to show the impact of the RSTT models, we also selected events recorded at regional distances. The results of relocating the GT5 events with the different models will be discussed. The epicentres relocated with the local and regional velocity models are mostly distributed and clustered along the main faults in the Hangay region.
Machine Learning for Travel Time emulation (MaLTTe) is a machine learning method and computer code for emulating seismic-phase travel times based on a three-dimensional (3-D) Earth model. Greater accuracy of travel time predictions using a 3-D Earth model is known to reduce the bias of event location estimates and to improve the process of associating detections to events. However, practical use of 3-D models is challenged by slow computational speed and the unwieldiness of pre-computed lookup tables. MaLTTe uses the XGBoost method and trains on pre-computed travel times, resulting in a compact and computationally fast way to approximate travel times based on a 3-D Earth model. MaLTTe is trained using approximately 850 million P-wave travel times based on the LLNL-G3D-JPS model, from randomly sampled event locations to 10,393 global seismic stations. After training, the MaLTTe code is approximately 10 Mbytes in size and travel times are computed in approximately ten microseconds on a single CPU. The currently achieved prediction accuracy is approximately 0.2 seconds, which is significantly smaller than the inherent accuracy of the 3-D model. With additional development, MaLTTe will enable easy use of 3-D models in routine seismological processing and analysis.
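The emulation idea can be reproduced in miniature with a gradient-boosted regressor; the sketch below is not the MaLTTe code and does not use a 3-D Earth model, but it shows the training/prediction pattern on a synthetic travel time table with assumed features and sizes.

```python
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(1)

# Synthetic stand-in for a pre-computed travel time table: features are source
# latitude, longitude, depth and station latitude, longitude; the target is a
# crude distance-based "travel time" plus noise (placeholder physics only).
n = 50_000
X = np.column_stack([
    rng.uniform(-90, 90, n),      # source latitude (deg)
    rng.uniform(-180, 180, n),    # source longitude (deg)
    rng.uniform(0, 700, n),       # source depth (km)
    rng.uniform(-90, 90, n),      # station latitude (deg)
    rng.uniform(-180, 180, n),    # station longitude (deg)
])
dist_proxy = np.hypot(X[:, 0] - X[:, 3], X[:, 1] - X[:, 4]) * 111.0   # rough km
y = dist_proxy / 10.0 + X[:, 2] / 50.0 + rng.normal(0, 0.2, n)        # "travel time" (s)

# Train on most of the table, evaluate on a hold-out slice
model = xgb.XGBRegressor(n_estimators=400, max_depth=8, learning_rate=0.1,
                         subsample=0.8, tree_method="hist")
model.fit(X[:45_000], y[:45_000])

pred = model.predict(X[45_000:])
rmse = float(np.sqrt(np.mean((pred - y[45_000:]) ** 2)))
print(f"hold-out RMSE: {rmse:.2f} s")
```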
iNSPIRE (iNtegrated Software Platform for Interactive Radionuclide rEview) is a software application developed by the CTBTO International Data Centre (IDC), based on a modern Python/Qt framework. The first release, which covers the beta-gamma coincidence-based detection systems in use at IMS stations (current and next generation noble gas technologies), was deployed in IDC operations and delivered to National Data Centres (NDCs) in late 2020.
Over 2021-2022, the IDC developed phase 2 of the iNSPIRE project, extending the software functionalities to IMS particulate and HPGe-based SPALAX noble gas systems, with the aim of completing the migration to open source and unifying the radionuclide analysis software tools for particulate and noble gas data at the IDC and in the NDC-in-a-Box package.
Additional goals include (a) moving to modern software development technologies, (b) optimizing software lifecycle management (shared libraries, same language), and (c) enhancing software modularity for integrating new technologies at IMS stations and new analysis methods.
This contribution illustrates the functionalities implemented for particulate analysis.
The Standard Event Lists SEL1, SEL2 and SEL3 are automatically produced one, four and six hours, respectively, after the waveforms arrive at the International Data Centre (IDC) from the International Monitoring System (IMS) stations. IDC analysts then interactively review SEL3 to generate the Late Event Bulletin (LEB), from which the Reviewed Event Bulletin (REB) is formed. Comparison of SEL3 and the LEB shows that about 20% of LEB events are not present in SEL3 and are added manually by the IDC analysts (SEL3 and LEB events are matched using the database field evid, the unique event identifier that is maintained even when the event parameters are modified). These events are commonly referred to as events "missed" by the automatic system. One way to reduce the number of missed events is through configuration parameter optimization. Python scripts were developed to identify missed events and store them in a separate MySQL database created specifically for in-depth analysis of their parameters. Possible causes of missed events have been investigated through statistical analysis of the event parameters and their geographical distribution, in order to further improve the performance of the IDC's automatic data processing.
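A minimal sketch of the missed-event bookkeeping is shown below; it assumes the SEL3 and LEB event identifiers (evid) have already been exported from the IDC database, and it uses a local SQLite file rather than the MySQL schema mentioned above, so all table and column names are illustrative.

```python
# Minimal sketch: identify LEB events with no SEL3 counterpart ("missed" events)
# and store their evids locally for later statistical analysis. Table and column
# names are illustrative placeholders, not the IDC database schema.
import sqlite3

def find_missed_events(sel3_evids, leb_evids):
    """Return LEB evids with no counterpart in SEL3 (events added by analysts)."""
    return sorted(set(leb_evids) - set(sel3_evids))

def store_missed(evids, db_path="missed_events.sqlite"):
    """Store missed evids in a small local database for in-depth analysis."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS missed (evid INTEGER PRIMARY KEY)")
    con.executemany("INSERT OR IGNORE INTO missed (evid) VALUES (?)",
                    [(e,) for e in evids])
    con.commit()
    con.close()

if __name__ == "__main__":
    sel3 = [101, 102, 104, 107]              # placeholder evid lists
    leb = [101, 102, 103, 104, 105, 107]
    missed = find_missed_events(sel3_evids=sel3, leb_evids=leb)
    store_missed(missed)
    print(f"missed events: {missed}  ({100 * len(missed) / len(leb):.0f}% of LEB)")
```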
The International Monitoring System of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization continuously monitors the globe through seismic, infrasound, hydroacoustic and radionuclide measurements to detect signals from possible nuclear explosions. The most complete understanding of the sources of measured signals can be achieved by combining the information from all four monitoring technologies. The International Data Centre automatically forms combined event hypotheses from the three waveform-based technologies. The data fusion pipeline additionally combines waveform and radionuclide measurements to provide a complete view of possible fused events by correlating, in space and in time, the seismo-acoustic event locations with possible geographic source locations (Fields of Regard) for observations of CTBT-relevant radionuclides. These could be further studied by NDC analysts to investigate fused events that might indicate a nuclear test. In this presentation, we show the status of the data fusion pipeline and the browsing functionalities available to States Signatories. Ongoing developments as well as plans for future improvements will be discussed.
At SnT2023, the evolution and sustainment of the International Monitoring System (IMS) were discussed by a panel chaired by the current IMS Director, Xyoli Perez Campos, together with former IMS Directors Gerardo Suarez and Frederico Guendel and Seismic Section Chief Sergio Barrientos.
The panelists examined the trajectory of the IMS from inception to its current state, and projected potential future directions.
The panelists highlighted that initial opposition to the IMS was overcome and it successfully proved its capability and efficiency.
A key topic discussed was the significance of IMS sustainment. The necessity to expand the Database of the Technical Secretariat (DOTS) to include probability statistics and failure data as well as budget information was underscored.
The challenges faced during the installation and certification of stations, supporting Member States in sustaining Auxiliary Seismic (AS) stations, as well as issues in the recapitalization of hydroacoustic stations, were noted.
The panel concurred that the IMS requires a robust budget to ensure its continued operation and improvement. They agreed that the inclusion of failure analysis and learning from other network operators and suppliers, including addressing obsolescence, would be beneficial.
The discussion underlined the enduring relevance of IMS, its successful growth despite early challenges, and the vital importance of continuous adaptation and improvement in the face of future operational and technological hurdles.
Discussion among the panelists of the following questions:
- What is the purpose of exercises in the development of OSI capabilities?
- Why do we need to validate OSI capabilities in an integrated manner?
- Is there another way to validate OSI capabilities?
On 20 December 2020 an earthquake shook Gaborone city and its outskirts. This earthquake was recorded by some stations of the Botswana Seismological Network and also by some stations of the International Monitoring System. Vertical component seismograms from these stations were analysed using the Geotool software from the Comprehensive Nuclear-Test-Ban Treaty Organization and the Regional Seismic Travel Time (RSTT) model to refine the location parameters of this earthquake. The Geotool results are: epicentre at latitude 24.714°S and longitude 25.973°E; origin time 10:44:46.5 (UTC); depth 12.1 km; body wave magnitude (mb) 4.2 and local magnitude (ml) 3.6. The RSTT results are: epicentre at latitude 24.708°S and longitude 26.036°E; origin time 10:44:46.622 (UTC); and depth 11.6 km. The location parameters agree well within uncertainties.
Keywords: Gaborone, Earthquake, Geotool, Regional Seismic Travel Time, Seismograms
The Bolivian Orocline is part of the Central Andes, where high compressional crustal strain has produced complex systems of geological faults that may be active and might generate destructive earthquakes, such as the 1998 Aiquile earthquake. Foreshocks and aftershocks generally occur, but their lower magnitudes make them difficult to detect. Nowadays, it is possible to detect seismic or explosion events using a seismic network deployed in a specific region, and any seismic station within the network will record seismic signals that can serve as waveform templates. Furthermore, those signals and the associated bulletins can serve as input to train neural networks to detect or classify natural or artificial events. The OSC-NDC presents how a daily routine workflow helped us detect seismic clusters using three seismic stations, and how we enhanced those detections by installing a temporary seismic network within the Bolivian Orocline and applying deep learning techniques. Results are promising: shallow, low magnitude earthquakes around the Bolivian Orocline are being detected, most of them belonging to a seismic cluster, and they might guide us in mapping the seismicity.
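One basic ingredient of such a detection workflow, independent of the deep learning models mentioned above, is matched filtering against known waveform templates. The sketch below implements a simple normalized cross-correlation detector on synthetic data; the sampling rate, template shape and detection threshold are all assumptions for illustration.

```python
import numpy as np

def normalized_xcorr(template, trace):
    """Sliding normalized cross-correlation of a template against a longer trace."""
    nt = len(template)
    t = (template - template.mean()) / (template.std() * nt)
    out = np.empty(len(trace) - nt + 1)
    for i in range(len(out)):
        w = trace[i:i + nt]
        out[i] = np.sum(t * (w - w.mean())) / (w.std() + 1e-12)
    return out

rng = np.random.default_rng(2)
fs = 50.0                                               # assumed sampling rate (Hz)
template = np.sin(2 * np.pi * 5 * np.arange(0, 1, 1 / fs)) * np.hanning(int(fs))
trace = rng.normal(0, 0.5, int(120 * fs))               # 2 minutes of noise
trace[int(40 * fs):int(41 * fs)] += 1.5 * template      # hidden repeat of the template

cc = normalized_xcorr(template, trace)
detections = np.where(cc > 0.6)[0] / fs                 # threshold is illustrative
print("candidate detection times (s):", detections[:5])
```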
Two seismic discontinuities, at 410 and 660 km, delimit the mantle transition zone (MTZ). These discontinuities were imaged under the north-west corner of South America using the receiver function technique and a seismological record of up to 30 years collected by the National Seismological Network of Colombia. Significant and spatially systematic variations in the discontinuity depths were observed. The mean depths are 412±5.3 and 671±5.9 km, with a mean MTZ thickness of 258±6.5 km, a value that does not differ greatly from the global average. The significant depth variations and the thicker MTZ in some areas result in a low correlation between the depths of the two discontinuities. We hypothesize that this thickness irregularity in the MTZ is due to the interaction between the Nazca and Caribbean tectonic plates during subduction beneath the South American lithosphere-mantle system. On the other hand, the MTZ thinning beneath the Nazca plate is possibly due to a regional thermal disturbance or a local plume under the Malpelo ridge. These observations and tomography images also support hypotheses regarding lateral variations in the thermal structure linked to the differences in age and composition of the Nazca and Caribbean plates at these depths.
A network of 20 seismogeodetic instruments was installed in the subduction zone of Mexico. The purpose of the data is to monitor crustal deformation of the subduction zone, to estimate the magnitude of great earthquakes, and to contribute to the tsunami warning system of Mexico operated by the Mexican Navy. Information generated by the network is transmitted in real time by dedicated L-band satellite antennas. The network is composed of instruments combining accelerometers and GNSS receivers. The strong motion data are sampled at 200 Hz. The GNSS data are processed in real time using RTX corrections and a recursive Kalman filter that uses the strong motion data to correct the real time GNSS data. The geodetic data are sampled at 2 Hz. The results show daily averages of ground displacement comparable to post-processed data that include orbital and atmospheric corrections. An algorithm was developed to estimate magnitude based on the post-seismic deformation of the coast. The data are monitored continuously, and the length of the fault is estimated from the extent of the crustal deformation. The magnitude is estimated from the fault area. Tsunami warnings would be based on the measured co-seismic deformation.
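As a one-dimensional illustration of the seismogeodetic combination described above, the sketch below runs a small Kalman filter in which the accelerometer drives the prediction step and the lower-rate GNSS displacement provides the update; the signal, noise levels and filter settings are assumed values, not those of the operational system.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated truth: a smooth 30 cm co-seismic displacement step (1-D, assumed)
dt, n = 1.0 / 200.0, 4000                                # 200 Hz strong motion, 20 s
t = np.arange(n) * dt
true_disp = 0.3 / (1 + np.exp(-(t - 10.0) * 2.0))
true_acc = np.gradient(np.gradient(true_disp, dt), dt)

acc_meas = true_acc + rng.normal(0, 0.02, n)             # noisy accelerometer (m/s^2)
gnss_every = 100                                         # 2 Hz GNSS epochs
gnss_meas = true_disp[::gnss_every] + rng.normal(0, 0.01, n // gnss_every)

# Kalman filter with state x = [displacement, velocity]
x, P = np.zeros(2), np.eye(2)
F = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([0.5 * dt**2, dt])                          # acceleration as control input
Q = 1e-6 * np.eye(2)                                     # process noise (assumed)
H = np.array([[1.0, 0.0]])
R = np.array([[0.01**2]])                                # GNSS noise variance (assumed)

est = np.empty(n)
for k in range(n):
    # predict with the accelerometer sample as control input
    x = F @ x + B * acc_meas[k]
    P = F @ P @ F.T + Q
    # update whenever a GNSS displacement epoch is available
    if k % gnss_every == 0:
        y = gnss_meas[k // gnss_every] - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
    est[k] = x[0]

print(f"final displacement estimate: {est[-1]*100:.1f} cm (truth {true_disp[-1]*100:.1f} cm)")
```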
Though nearly universally accepted, the CTBT has yet to enter into force. Before it can do so, the 44 countries listed in Annex 2 of the Treaty must sign and ratify it. It is clear that the Treaty’s entry into force and implementation will be in the hands of the next generation of leaders. Given that the CTBTO Youth Group (CYG) is a group of over 800 young specialists from all over the world who direct their careers towards contributing to global peace and security, they are well placed to bring new tools and technologies into promoting the CTBT and its verification regime. This paper aims to identify best practices and lessons learned from the 1963 Elysee Treaty between France and Germany that could be implemented by the CTBTO and the CYG. We found that the varied backgrounds and citizenships of CYG members provide an opportunity to raise awareness of the CTBT among different communities and in different countries, by putting the Treaty on the agenda of the many events and projects CYG members are part of, and to link the CTBT with other burning global issues, thereby bringing the Treaty closer to universalization.
The CTBT enjoys universal support in Africa. However, not all African states appreciate or utilise the benefits of closer cooperation with the CTBTO PrepCom to, for example, achieve the UN Sustainable Development Goals and the African Union's Agenda 2063. Hence, the paper intends to analyse and assess the status and utilisation of the CTBT in Africa. The aim is four-fold. The first aim is to position the CTBT in the development agenda of the continent. The second aim is to analyse the cooperation between African states and the CTBTO PrepCom, whereas the third aim is to explore areas of potential and closer cooperation. In the final instance, the aim is to determine the application of the CTBT regime for disaster management, the detection and management of nuclear waste, and the management of the legacy of nuclear tests in Africa, and its territorial waters.
This article is dedicated to the Science Diplomacy Club (SDC) project, an emerging platform for young professionals in the field of scientific and technological cooperation. Nurturing a new generation of experts in CTBTO-related issues, such as non-proliferation, disarmament and the nuclear test ban, remains a lynchpin of our activities.
The SDC policy is aimed at establishing a good rapport between high profile experts and young professionals. Among our traditional formats are “Career Talks” and “Presentation Contests”, where students have an excellent opportunity to explore the qualities of accomplished science diplomats, practice public speaking, and exchange views on advancing the CTBT.
Moreover, we recently held the 3rd Science Diplomacy School, which focused on building a nexus among students from all over the world. The participants were engaged in interactive lectures and worked in groups while researching increasingly crucial topics. The SDC also seeks to achieve gender equality by involving young women in the Club’s activities, which forms an integral part of its substantive contribution to furthering UN Sustainable Development Goal 5. In this work the authors have collected statistics about the members and activities of the SDC, which helped us analyse the efficiency of our organization in depth.
When the PTS was formed after the Treaty opened for signature in September 1996, the media landscape was very different. Social media platforms like YouTube, TikTok, Snapchat, Instagram, LinkedIn, Facebook, and Twitter did not exist. Most people around the world got their news from a handful of media companies (cable news outlets MSNBC and Fox News in the United States and Al Jazeera in Qatar launched in June, October, and November 1996, respectively). The digital information landscape was also very different. In 1994, two years before the Treaty opened for signature, there were 2,738 registered websites. Today there are more than a billion. Online publications, blogs, podcasts, viral videos, infographics, gifs, and animations are all ways in which people receive information, and in the case of breaking news people are just as likely to turn to Twitter (for now) as to the BBC. In 1996, the first members of Generation Z, now the largest in the world, were just being born. How do we cut through a crowded media landscape to mobilize stakeholders to advocate for the CTBT and its entry into force? How do we use modern communications tools to advance the nuclear non-proliferation and disarmament agenda?
The purpose of the National Data Centres for All initiative (NDCs4All) is to ensure equal distribution of Treaty benefits among States Signatories and to help them to receive International Monitoring System (IMS) data and International Data Centre (IDC) products.
Source parameters and source time functions of the five contained Democratic People's Republic of Korea underground nuclear explosions, 2009-2017, are obtained from MDJ seismograms by Bayesian inversion. The test site is a few kilometres across; therefore, the Earth impulse response, or Green’s function, from each event to a given distant seismometer is essentially the same. The Green’s function is estimated by deconvolving the seismogram with the modelled source time function. The likelihood of the Green’s function is defined as the reciprocal of the normalized root mean square difference between the Green’s functions obtained from two sources. Prior probability density functions (pdfs) are applied to the following independent source parameters: internal pressure, Poisson’s ratio, P-wave velocity and resonant frequency. The posterior pdf (PPD) is proportional to the likelihood of the Green’s function multiplied by the prior parameter probabilities. The estimated source parameters are obtained by a grid search for the maximum of the PPD. The estimated Green’s functions for the five Democratic People's Republic of Korea events of 2009-2017 at the MDJ seismometer are nearly identical, as they should be. The Green’s function from the test site to any other seismometer can then be obtained by deconvolving that seismogram with the now known corresponding source time function.
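The structure of the estimation loop (deconvolution, reciprocal-NRMS likelihood, prior-weighted grid search) can be sketched as follows; the source time function, the water-level deconvolution and all numerical values are simplified placeholders and do not reproduce the actual source model or data used in the study.

```python
import itertools
import numpy as np

rng = np.random.default_rng(4)

def source_time_function(t, pressure, resonant_freq):
    """Toy parametric source time function (stand-in for the physical source model):
    'pressure' scales the amplitude, 'resonant_freq' controls the ringing."""
    return pressure * np.exp(-t) * np.cos(2 * np.pi * resonant_freq * t)

def greens_function_estimate(seismogram, stf):
    """Toy water-level deconvolution of a seismogram by a source time function."""
    n = 2 * len(seismogram)
    S, U = np.fft.rfft(stf, n), np.fft.rfft(seismogram, n)
    wl = 0.01 * np.max(np.abs(S)) ** 2
    return np.fft.irfft(U * np.conj(S) / (np.abs(S) ** 2 + wl))[:len(seismogram)]

def nrms(g1, g2):
    """Normalized RMS difference between two Green's function estimates."""
    return np.linalg.norm(g1 - g2) / np.linalg.norm(g1 + g2)

# Two synthetic events sharing the same Green's function (the key physical assumption)
t = np.arange(0, 20, 0.05)
g_true = rng.normal(0, 1, t.size) * np.exp(-0.3 * t)
stf_ref = source_time_function(t, 1.2, 0.8)               # reference event, parameters "known"
seis_ref = np.convolve(g_true, stf_ref)[:t.size]
seis_unk = np.convolve(g_true, source_time_function(t, 0.8, 1.4))[:t.size]  # parameters sought
g_ref = greens_function_estimate(seis_ref, stf_ref)

# Grid search for the maximum of: posterior ~ prior(params) x 1/NRMS(Green's functions)
pressures = np.linspace(0.5, 2.0, 16)
freqs = np.linspace(0.2, 2.0, 10)
best_params, best_post = None, -np.inf
for p, f in itertools.product(pressures, freqs):
    g_trial = greens_function_estimate(seis_unk, source_time_function(t, p, f))
    post = 1.0 * (1.0 / (nrms(g_ref, g_trial) + 1e-9))     # flat priors in this sketch
    if post > best_post:
        best_params, best_post = (p, f), post
print("maximum-posterior parameters:", best_params, "(true values: 0.8, 1.4)")
```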
Besides earthquakes, other sources can generate seismic and other observable geophysical signals that could potentially be misidentified or misinterpreted as explosions. One common source of such signals is collapses, which, like explosions, are generally shallow and consequently have many identifying features similar to those of explosions. In this study, we collect, analyze and characterize collapses from around the world that are associated with mining activity or nuclear testing. While most of the collapses are from recent activity, a portion of the signals are derived from legacy seismic recordings of post-shot collapses made during the days of active nuclear testing. We present progress toward the development of a collapse source model, complementary to the more commonly recognized earthquake (e.g. Brune) and explosion (e.g. Mueller-Murphy) source models. We also discuss and test several identification methods for collapses, including cross-spectral (low-to-high frequency) ratios, coda-derived source spectra and event identification on the moment tensor hypersphere.
The International Data Centre routinely applies event screening or discrimination in order to characterize events as either natural or anthropogenic. A number of event discriminants are presented in the literature. At the Kenya National Data Centre (KE-NDC), a systematic, step-by-step event discrimination procedure is applied to seismic events. Results from the adopted discriminants are obtained within a short time, and the discriminants are easy and fast to use. The discriminants used at KE-NDC are ranked in a hierarchy based on complexity and on the speed with which they return results, allowing timely event discrimination and dissemination of results. The discriminants include (i) hypocentre parameters based on event relocation, (ii) magnitude determination, (iii) the mb:Ms criterion and (iv) focal mechanism determination. These discriminants and examples of event characterization will be presented at the conference.
Sanctions are often imposed on countries with nuclear programs, such as Iran. The impact of international sanctions and United States sanctions on Iran, specifically since the deterioration of the Joint Comprehensive Plan of Action, has been extensive. The sanctions have crippled Iran’s economy and caused shortages of food, medication and humanitarian aid. Accordingly, these sanctions have led to thousands of avoidable deaths in Iran, breaching the international right to health and right to life of Iranians. First, this presentation will outline how sanctions are not a viable solution for maintaining global peace. Instead, the Comprehensive Nuclear-Test-Ban Treaty (CTBT) provides a viable, long term solution for international safety. Accordingly, the United States and Iran must ratify the CTBT. Under international law, both the United States and Iran are under obligations to reduce the adverse effects of sanctions, which could include ratifying treaties such as the CTBT. Second, this presentation will discuss the positive outcomes stemming from Iran and the United States ratifying the CTBT. Lastly, this presentation will discuss avenues to raise awareness about the CTBT in the United States and in Iran. Specifically, social media and youth involvement will play an immense role in promoting the CTBT in both countries.
The provision in CTBT Article II, Subsection 8 could provide an interesting framework for leading collaborative initiatives aimed at nuclear weapon control, reduction or elimination. These goals must be pursued in a new context marked by new deployments of nuclear warheads in NATO member states, an incipient new nuclear arms race, the possibility of constructing new, undetectable weapons based on computer simulation, and the potential for nuclear arms proliferation. To support this CTBTO role, the main UN-agreed treaties and resolutions are reviewed. A comparison of the faculties provided in these treaties helps to appreciate the relevance of the CTBTO's leading provision. Two main diplomatic models are currently available: the first oriented towards prohibiting nuclear weapons, totally or partially, by resolution (the Treaty on the Prohibition of Nuclear Weapons (TPNW), nuclear-weapon-free zones, and bilateral and trilateral agreements, among others); the second, negotiations framed within Article VI of the Non-Proliferation Treaty. These two frameworks could be compatible. Our claim is that the CTBTO can lead, or contribute decisively to leading, negotiations in both frameworks. Additionally, a list of successful strategies that have been used to create nuclear momentum is compiled, and some possible collaborative initiatives are discussed.
The National Data Centres (NDCs) are crucial for advising their National Authorities on nuclear explosion monitoring, and in many cases, they play an important role as station operators and in transmitting IMS data to the CTBTO in Vienna. This panel will discuss the synergy between the CTBTO and regional experts. The PTS provides technical assistance and training for building the required national and regional capacity. NDCs can go beyond the capability of the IMS because they have access to regional network data that are useful for enhancing the monitoring capabilities. With their expertise, the NDCs provide feedback to enhance IDC products and services. The quality and effectiveness of Treaty monitoring benefit from this international cooperation.
ARISTOTLE-eENHSP (All Risk Integrated Trans-boundary Early-warning - enhanced European Natural Hazard Scientific Partnership) is a project financed by the European Civil Protection and Humanitarian Aid Operation (EC DG-ECHO) delivering real-time multi-hazard expert advice on worldwide natural disasters to the European Emergency Response Coordination Centre (ERCC).
ARISTOTLE-eENHSP was designed to offer a flexible and scalable system that can provide new hazard-related services. It is envisaged as a long-term operational, research, and cooperation plan building onto the proven expertise and multi-disciplinary partnership of world-leading scientific centres in Earth and Climate sciences.
Twenty-four national and international organisations are responsible for ARISTOTLE's three primary services, which address the ERCC's specific needs for particular situations and target regions: Emergency Response, Routine Monitoring, and the Scientific Technical Assistance Facility.
GeoSphere Austria, as project co-coordinator, is involved in all ARISTOTLE services. It is one of the two project coordinators in the Service Management Team and a key partner in the Strategic Coordination Team. Furthermore, it provides real-time services between the ERCC and partners when scientific advice is required before, during or after natural catastrophes, and provides scientific advice on earthquake and weather hazards. GeoSphere Austria is also task leader for the training provided to the ERCC and for service quality control.
Rapid full source characterization provides useful information after the occurrence of an event of interest such as a nuclear test. Techniques using full moment tensor inversions based on long-period seismic waveforms recorded at regional distances help to confirm the isotropic component of a seismic source, if any. Applied on a grid of potential locations and scanning continuous seismic waveforms, such techniques make it possible to implement a rapid detector of seismic events providing the full source information (origin time, location, magnitude, mechanism). The applications of such an implementation are numerous and range from rapid nuclear monitoring, to seismic monitoring for seismological laboratories, to the detection of large magnitude earthquakes for tsunami warning centers. Here, we present the developments made for the tool developed at CEA for the seismic monitoring of the North Korean region using only a limited number of seismic stations. We show its overall performance on past DPRK nuclear tests and regional earthquakes. We also show further advancements toward improving the monitoring of small magnitude (3.5 and below) earthquakes in France. At the other end of the magnitude range, the approach is also being implemented at the French National Tsunami Center for rapid detection of potentially tsunamigenic earthquakes located in the Mediterranean Sea.
Besides improved instrumentation and operational measurement analysis, the radionuclide technology used at National Data Centres (NDCs) also benefits from atmospheric transport modeling (ATM) in their mission to detect, discriminate and characterize nuclear explosions. One way to enhance NDC expertise is to provide ATM with a likely radionuclide source term emitted to the atmosphere following an underground nuclear explosion, based on evaluated scenarios.
We have developed a code with user-oriented digital tools to build such scenarios, qualitatively and quantitatively. STM_toolkit first computes the cavity radionuclide inventory for all the CTBT-relevant fission nuclides. Then, the nuclear test is configured by fusion of data from other technologies as well as tabulated data, leading to estimates of the cavity gas pressure and distance of migration, based on empirical laws derived from former nuclear tests. Rapid one-dimensional fluid mechanics calculations determine the gas released for i) prompt venting due to drill-back, with custom delay, or preexisting conduits or fractures, ii) late-time seepage due to barometric pumping. Gas release and cavity inventory determine the atmospheric source term converted to expected activity concentrations at monitoring stations thanks to empirically-parameterized simple ATM. Comparison with measurements allows the scenario to be refined before it is used for state-of-the-art ATM.
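As a numerical anchor for the first step (the cavity inventory), the sketch below estimates the Xe-133 activity produced by a low-yield explosion from the approximate number of fissions per kiloton and a cumulative chain yield; the constants are standard order-of-magnitude values used for illustration and are not output of STM_toolkit.

```python
import numpy as np

FISSIONS_PER_KT = 1.45e23            # fissions per kiloton TNT equivalent (approx.)
XE133_CUM_YIELD = 0.066              # cumulative chain yield of mass 133 (illustrative)
HALF_LIFE_XE133_D = 5.243
LAM = np.log(2) / HALF_LIFE_XE133_D  # decay constant (1/day)

def xe133_cavity_activity(yield_kt, days_after_shot):
    """Approximate Xe-133 activity (Bq) in the cavity, assuming the full mass-133
    chain has already grown in to Xe-133 and ignoring precursor holdup
    (a deliberate simplification of the inventory calculation)."""
    atoms = yield_kt * FISSIONS_PER_KT * XE133_CUM_YIELD * np.exp(-LAM * days_after_shot)
    return atoms * LAM / 86400.0     # decay constant converted to 1/s for Bq

for d in (1, 5, 10, 20):
    print(f"t+{d:2d} d: {xe133_cavity_activity(1.0, d):.2e} Bq of Xe-133 for a 1 kt explosion")
```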
The paper attempts to understand India’s response to nuclear technology in the post-Fukushima period, in light of the debate concerning the safety, security and future of nuclear technology. The post-Fukushima period has generated genuine, democratic and serious deliberation over nuclear technology, which has called for public participation. As India has propounded multiple theories to justify its embarkation on the nuclear path, an attempt is made to identify the different actors involved (youth, government and media). The present study tries to understand how these different actors conceptualize nuclear technology. It is pertinent to understand why nuclear technology has faced vehement public and political criticism in the post-Fukushima era. The paper explores the dynamics between state and non-state actors and tries to articulate the ideological and literary war waged between science and society. It will explore the possibility of a participation model that calls for genuine deliberation and respect for the local socio-political milieu, knowledge and belief patterns among all stakeholders, thus aiming to resolve the conflict between the two. This model is an effort to challenge the hegemony of science by transferring scientific knowledge from science elites to the commons, thus bringing about the democratization of scientific knowledge and technology.
Designing an array is one of the main tasks during setup planning. Array configuration depends on the goal of monitoring, which may cover a large variety of purposes including monitoring of the desired area, local seismicity, global teleseismic monitoring, etc. In this study, we developed a method to automatically design an array configuration for regional monitoring purposes by optimizing the array response function.
We used the Fuzzy Self-Tuning Particle Swarm Optimization method to optimize the array geometry. Ideally, a well-designed array has an array response function similar to the delta function. We use this criterion as the objective function for the suggested seismic array; in this way, the optimization procedure tries to minimize the power of the side lobes by moving the positions of the array stations.
We have implemented a synthetic waveform simulation to verify the suggested array efficiency. In this step, a synthetic event is simulated and relocated using synthetic array beamforming. The suggested array is acceptable if the epicentral difference between the synthetic event and the relocated one is less than 10 km.
Our approach was able to find the best array geometry, and the synthetic event location accuracy is acceptable for different scenarios.
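A compact sketch of such an optimization loop is given below; it uses a plain particle swarm (not the fuzzy self-tuning variant) and a simple maximum-sidelobe objective computed from the array response on a wavenumber grid, with all settings chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

N_STA, APERTURE_KM = 9, 4.0                    # assumed number of stations and aperture
K = np.linspace(-3.0, 3.0, 41)                 # wavenumber grid (rad/km), illustrative
KX, KY = np.meshgrid(K, K)
MAIN_LOBE = np.hypot(KX, KY) < 0.5             # region around k = 0 excluded from the penalty

def max_sidelobe(positions):
    """Maximum of the array response |sum_j exp(i k.x_j)|^2 / N^2 outside the main lobe."""
    x, y = positions[:N_STA], positions[N_STA:]
    phase = np.exp(1j * (KX[..., None] * x + KY[..., None] * y))
    resp = np.abs(phase.sum(axis=-1)) ** 2 / N_STA**2
    return resp[~MAIN_LOBE].max()

# Plain particle swarm over the 2*N_STA station coordinates (objective: minimize sidelobes)
n_particles, n_iter, w, c1, c2 = 40, 200, 0.7, 1.5, 1.5
pos = rng.uniform(-APERTURE_KM / 2, APERTURE_KM / 2, (n_particles, 2 * N_STA))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([max_sidelobe(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, -APERTURE_KM / 2, APERTURE_KM / 2)
    vals = np.array([max_sidelobe(p) for p in pos])
    better = vals < pbest_val
    pbest[better], pbest_val[better] = pos[better], vals[better]
    gbest = pbest[pbest_val.argmin()].copy()

print(f"best maximum sidelobe level after optimization: {pbest_val.min():.3f} (main lobe = 1.0)")
```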
The Provisional Technical Secretariat (PTS) defined a quality assurance programme for infrasound measurement systems of the International Monitoring System (IMS). This resulted in collaborations with international experts in the field of infrasound, metrology, and quality assurance. An objective is to ensure consistency in measurements and equivalence in data produced across the IMS network. In this framework, the PTS coordinated pilot studies with results presented in ITW2017. Following lessons learned from these studies, the PTS initiated the first interlaboratory comparison ever made in the IMS infrasound monitoring range, including sensor measurement of self-noise, sensitivity and frequency response. During this exercise, infrasound sensors, microphones and barometric pressure sensors were circulated for measurements (June 2020 - January 2021). This presentation illustrates measurements from five participants, a description of their calibration/measurement methods, and an analysis of the degree of equivalence between laboratories.
The development of laboratory calibration methods for seismometers and microbarometers in the low frequency range down to 0.01 Hz provides the possibility of a traceable on-site calibration during operation for field sensors of the International Monitoring System. The laboratory calibrated reference sensors can be installed as transfer standards at the stations co-located to the operational sensor, thereby improving data quality and identification of Treaty-relevant events while not disturbing the regular measurements for Treaty validation purposes. At International Monitoring System stations PS19 and IS26 in Germany we performed on-site calibrations tests with both seismometers and microbarometers calibrated at the laboratories at PTB and CEA, respectively, using signals from different natural and anthropogenic excitation sources. Following the approaches of Gabrielson (2011) and Green et al. (2021), the frequency response function of the station sensors including site specific factors such as the wind noise reduction system or possible effects of pre-amplifiers and data loggers are determined. We present calibration results of the comparison between the station sensors with the laboratory-calibrated instruments along with the nominal responses of the sensors. Furthermore, the possibility of a station-wide calibration of seismometers with a single temporary and stationary reference sensor is assessed using suitable excitation signals and station-wide similarity measures.
Focus on the CYG as a tool for sustaining global action and interest in the CTBT
Twenty-six years after its opening for signature in September 1996, the CTBT has garnered 186 signatures from countries in all parts of the world, 176 of which have now ratified, making it one of the most widely accepted treaties in the world today. The Treaty has thus not only demonstrated its acceptability; with its efficient verification regime, it is also contributing significantly to international peace and security, given its ability to detect nuclear weapon explosions and the civil and scientific uses of its data. The Treaty is therefore already successful and should be sustained until it enters into force. Against the possibility of weariness or lethargy, the international community, with a shared vision, must continue to mobilise action, share ideas, and create innovative solutions that allow the CTBT to operate and achieve its objectives. An important instrument in this regard is youth. The CTBTO Youth Group (CYG) remains a veritable platform whose members can continue to act as voices and agents of the Treaty in their various countries, and as CYG members attain higher roles in their professional endeavours, the CYG of today will become the global leaders of tomorrow. At the same time, at the security, peace and science nexus of the CTBT, CYG members could play a role in pursuing research in areas of scientific and technical relevance to the Treaty, thus contributing to its technical sustainment into the future. The proposed presentation will briefly highlight the evolution of the Treaty, consider the ageing crop of officials initially seized with it and the need to strategically empower committed young people in all areas of the CTBT/O, and seek to identify creative means (thinking outside the box) for youth, as torch bearers for the Treaty, to sustain action even beyond entry into force.
The empowerment of the next generation of experts capable of supporting the mission of the CTBTO, both politically and technically, and of advancing the universalization and entry into force of the CTBT, is a cross-cutting objective. This commitment is equally aligned with the UN system-wide effort of meaningful youth engagement outlined in the UN Secretary-General’s “Youth 2030” strategy, released in 2018. The strategy encourages the UN system to advance programmatic work not only “for”, but also “with” youth, enabling co-creation and meaningful youth inclusion in decision-making processes and programmatic work.
Since 2016, the CTBTO has been a leading UN system agency in next generation outreach, awareness raising and education, providing the next generation of nuclear disarmament and non-proliferation experts with capacity building, research and professional opportunities. The venues for engagement with youth are varied, from strengthened partnerships with civil society and academia to support for youth-led projects and the integration of the next generation into the planning and roll-out of major CTBT-related events.
The panellists will highlight the achievements of CTBTO initiatives focused on supporting youth, share best practices of next generation engagement formats, and discuss forward-looking actions. Moreover, the session will serve to raise awareness among experts, diplomats and other UN entities about the role of young professionals in supporting the universalization and entry into force of the CTBT.
The panel will also encourage views and suggestions from the audience, which could result in the development of a post-session roadmap with innovative suggestions for future synergies between the CTBTO Youth Group, the Young Professional Network and other outreach initiatives focused on youth.
Infrasound station PVCI (50.53°N, 14.57°E) of the Czech microbarograph network C9 (https://doi.org/10.7914/SN/C9) is integrated into the Central and Eastern European Infrasound Network (CEEIN, www.ceein.eu). A correct evaluation of detections at the station level is essential for network data processing and for the localization and identification of events. To improve knowledge of the infrasound environment of the station, we performed a study focused on the identification of potential infrasound sources observed at PVCI and on the seasonality of the detections.
North-west arrivals prevail in all seasons of the year. Near 0.2 Hz the north-west arrivals are dominated by microbaroms from the North Atlantic; at 1 Hz and above, sources may include large offshore wind farms in the North Sea. In summer, signals of long duration and low amplitude from the south-east are regularly observed. Potential sources of these signals are the oil refineries at Bratislava (Slovakia), Schwechat (Austria) and Százhalombatta (Hungary). PVCI regularly registers North Sea sonic booms in winter and sonic booms arriving from the Aegean Sea in summer. Local sources, up to a distance of 50 km, mainly include mining activities.
Thunderstorms are a well-known and persistent source of infrasound events. Infrasound signal propagation is likewise known to be governed by atmospheric parameters such as temperature and wind. In a tropical region like Madagascar, rainfall patterns are one possible means of explaining the seasonal variation in the detectability of the International Monitoring System infrasound network. In this study, 20 years of bulletins from infrasound station I33MG were correlated with rainfall data. The infrasound bulletins were obtained from the DTK-PMCC software in the NIAB collection, atmospheric variations were derived from ECMWF ERA5 data, and precipitation data came from the local meteorological service of Madagascar. Observation of rainfall as well as of the infrasound bulletins from 2003 to 2022 shows a correlation between precipitation and infrasound events.
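A minimal sketch of such a correlation analysis is given below; the file names, column names and monthly aggregation are hypothetical and only illustrate the kind of comparison described, not the authors' actual workflow.

# Minimal sketch with hypothetical file and column names: monthly rainfall
# totals correlated with monthly infrasound detection counts at I33MG.
import pandas as pd

bulletin = pd.read_csv("i33mg_bulletin.csv", parse_dates=["time"])    # hypothetical
rain = pd.read_csv("madagascar_rainfall.csv", parse_dates=["date"])   # hypothetical

detections = bulletin.set_index("time").resample("MS").size()
rainfall = rain.set_index("date")["rain_mm"].resample("MS").sum()

monthly = pd.concat([detections.rename("detections"), rainfall], axis=1).dropna()
print(monthly["detections"].corr(monthly["rain_mm"]))   # Pearson correlation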
The atmosphere changes on a wide variety of timescales. The infrasound component of the International Monitoring System (IMS) can sense such changes every second over periods of tens of years. Such long term measurements of atmospheric variability enable the study of climate change. Infrasonic waves passively probe the entire atmosphere. The challenge is to unravel temperature variability (long term increases and decreases) from surface-based recordings. As a reference, seismic signals are used, which remain unchanged as a function of time. A so-called seismoacoustic analysis uses both seismic and infrasonic signals, where changes in the recordings can be attributed to changes in the medium, in this case temperature changes in the troposphere and stratosphere.
To illustrate the contribution of the IMS to climate studies, over 15 years of IMS seismic and infrasonic recordings will be shown. The seismoacoustic analysis performed with these recordings reveals long term changes in atmospheric temperature. A temperature increase in the troposphere and a decrease in the stratosphere can be sensed passively and simultaneously.
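For orientation, the back-of-the-envelope sketch below uses the textbook approximation for the sound speed in air, c ~ 20.05 * sqrt(T in K), to show how a small change in inferred effective sound speed maps to a temperature change; the numbers are illustrative and this is not the authors' retrieval method.

# Back-of-the-envelope sketch (textbook approximation, not the authors' method):
# for air, c ~ 20.05 * sqrt(T[K]), so a change dc in effective sound speed maps
# to a temperature change dT ~ 2 * T * dc / c.
import math

T = 220.0                          # stratospheric temperature in K (illustrative)
c = 20.05 * math.sqrt(T)           # roughly 297 m/s
dc = -0.5                          # hypothetical long term sound speed change, m/s
dT = 2.0 * T * dc / c
print(round(c, 1), round(dT, 2))   # a 0.5 m/s decrease corresponds to about -0.74 K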
Understanding the impacts of near-field air flow on the long range transport of radionuclides in the atmosphere after a release from an underground nuclear explosion requires improved modeling at the local scale. In this work, we present the design of a radiotracer experiment in a complex terrain environment in support of modeling improvements. A remotely controlled release of Xe-127 gas, produced from neutron irradiation of isotopically enriched Xe-126, provides a radiotracer plume that can be detected by an array of real time NaI(Tl) sensors out to 5 km from the release point. Whole air sampler systems collect gas for later laboratory analysis. A simultaneous smoke release permits tracking of the plume using scanning lidar instruments. An array of meteorological towers provides high resolution wind field data to feed forward and backtracking atmospheric transport models. We will present the design of the experiment and the major elements thereof, including tracer sources, sensors, and samplers.
A massive explosion occurred at a container port in Tianjin on August 12, 2015. The explosion produced strong infrasound with an extremely high signal to noise ratio, registered by domestic infrasound stations several kilometers away. Unlike ordinary explosion infrasound, the Tianjin explosion signals appear as six consecutive groups of clear first arrivals, revealing the complexity of the infrasound propagation path. Detection algorithms based on slowness estimation and association algorithms based on the signal envelope are presented, followed by ray trace processing, to discuss the anomalous signal arrivals that were earlier than expected. The results of the signal processing show that the four algorithms are effective. The infrasonic signals of the event retain appreciable amplitude and signal to noise ratio 3500 kilometers away in the downwind direction but cannot be observed clearly several hundred kilometers away in the upwind direction. The characteristics of the more than six sequential signal groups at infrasound stations I34MN, HTI and HMI are peculiar compared with the presented explosion infrasound signals and cannot be explained by ray tracing. Atmospheric profile data from NASA are used, showing the complexity of modelling infrasound propagation. Finally, the yield of the explosion is estimated at the equivalent of 400-600 tonnes of TNT.
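To illustrate what slowness-based detection involves, the following sketch (not the authors' algorithms) performs a time-domain delay-and-sum beam over a slowness grid and returns the slowness vector that maximizes beam power; the array geometry, grid limits and synthetic test signal are invented for the example.

# Minimal sketch (not the authors' algorithms): time-domain delay-and-sum
# beamforming over a slowness grid, the basic ingredient of slowness-based
# detection at an array. Geometry and the test signal are synthetic.
import numpy as np

def best_slowness(traces, coords, fs, smax=0.004, ns=41):
    """Return the (sx, sy) in s/m that maximizes beam power, plus that power."""
    nsta, npts = traces.shape
    grid = np.linspace(-smax, smax, ns)
    best_power, best_s = -np.inf, (0.0, 0.0)
    for sx in grid:
        for sy in grid:
            delays = np.round((coords[:, 0] * sx + coords[:, 1] * sy) * fs).astype(int)
            beam = np.zeros(npts)
            for i in range(nsta):
                beam += np.roll(traces[i], -delays[i])
            power = np.mean((beam / nsta) ** 2)
            if power > best_power:
                best_power, best_s = power, (sx, sy)
    return best_s, best_power

# Synthetic test: a transient crossing a 4-element array with an eastward
# slowness of 1/340 s/m (roughly the acoustic speed).
fs, npts = 20.0, 2000
coords = np.array([[0.0, 0.0], [1000.0, 0.0], [0.0, 1000.0], [1000.0, 1000.0]])
t = np.arange(npts) / fs
sx_true, sy_true = 1.0 / 340.0, 0.0
traces = np.array([np.exp(-0.5 * ((t - 50.0 - x * sx_true - y * sy_true) / 1.5) ** 2)
                   for x, y in coords])
print(best_slowness(traces, coords, fs))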
Using robotic systems for scientific field data acquisition has expanded tremendously in the last five years and includes underground explosion monitoring and verification applications. Robotic platforms offer safer and more affordable ways to collect multiple data modalities, including visual and thermal infrared information, in challenging environments. An example of a new application space for robotics is data collection surrounding Treaty-relevant events within the CTBT verification framework. In these instances, robotic platforms may provide increased data collection efficiency over broad areas, relative to other surveying methods, to facilitate anomaly and artifact observation. We present analyses from multiple underground conventional high-explosive experiments in Nevada, USA, where robotic platforms obtained time-series imagery data before and after the experiments, using commercial off-the-shelf cameras. Using structure-from-motion photogrammetry, we can identify high resolution surface changes and other anomalies and artifacts resulting from these underground explosions. This presentation examines the use of such tools and technologies to characterize underground explosions with different emplacement conditions. Further, we highlight (1) the range of anomalies and artifacts that can be identified using imagery, (2) the training and readiness required for appropriate robotic data collection system use, and (3) the challenges and limitations in adoption of robotic systems in the verification regime.
In environments where relatively little open source information is available, the analysis of satellite imagery can be of particular value in monitoring for indications of nuclear weapon test preparation activities, such as signs at suspected nuclear weapon test locations of increased vehicle traffic, construction of support buildings, and tunnel excavations. This paper will use the Democratic People’s Republic of Korea Punggye-ri nuclear test site as a case study and will explore various imagery types, methodologies, and processing techniques of satellite imagery for nuclear test site monitoring. Additionally, this paper will assess how other open source information can support satellite imagery-focused monitoring efforts.
8-9 video rooms in parallel for topics: P1.1, P2.2, P2.4 (1-2 video rooms), P3.3, P3.5, P5.1, P5.3, P5.4
PS26 is the first internationally recognized seismic station in Niger. The field study for the site installation was conducted in November and December 2001.
The PS26 station is located southwest of Niamey, near the town of Torodi. This region is part of the West African craton, composed of Birimian greenstone sedimentary belts and granitic terranes of the Paleoproterozoic (2.0-2.1 Ga). The 16 sites are distributed over three concentric rings at distances of 0.75 km, 1.5 km and 3 km.
The seismometers are placed at about 50 meters depth. A GPS antenna is mounted on the roof of each bunker of the installation. Among the 16 sites, four (TOAO, TOC2, TOC4 and TOC6) are equipped with three-component sensors.
We will extract the three-component data at each of these four sites to show that three components allow events to be located reasonably well. This approach is intended to demonstrate the method and to assist those who do not have data available from several stations and must characterize events with restricted data.
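As a simple illustration of how a single three-component site can constrain an event, the sketch below combines an S-P time with a back azimuth from the horizontal P-wave particle motion; the velocities, arrival times and amplitudes are hypothetical, and the 180-degree ambiguity of the polarization estimate is ignored for brevity.

# Illustrative sketch (hypothetical numbers, not the authors' workflow): a single
# three-component site constrains an event using the S-P time for distance and
# the P-wave particle motion on the horizontals for back azimuth. The 180-degree
# ambiguity (normally resolved with the vertical component) is ignored here.
import math

vp, vs = 6.0, 3.5                   # assumed crustal P and S velocities, km/s
tp, ts = 12.4, 21.6                 # hypothetical P and S arrival times, s

distance = (ts - tp) * vp * vs / (vp - vs)        # epicentral distance, km

amp_n, amp_e = -0.8, 0.6            # hypothetical P amplitudes, north and east
back_azimuth = math.degrees(math.atan2(amp_e, amp_n)) % 360.0

print(round(distance, 1), round(back_azimuth, 1))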
Myanmar is situated in a tectonically complex area; however, limited up-to-date instrumentation and communication infrastructure constrain earthquake monitoring and research activities. In 2008, two broadband stations financed by the China Earthquake Administration were installed. With government funding, we set up 3 broadband stations in 2010. The Myanmar National Seismic Network (network code MM) underwent a significant upgrade in 2016 to increase earthquake monitoring capability. The United States Geological Survey also supported and installed five broadband and strong-motion seismic stations, with real-time data transmitted over recently improved cellular networks. In 2017, with financial support from India and technical assistance from the RIME, we installed ten broadband stations, and in 2010 we set up a single SIM-supported broadband station. We provide seismic data from 4 of these 11 stations. One of our broadband stations (NPW), which was upgraded by GFZ in 2019, is shared on the GFZ network. We therefore have access to 10 stations from global networks. We have 19 broadband stations with real-time data via cellular networks, and we continuously monitor our seismic network and provide seismic news for our nation.
OVSICORI is the institute in charge of earthquake and volcano monitoring in Costa Rica. As in other countries in Central America, we have limited economic resources. Thanks to an important government investment in 2010, we chose Antelope by BRTT as our main monitoring computational system. Antelope worked well for us at the beginning, but in the following years we found that most of the countries around us were using SeisComP, which in some ways isolated us from sharing data and knowledge with our neighbours. SeisComP was growing faster than Antelope, and soon we started to find some procedures easier in SeisComP than in our system.
In 2018, the CTBTO invited us to the Basic SeisComP Training, which gave us a much clearer picture of this system. We could say that this course was the beginning of the transition for us. In 2020, COVID-19 made the final move for us: it affected our country's economy, and the Antelope licence no longer seemed affordable in the near future. At the end of 2020, the transition to SeisComP at our institute started.
As part of the nuclear non-proliferation and disarmament measures of the UN General Assembly, the Comprehensive Nuclear-Test-Ban Treaty (CTBT) was adopted to prohibit nuclear test explosions and any other nuclear explosions in all environments (in the atmosphere, in the oceans, and underground), and thus ultimately to enhance international peace and security. The International Data Centre (IDC) offers Capacity Building System (CBS) support to National Data Centres (NDCs) of States Signatories, primarily in developing countries, as technical assistance to facilitate their role towards the Treaty. The Republic of Ghana is a beneficiary of the equipment support under this programme. NDC Ghana took delivery of the equipment set, the third generation of the Global Communications Infrastructure (GCI-III) very small aperture terminal (VSAT), from the IDC for the verification regime. It was successfully installed and commissioned for operation in July 2021. This newly established VSAT link is an essential element for forwarding monitoring data to NDC Ghana upon request. Hence, the CTBT, beyond its global prohibition of all nuclear explosions, also offers strong CBS support to States Signatories, thereby providing relevant technological advances in monitoring geophysical hazards, for example earthquakes, to participating States as a non-Treaty-relevant benefit to society.
Mali signed the Comprehensive Nuclear-Test-Ban Treaty on February 18, 1997 and ratified it on August 4, 1999. In 2005, within the framework of the activities of the International Monitoring System (IMS), the CTBTO implemented a National Data Centre (NDC) at the Direction Nationale de la Géologie et des Mines (DNGM) of Mali, with partners such as the Malian Radiation Protection Agency (AMARAP). The DNGM is responsible for the seismic, infrasound and hydroacoustic monitoring of the IMS.
The main mission of AMARAP is to ensure the protection of the population and the environment against the harmful effects of ionizing radiation. It is the only structure in Mali able to ensure radionuclide monitoring in the air. Following the 2022 National Data Centres Workshop held in Toledo, Spain, in October, at which AMARAP was represented, AMARAP began the process of setting up a radionuclide and noble gas monitoring station in order to revitalize the national NDC team. More details will be given in the full presentation.
Jordan's foreign policy continues to open up to the countries of the world and to international organizations, interacting with them, exchanging experiences, and cooperating in various fields to develop the capabilities of the Jordanian state and provide the best services to Jordanians at home and around the world.
Jordan deposited its instrument of ratification of the Comprehensive Nuclear-Test-Ban Treaty (CTBT) with the Secretary-General of the United Nations on 25 August 1998, making it the twentieth signatory State to have ratified the Treaty. Jordan provides one seismic station to the International Monitoring System, TEL-AL-ASFAR (AS056).
On 11 November 1999, a facility agreement was signed between the Permanent Representative of Jordan and the Executive Secretary of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO).
In cooperation with the CTBTO, Jordan has hosted several workshops whose objective was to address specific topics of interest for States Signatories and the Provisional Technical Secretariat (PTS) related to the establishment of the verification system and to advance interaction between the PTS and National Authorities and National Data Centres.
The Jordan Seismological Observatory has seen major development following the cooperation and workshops with the CTBTO, both in the number of stations and in seismic hazard studies.
With financial support from the Norwegian centre NORSAR, provided in 2020-2022 within a joint project, the Kazakhstan National Data Centre (KNDC) upgraded its hardware and software in support of the implementation of CTBTO tasks. New state of the art equipment was installed at the centre in Almaty. A fail-safe information system was created to ensure the following:
- acquisition and storage of seismic data arriving directly from the seismic monitoring stations or obtained from different sources (software for data acquisition and storage in the PostgreSQL and MySQL databases, a running SeedLink server; a minimal acquisition sketch is given below);
- operation of a file server to store unprocessed data in different formats;
- functioning of the servers ensuring the all-round operation of the Data Centre (web server, mail server, FTP, etc.);
- information security of the Data Centre (firewall, proxy);
- implementation of seismic data processing (automated and interactive) and infrasound data processing (automated detection).
The new equipment made it possible to build a virtualization cluster consisting of three high-end servers under the control of the Proxmox virtualization system. To ensure fail-safe operation of the system, the cluster uses the CEPH distributed file system and data storage systems based on RAID technologies.
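The acquisition sketch referred to above is given here; it shows one way (assuming ObsPy and a reachable SeedLink server) to pull recent waveform data and write it to a local miniSEED file. The server address and station codes are placeholders, not the KNDC configuration.

# Minimal sketch (hypothetical server and codes, not the KNDC configuration):
# pulling recent waveform data from a SeedLink server with ObsPy and writing it
# to a local miniSEED file, one of the acquisition tasks listed above.
from obspy import UTCDateTime
from obspy.clients.seedlink.basic_client import Client

client = Client("seedlink.example.org", port=18000)     # placeholder server
t_end = UTCDateTime()
t_start = t_end - 600                                   # last 10 minutes

stream = client.get_waveforms("KZ", "MAKZ", "", "BHZ", t_start, t_end)
stream.write("MAKZ_BHZ_latest.mseed", format="MSEED")
print(stream)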
Capacity building is fundamental for strengthening the CTBT verification regime. The Commission is supporting the States Signatories by providing the means to develop capabilities to participate actively in the CTBT verification regime. The donation and installation of IT equipment, the so-called Capacity Building System (CBS), at National Data Centres (NDCs) of States Signatories is a significant component of the capacity building project. The equipment is provided to NDCs, mainly in developing countries, to support the establishment and further development of national capacity to participate actively in the verification regime by accessing and analyzing IMS data and IDC products. The CBS project was initiated in 2009. Since that time, almost 70 NDCs have received and installed the system and established routine IMS data processing operations with the NDC in a box package. The poster describes the current status, achievements and historical development of the project, which is funded by the European Union and the Provisional Technical Secretariat.
As the different seismological stations of the CTBT have been successfully developed and installed around the world, a body of experience has accumulated that has been used to build better management systems, both inside and outside the CTBT. This knowledge base has been used at the national level of each country, where personnel and operators are part of the scientific and seismological research community, to improve their own seismological networks, waveform analysis times and methodologies, applying the best practices learned from the CTBT, forged during capacity building trainings and shared during the many meetings promoted by the organisation. This paper presents a case study of the cumulative experience gained over 20 years by a hundred Venezuelan staff and operators, who have built a better local National Data Centre (NDC) as well as a much better national seismological network for their own country.
To monitor compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), a verification regime is designed to detect any nuclear explosion conducted anywhere: underground, underwater or in the atmosphere. In this poster, we focus on the Libyan contribution to the main components of the verification regime: progress in the installation of the International Monitoring System (IMS) radionuclide station RN041 (Misratah, Libya), progress in the establishment of the National Data Centre and the use of IMS data and International Data Centre products in the context of the NDCs4All initiative announced by the CTBTO Executive Secretary, and progress in building on-site inspection capacity in Libya. We will present previous and current achievements and expected results, as well as the difficulties encountered in fulfilling Libyan participation in the verification regime.
The Pelindaba Treaty, which aims to create a nuclear-weapon-free zone in Africa, entered into force in July 2009. The African Commission on Nuclear Energy was created to implement the treaty’s provisions and to develop nuclear sciences and techniques. In this context, many African countries have made political statements about embarking on nuclear power programmes, contrasting with a real lack of human resources in most nuclear application fields, including skills related to non-proliferation, the verification regime and other on-site inspection activities. The partnership with the CTBTO, which is already fruitful in many fields such as data analysis, OSI activities and the maintenance of equipment, may play key roles at different stages and levels.
CTBTO cooperation on sensitization and support to train a critical number of people, including decision makers, technical staff and operators, is key to helping countries comply with international instruments on safeguards, non-proliferation and on-site inspection skills for their programmes, and to providing the regional legal instruments with the human resources necessary to boost their implementation.
In 1999, Jordan signed the Facility Agreement to establish and upgrade the auxiliary seismic monitoring station hosted in Jordan to implement the Comprehensive Nuclear-Test-Ban Treaty (CTBT). Jordan signed the treaty on 26 September 1996 and ratified it on 25 August 1998. Since the Jordan National Data Centre (NDC) was established in 2004 and started receiving International Monitoring System (IMS) data and International Data Centre (IDC) products, the NDC has used the NDC in a box package, IMS data and IDC products for the analysis of seismic events, requesting data and products from the IDC and using them to relocate events with different software. As a result of this use of IMS data and IDC products, the NDC can strengthen its in-house capability, particularly in data analysis, and participate in any NDC-related exercises organized by the CTBTO. The poster presents the history of the Jordan NDC (signature, ratification and facility agreement), capacity building systems (CBSs), the AS056 Tal-Alasfar station (monitoring, operation, maintenance and PTS technical assistance), IMS data requests, data analysis using the NDC in a box software package, and participation in workshops, trainings and exercises organized by the CTBTO.
Since 2002, when the Bolivian International Monitoring System (IMS) stations (IS08, LPAZ and SIV) began transmitting data, the Bolivian National Data Centre (NDC) has made an effort to participate effectively in the verification regime of the Treaty, using IMS stations and the NDC in a box programs for integrating data acquisition, processing and analysis. Thanks to the different CTBTO trainings, local seismic stations, IMS stations and the Operations Support Centre have been integrated in real time to monitor local, regional and global seismic and infrasonic events through the SeisComP4 acquisition system. Various analyses of event data have been carried out, such as daily routine analysis; analysis of the Peru earthquake of May 26, 2022 using stations IS08, IS41 and IS20 (infrasound) and PS06, AS008, PS07, AS078, AS019 and AS077 (seismic); and analysis of the explosions of Sangay volcano (2021-03-11) and Hunga Tonga-Hunga Ha'apai (2022-01-15) using infrasound stations IS08, IS09, IS41, IS20, IS14 and IS01. In addition, in recent years all data analysis programs, such as Geotool, DTK-GPMCC, DIVA and SeisComP4, have been updated so that NDC Bolivia users can analyze data.
The new National Data Centre (NDC) Capacity Building Training cycle on the access and analysis of International Monitoring System (IMS) radionuclide data was launched in 2022, based on the feedback collected during previous trainings and meetings. The Capacity Building and Training (CBT) Section, in cooperation with software application experts, took the initiative to design and draft the programme of the new training.
The cycle includes three consecutive trainings, starting with an introductory course on radionuclide IMS data and International Data Centre (IDC) products for both particulates and noble gases. The introductory course is open to all NDC technical staff. It aims at providing basic knowledge for accessing and using radionuclide IMS data and IDC products, as well as familiarizing NDC technical staff with software tools for the analysis of IMS radionuclide data. The other two advanced training courses, on radionuclide particulate and noble gas data analysis respectively, are open to NDC technical staff who have attended the introductory course or who have good experience in radionuclide data analysis. The main objective of the advanced trainings is to further strengthen the analytical skills of NDC technical staff in the access and analysis of IMS radionuclide data.
Recent upgrades to the seismic network in Jamaica have significantly improved the country’s capacity for seismic monitoring and scientific research. The network now includes 13 broadband sensors, 12 short period seismometers and 42 strong motion accelerographs installed across the island. Devastating historical events, including the infamous 1692 and 1907 earthquakes, underscore the need for Jamaica to strengthen its monitoring and research programs. The Earthquake Unit is now well positioned to make further contributions to national development and, by extension, to the Caribbean region. Data from local and regional networks will be combined with IMS data from the CTBTO’s International Data Centre for civil applications and research. These advancements, in conjunction with the use of SeisComP for real time data acquisition and processing, will contribute to developing a robust monitoring program for the country in which rapid hypocentre and magnitude solutions for events can be achieved. The upgrades will also support research initiatives and assist with refining the velocity structure model for the island, along with focal mechanism solutions, to better understand the seismicity and tectonics of the northern Caribbean. In addition to supporting microzonation surveys, the improved network will also be utilized for developing a tsunami early warning system and local seismic hazard models.
Although the CTBTO has primary and auxiliary seismic stations in the International Monitoring System (IMS), it may be of interest for NDCs to include local stations, provided these stations comply with the quality parameters imposed by the IMS. The stations considered in this proposal are those managed by the CNS-UASD network (PCDR, SMDR, MCDR and SODR), located in the Dominican Republic (DR) and Puerto Rico (PR).
This investigation will review the requirements that the IMS applies to seismic stations and apply them to the stations of the local network. The research will serve as a pilot so that local networks have a point of comparison and can, in turn, raise the quality of their stations, which are used by local NDCs for their civil applications.
Having good quality stations translates into reliable measurements, as the IMS achieves by treaty mandate. This study seeks to homogenize the local stations with those of the IMS, so that they can receive some certification according to the CTBTO requirements and, in the future, be consulted as part of specific CTBTO studies, thus reducing the gap in some areas or making it possible to detect minor events.
Seismic monitoring is one of the most commonly used monitoring technologies and constitutes the biggest network of the CTBTO, with a total of 50 primary and 120 auxiliary stations. This is due to the comparatively well understood interaction between seismic waves and the heterogeneous subsurface of the Earth. Data obtained from the primary and auxiliary seismic stations across the globe are automatically processed by sophisticated systems in Vienna, Austria. However, even with automated processing, two distinct events may be merged into one, a single event might be split into two different events, and some events can be completely omitted by the system. This requires analysts to be conversant with several data processing tools and techniques. SEISAN is a simple interactive tool that can be used to analyze seismic data. This abstract gives an overview of the SEISAN analysis tool using example waveform data obtained from one primary station (KMBO, Kenya) and two auxiliary stations (MBAR, Uganda and FURI, Ethiopia) for a natural event dated 2020-05-03 19:36:55 (UTC) in Lodwar, Kenya.
In 2012, thanks to joint work between the CTBTO and FACEN - UNA (Facultad de Ciencias Exactas y Naturales de la Universidad Nacional de Asunción), the National Data Centre of Paraguay (NDCpy) was set up and inaugurated in the FACEN seismology laboratory. In this first stage, it had a server and a PC. Data from our two stations (CPUP and IS41) and from the CTBTO primary seismic stations in South America are received in real time. In a 2018 update, two more PCs were added for analysis, new analysis software was installed, and existing software was updated. CTBTO auxiliary stations were included as well, and the system was configured to receive x stations from the IRIS network in real time. In addition, thanks to a joint project with USP of Brazil, four new stations were installed in the territory of Paraguay and received at the NDC, as well as xx more stations distributed in Brazil, in addition to access to more data through NMS_clients. With these improvements, regional seismic monitoring is covered and local event locations have improved. Further work is planned in order to continue updating and improving the seismic monitoring.
The Solomon Islands are located in a zone of active seismicity where the Indo-Australian Plate subducts beneath the Pacific Plate at a rate of approximately 95 mm/yr towards the north-west. The Solomon Islands form a double chain of islands lying between Papua New Guinea in the north-west and Vanuatu in the south-east.
The purpose of this paper is to present the distribution of potentially hazardous earthquakes that have occurred in the Solomon Islands over the past 90 years (1932-2022).
This presentation gives examples of how a National Data Centre (NDC) can automatically detect seismic events in data obtained in near real time from the IMS through the SeedLink service available on the Global Communications Infrastructure (GCI), using the seismic analysis software NETDET, which is part of the SEISAN package. We also show how to integrate the GCI data with locally collected data and how to analyse the data to obtain event locations and magnitudes. SEISAN (see http://seisan.info) is used in more than 30 countries, mainly at smaller seismic networks, by students and researchers processing data from permanent or temporary seismic networks, and at a number of NDCs. NETDET is a newly developed tool for SEISAN that enables event detection and location in near real time or offline. At an NDC, the data flow from the GCI is sent with SeedLink to an SDS or BUD miniSEED archive structure, from where NETDET loads the data and performs data processing and event analysis automatically. Depending on the parameterization of NETDET, location and magnitude estimation will be performed. The configuration of the data flow and the possible parameterization of NETDET will be presented.
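For readers unfamiliar with this kind of pipeline, the sketch below illustrates the sort of processing NETDET automates; it is not NETDET itself, but an ObsPy-based stand-in that reads from an SDS archive and runs a recursive STA/LTA trigger. The archive path, station codes and trigger thresholds are illustrative.

# Illustrative stand-in (not NETDET itself): the kind of processing NETDET
# automates, here with ObsPy reading from an SDS miniSEED archive and running a
# recursive STA/LTA trigger. Paths, codes and thresholds are placeholders.
from obspy import UTCDateTime
from obspy.clients.filesystem.sds import Client
from obspy.signal.trigger import recursive_sta_lta, trigger_onset

client = Client("/data/sds")                 # hypothetical SDS archive root
t_end = UTCDateTime()
t_start = t_end - 3600                       # last hour of data

st = client.get_waveforms("GE", "MORC", "", "BHZ", t_start, t_end)
st.merge(fill_value=0)
tr = st[0]

sr = tr.stats.sampling_rate
cft = recursive_sta_lta(tr.data, int(1 * sr), int(30 * sr))
for onset, _ in trigger_onset(cft, 3.5, 1.0):
    print("possible detection at", tr.stats.starttime + onset / sr)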
While the spectre of nuclear war still haunts Europe in the light of the Russia-Ukraine conflict, the role of the CTBTO in this conflict is not clearly defined in the media of the Middle East and North Africa (MENA). This hypothesis is based on observations and content analyses in which few articles are dedicated to the CTBTO. In my paper, if accepted, I will present the image of the CTBTO: how it is perceived, and how it should be strengthened and improved in order to play an important role in the world in reducing nuclear threats and disseminating information about nuclear weapons. In sum, it is important to draw the attention of the public and inform them about the actual attributions of the CTBTO as a potential actor in preventing nuclear disaster. An online survey of a sample of citizens, journalists, academics and politicians will provide the results.
Even though the CTBT has not yet entered into force, owing to the remaining Annex II States that have not ratified it, its contributions to global peace and security have been acknowledged as important, particularly the significant reduction in nuclear weapon tests worldwide. A key hindrance to the global efforts towards the entry into force of the Treaty, and to pushing it closer to universalization, has been the limited involvement of women scientists and experts at the forefront of Treaty implementation activities. In recent times, there have been calls within the CTBTO family for women to be given important positions in activities towards the total elimination of nuclear weapons. It is our belief that this effort is not adequate and that more needs to be done to involve women, so that they can add their voices to the call for the universalization of the CTBT and the global promotion of the CTBT verification system. As a practical strategy for bringing more women into this initiative, this paper identifies the greater participation of women in STEM and in other CTBT-related science and technology activities.
Strategic communication is important to create public awareness in every country about the devastating impacts of the use of nuclear weapons and to work towards eliminating nuclear weapons across the globe. Every community, nation, and region has historical backgrounds and indigenous traditions that must be kept in mind while designing and selecting the mediums for public outreach programs. Topics on the negative socio-economic impacts of the use of nuclear weapons can be included in university and college curricula to apprise youth. Comprehensive awareness campaigns can be launched through opinion leaders in our respective countries to highlight the importance of the issue. TV and radio programs and articles in national and local languages can be useful tools to educate the masses at the grassroots level. Social media influencers can be engaged for wider outreach of messages. Interaction with youth at universities and colleges through workshops and training can give them a better understanding of the problem. Moreover, activities like international exchange programs, painting/poster/debate competitions, simulation exercises among youth, and celebrations of the “International Day for the Total Elimination of Nuclear Weapons” can apprise the masses of the devastating impacts of nuclear weapons.
We propose the creation of a pilot, scalable, university-accredited online course, with both asynchronous content and synchronous workshops, to better educate students worldwide on the CTBTO’s mandate and nuclear non-proliferation, repurposable in other contexts. The themes of the course would include the historical and political context of nuclear weapons and arms control, the technical aspects of weapons and delivery systems, and the governance mechanisms and non-proliferation regimes in place, such as the CTBTO.
Nuclear non-proliferation outreach from the CTBTO must be accessible, interdisciplinary, and inclusive. Our course would be co-run by a small coalition of professors from various universities worldwide, and members of the CTBTO Youth Group, who would supervise and facilitate weekly workshops, culminating in an online conference with all participants. Professors from both ratifying and non-ratifying countries would create asynchronous content, encompassing political science, nuclear physics, health sciences, and international law. Workshops would be conducted on an online platform, employing interactive activities, such as role-playing historical events, employing game theory to address security concerns, or applying scientific concepts to current issues or potential conflicts.
Our presentation at the SnT conference will be done in an interactive format, employing the same learning strategies used in our proposed course.
Individuals with disabilities remain significantly underrepresented in science, technology, engineering, and mathematics (STEM) fields. Recent data from the National Center for Education Statistics (NCES) report that roughly 20% of all undergraduates within the United States have one or more disabilities. Among the undergraduates who initially self-reported as having a disability, when tracked 10 years out, only 10% went on to receive undergraduate STEM degrees, 6% went on to earn graduate STEM degrees and 2% went on to earn doctoral STEM degrees. Additionally, the European Disability Forum found that only 30.9% of learners with disabilities went on to tertiary education. These data suggest that STEM education and related fields struggle to recruit and retain disabled students and experts, which leads to a lack of input and innovation from potential disabled scientists. This report explores the roadblocks keeping disabled students from reaching their full potential in STEM-related fields and suggests ways in which the global scientific community can improve accessibility and promote the inclusion of disabled scientists to meet both UN Disability Inclusion Strategy initiatives and the Sustainable Development Goals.
More than 20 years after its opening for signature, the CTBT is facing ever-decreasing attention from the younger generation, based on my observations from teaching related courses at my university, Beijing Language & Culture University, China, for more than ten years. It is therefore necessary to make greater efforts to educate young people and raise their awareness of, and support for, the CTBT’s early ratification. Firstly, I will increase the content on the CTBT in my courses for graduates and undergraduates; the wealth of valuable materials on the CTBT’s history, progress and the challenges blocking its entry into force available on the CTBTO website is very helpful for my teaching and for discussion in class. Secondly, I encourage my students to maintain their attention and enthusiasm for new developments in CTBT-related verification technologies and their civil and scientific applications. Thirdly, I welcome young people to join my daily research work, which involves CTBT and nuclear non-proliferation issues; some students may choose CTBT-related topics for their theses and continue to follow this issue afterwards. Fourthly, I will continue to encourage my students to participate in CTBTO virtual events such as online courses, and in other related activities such as Model United Nations.
The study aims to understand how the government and media are reshaping nuclear discourse in India. An attempt is made to recognize the shift of discourse from theory to praxis. The study tries to understand how the rise of nationalist populism heightens nuclear dangers and examines how the very distinction between responsible and non-responsible nuclear states is flawed. It will look at the impact of populist decision making on the nuclear order. How the nuclear rhetoric used by nationalist populist leaders affects nuclear crises is the central contention of the study. India's no-first-use policy has various implications and connotations attached to it, and why it may no longer be relevant in the present context will be examined. The current political leadership in India has deepened the ongoing debate on autonomy in various atomic departments. However, the arrangement between the Atomic Energy Regulatory Board of India and the Department of Atomic Energy has come under fervent criticism, as the former is subservient to the latter. This calls for a separation of powers, which is still a far-fetched and unrealized goal.
Despite not signing the CTBT, Islamabad actively contributed to the negotiations that led to the Treaty’s finalization at the Conference on Disarmament, and considers the CTBT the “primary mechanism for ending nuclear weapons testing.” However, outreach efforts face headwinds due to Islamabad’s heightened skepticism of India, its nuclear-armed neighbor, and what it sees as New Delhi’s limited headway on a proposed mutual non-testing arrangement. These developments continue to be perceived as obstacles to Pakistan’s signing of the CTBT, as Islamabad insists on reciprocity from India, making it imperative to evolve CTBT outreach in a way that helps manage some of those expectations. This presentation takes Southeast Asia as a case in point to argue how CTBT outreach efforts can become more localized and effective in Pakistan’s strategic community. It draws lessons from several Association of Southeast Asian Nations (ASEAN) states, which reached a consensus on CTBT ratification despite some divergence on nuclear escalation and nonproliferation risks. The presentation will illuminate the specific local nonproliferation and advocacy partnerships that made such a consensus possible, their favorable impact on ASEAN’s CTBT stance, and how these can be tailored to the benefit of CTBT outreach in Pakistan.
The role of the CTBTO is to detect any atomic explosion on the planet. It also has the mission of convincing States to renounce nuclear testing. The implementation of a good communication strategy is a major challenge. For a communication strategy to be relevant, it must be based on a good analysis of the situation, the objectives that can be set, the desired results and impact, and the possible communication solutions.
Communicating for a complete ban on nuclear testing must address the following challenges:
1. Create a network of communicators to raise awareness and inform the population about the benefits of the nuclear test ban.
2. Mobilize the necessary resources for those responsible for communication.
3. Cooperate with civil society actors and influencers for a better understanding of the programme.
Promoting the benefits of a complete nuclear test ban will require:
- Having a clear, approved and scrupulously implemented communication strategy and plan.
- Allocating a substantial budget to the communication activities.
- Mapping stakeholders around the following major groups: policy makers, opinion leaders, youth and women's organizations.
Since 2014, University of British Columbia professors Allen Sens and Matt Yedlin have taught the transdisciplinary course “Nuclear Weapons and Arms Control” to a mixed cohort of engineering and political science undergraduates. A thousand students in total have experienced our flipped classroom model, which integrates science and policy content with a particular focus on the CTBTO. The course makes extensive use of bespoke instructional videos, and video analytics from YouTube and EdX have demonstrated the elevated level of student engagement achieved with this pedagogy. As such, our course has given us lessons learned and best practices for a model that can be leveraged for CTBTO outreach and public education efforts. This includes the creation of open online public education modules and a possible open online course on nuclear weapons and arms control. We propose the development of CTBTO educational modules and an open online course based on a comparison of our experiences delivering the course face-to-face in 2022 and online in 2020 during COVID-19 restrictions, the latter serving as a proxy for an open online model. This model includes adapting in-class simulation exercises to the online environment.
At face value, Indonesia seems promising for women interested in working in nuclear security, peace negotiations, and non-proliferation. Women make up about 40% of total employees in related governmental institutions and in academia. However, diplomacy, nuclear security, and non-proliferation are perceived as prominently "masculine" activities. In addition, the belief that each field is mutually exclusive results in a competency gap between those who work in security and those who work in non-proliferation, when in reality they are complementary aspects of the same whole. This outreach effort sought to encourage women to acknowledge their valuable expertise and to bust several sexist myths, targeting women both in academia and in security and non-proliferation careers. The results of this outreach show that the participants fully acknowledge the challenges of being a woman in the "masculine" career of nuclear security and safeguards. There was also a significant increase in comprehension of nuclear security and safeguards subjects. In the post-training survey, the participants committed to becoming experts in their respective careers, while helping their fellow female colleagues to shine in their own careers.
Ghana is steadily progressing towards the accomplishment of her Nuclear Power Programme. The Nuclear Regulatory Authority was established by Act 895 of 2015 to provide for the regulation and management of activities and practices for the peaceful use of nuclear material, and to provide for safety, environmental protection and the protection of persons against the harmful effects of radiation hazards. Ghana ratified the Comprehensive Nuclear-Test-Ban Treaty on 14 June 2011. Prior to the ratification, the National Data Centre was established in 2010 to receive and use data for verification of and compliance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT), and also to advise national authorities in Ghana on issues relating to earthquake disasters and their management. The promotion of the activities of the CTBTO in Ghana, and most importantly its promotion to the younger generation and early career scientists, is critical to ensuring the sustained and continued efforts of the nuclear-test-ban regime. This study highlights the efforts in place to keep the younger generation informed and involved in promoting the activities of the CTBTO.
With the first articles published in late October 2022, and more in press, Seismica is becoming a free and open venue for the seismological community to publish results from the entire spectrum of seismological research and applications. Our mission is to publish original, novel, peer-reviewed research in the fields of seismology, earthquakes, and related disciplines. Seismica is a community-driven, diamond open-access journal, meaning articles are free to publish and free to read, without a subscription; additionally, authors retain full copyright. Equity, diversity and inclusion are central to the goals, policies, and procedures of Seismica. Seismica is for everyone. Seismica is run by a global community of volunteers. Visit seismica.org to join our mailing list and reviewer database; the latter makes it easier for editors to identify willing reviewers from a diverse pool of scientists. Seismica endeavours to become a truly global publication, with authorship from across disciplines and continents. Author Guidelines with step-by-step instructions are available at: https://seismica.library.mcgill.ca/author-guidelines.
Seismica also publishes fast reports on recent seismic events, new techniques and algorithms for seismic data processing, among others. Seismica is, therefore, an accessible option to publish seismic results presented at the CTBT Science and Technology Conference 2023.
Nuclear testing has had devastating effects on cultures, histories, and populations worldwide. The disproportionate impacts on marginalized populations within nuclear armed states are largely excluded from mainstream narratives, which instead justify nuclear proliferation through propagandizing themes of scientific progress, military prowess, and political prestige.
The project aims to contribute an intersectional perspective through depicting the obscured narrative of underrepresented and marginalized communities, who face the greatest impacts while reaping the fewest rewards from the economic benefits and political empowerment that form the justifications for nuclear testing. We aim to highlight these experiences through creating and allowing an audience to play an interactive game that demonstrates states’ manufacturing of consent from the general population to enable their exploitation of marginalized communities. Through this, we show the fundamental role of propaganda in promoting nuclear test narratives, as well as the importance of empowering underrepresented narratives and transparency for the prevention of nuclear testing.
In contrasting the narratives promoted by the state with the testimonies of marginalized populations, this project aims to expose the true function of these justifications: as instruments of coercion used to uphold colonial, exploitative, and oppressive institutions.
Current international diplomacy cannot secure a state’s support of the CTBT without the internal citizen-based support of the Treaty. In most regime types, the citizen masses are often vital to influencing government decisions. For instance, in the late 1960s, anti-nuclear testing groups were instrumental in bringing the issue of nuclear disarmament to national elections. Alongside international support, the public voice greatly influenced legislation on nuclear disarmament in nations across the globe. The CTBTO can look to the precedent of citizen mobilization to increase global outreach on nuclear disarmament. Through the establishment of national citizen-based branches of the CTBTO, the Organization can promote greater public engagement and understanding of nuclear test explosions. In recent years, citizen mobilization has taken new and innovative forms, such as the utilization of social media. The branches can oversee the development of social media information campaigns, as they may possess a greater understanding of the local diversity to run social information campaigns effectively. The branches would act similarly to the National Authority in ratified states, by maintaining liaison with the Organization. Although CTBTO citizen groups may not be feasible in all nations, implementing citizen-based branches can greatly improve the chances of ratification in signatory states.
After entry into force, the Conference of the States Parties shall, according to CTBT Article II, paragraph 27 (f), consider and review scientific and technological developments that could affect the operation of the Treaty. According to the CTBT, Article IV, paragraph 11: “Each State Party undertakes to cooperate with the Organization and with other States Parties in the improvement of the verification regime, and in the examination of the verification potential of additional monitoring technologies such as electromagnetic pulse monitoring or satellite monitoring, with a view to developing, when appropriate, specific measures to enhance the efficient and cost-effective verification of this Treaty.”
Specifically, the IMS may be expanded beyond the four sensor technologies of the current IMS. National Data Centres do not need to wait, since they face no limitations on which sensor technologies they apply in analysing announced or potential nuclear explosion events, such as optical and radar satellite observations. Also, the IDC conducts the requested service of Expert Technical Analysis on IMS and other relevant data provided by the requesting State Party, to help the State Party concerned identify the source of specific events. Additional monitoring technologies are therefore not merely a hypothetical possibility for the future.
For on-site inspection, the obligation is to use only those techniques specified in the Protocol, Part II, paragraph 69, which deals with inspection activities and techniques. A proposal for the first comprehensive draft list of equipment for use during OSIs was presented by the Provisional Technical Secretariat (PTS) of the Commission in 2021. It covers all permitted inspection activities and techniques except drilling. Advances in science and technology have influenced the proposed specifications of OSI equipment since the CTBT opened for signature and will continue to do so. Gaps such as drilling still need to be closed, data fusion and automation need to be enhanced, and it needs to be ensured that observables associated with nuclear explosions conducted in environments other than underground can be detected.
This panel will discuss which potential additional technologies would be useful and what progress has been made in demonstrating them.
Land-based networks miss many of the low-magnitude seismic events associated with magmatic and tectonic processes at mid-ocean ridges, owing to the rapid attenuation of seismic waves in the solid Earth. However, low frequency T waves generated by such events travel over very long distances in the sound fixing and ranging (SOFAR) channel with little attenuation and can be monitored by regional underwater hydrophone networks. Hydroacoustic data recorded by autonomous hydrophones of the OHASISBIO temporary network, along with the three permanent IMS-CTBTO hydroacoustic stations at Diego Garcia, Cape Leeuwin and Crozet in the Indian Ocean, are analysed to detect magmatic and/or tectonic seismicity. Such T wave signals are commonly picked manually to determine the location and origin time of seismic events. In this way, we have scrutinized several seismic swarms in the Indian Ocean over the last 12 years. However, this process is cumbersome and its efficiency differs from one analyst to another. To overcome this difficulty, we have started to develop an automatic underwater acoustic signal processing algorithm. In a preliminary step, we tested a supervised learning method by training the model on subsets of manually processed T wave catalogues. The next step is to apply the algorithm to extended catalogues and to evaluate the completeness of the detections.
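A minimal sketch of the supervised learning step is shown below; it is illustrative only, with invented windows and labels standing in for the manually processed catalogues, and simple band-power features standing in for whatever features the actual model uses.

# Minimal, illustrative sketch (not the authors' model): a supervised classifier
# separating T wave windows from noise, trained on simple band-power features
# with labels that would come from manually processed catalogues. The windows
# and labels below are random stand-ins.
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def features(window, fs=240.0):
    """Log band-power features for one hydrophone window (assumed sampling rate)."""
    f, pxx = welch(window, fs=fs, nperseg=1024)
    bands = [(2, 10), (10, 35), (35, 80)]       # rough, illustrative frequency bands
    return [np.log10(pxx[(f >= lo) & (f < hi)].mean()) for lo, hi in bands]

rng = np.random.default_rng(0)
windows = rng.standard_normal((200, 4096))      # stand-in hydrophone windows
labels = rng.integers(0, 2, 200)                # stand-in labels: 1 = T wave

X = np.array([features(w) for w in windows])
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))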
Currently, ocean acoustic propagation codes provide accurate predictions of sound propagation for various realistic scenarios. However, for long-range propagation of signals generated by airgun arrays, some issues remain unsolved. Most implementations assume isotropic sources, but airgun arrays are strongly directional. When modelling their source level, this directionality must be correctly coupled to the computed acoustic field. Typical two-dimensional implementations account neither for the variability in the coupling of source energy into the sound channel caused by bathymetric features in the survey area, nor for diffraction effects, which often constitute a significant concern. Sound pulses from airgun arrays generate low-frequency signals that, under favourable conditions, are received at the CTBT hydroacoustic stations with high signal-to-noise ratio. The International Monitoring System (IMS) thus provides useful data for evaluating acoustic propagation and source level models by comparing predictions with recorded signals. Initial results presented at SnT2021 as work in progress are deepened here through spectral and time-based analyses of airgun shots recorded by the IMS during oil and gas surveys in Argentina Basin North. A 3-D model, rather than a 2-D one, is used to evaluate its capability to capture bathymetric effects in the near field. Additional techniques to account for the far-field directionality are discussed.
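As a rough illustration of the far-field directionality issue raised above, the Python sketch below treats an airgun array as a set of coherent point sources and sums their contributions with the geometric phase delays for each azimuth, yielding a relative beam pattern. The element layout, frequency, sound speed and equal-amplitude assumption are illustrative choices, not the array configuration modelled in this work.

# Minimal sketch of a horizontal airgun-array beam pattern (far field).
# Equal-amplitude, in-phase elements are an assumption for illustration.
import numpy as np

C = 1500.0  # nominal sound speed in water, m/s

def array_directivity(element_xy: np.ndarray, freq_hz: float,
                      azimuths_deg: np.ndarray) -> np.ndarray:
    """Relative far-field pressure level (dB) versus azimuth."""
    k = 2.0 * np.pi * freq_hz / C                     # acoustic wavenumber
    az = np.radians(azimuths_deg)
    u = np.stack([np.cos(az), np.sin(az)], axis=1)    # unit direction vectors
    phase = k * element_xy @ u.T                      # (n_elements, n_azimuths)
    p = np.abs(np.exp(1j * phase).sum(axis=0))        # coherent sum over elements
    return 20.0 * np.log10(p / p.max())

# Example: a 6-element in-line array with 3 m spacing, evaluated at 40 Hz
elements = np.array([[i * 3.0, 0.0] for i in range(6)])
levels = array_directivity(elements, 40.0, np.arange(0, 360, 5))

A full treatment would additionally weight each element by its own source signature and couple the resulting directional source level into the 3-D propagation model, which is precisely the coupling issue the abstract highlights.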
In addition to monitoring the oceans for signs of nuclear explosions, International Monitoring System (IMS) hydroacoustic data have been used for a broad range of civil and scientific applications, including the study of submarine earthquakes. This work analyses T wave signals recorded at CTBT-IMS hydrophone station HA3 (Juan Fernández Islands, South Pacific Ocean) and triggered by Kermadec earthquakes from 2014 to 2022. These T waves present complex arrival characteristics at HA3, and different arrivals can be identified within the duration of the earthquake signals. Earlier arrivals are due to the conversion from seismic to acoustic waves far from the epicentre, in a generation zone covering the chain of seamounts along the Louisville Ridge and the Kermadec Trench slope. Later arrivals are mainly due to reflected propagation paths induced by bathymetric features along the roughly 8000 km path from the epicentre to HA3. Although the analysed earthquakes had different magnitudes, epicentres and fault mechanisms, their T waves present similar arrival characteristics at HA3. A discussion is provided on the similarity and correlation of the arrivals.
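As a minimal, hypothetical illustration of the arrival-similarity analysis mentioned above, the Python sketch below band-passes two T wave records from the same hydrophone, forms their Hilbert envelopes and reports the peak normalised cross-correlation. The filter band and sampling rate are assumptions; the metric actually used in the study may differ.

# Minimal sketch: similarity of two T wave envelopes via cross-correlation.
# Filter band and sampling rate are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, correlate

FS = 250  # Hz, assumed hydrophone sampling rate

def t_wave_envelope(trace: np.ndarray, lo: float = 2.0, hi: float = 40.0) -> np.ndarray:
    """Band-pass the record and return its normalised Hilbert envelope."""
    b, a = butter(4, [lo / (FS / 2), hi / (FS / 2)], btype="band")
    env = np.abs(hilbert(filtfilt(b, a, trace)))
    return env / env.max()

def arrival_similarity(trace_a: np.ndarray, trace_b: np.ndarray) -> float:
    """Peak normalised cross-correlation between two T wave envelopes."""
    ea, eb = t_wave_envelope(trace_a), t_wave_envelope(trace_b)
    n = min(len(ea), len(eb))
    ea, eb = ea[:n] - ea[:n].mean(), eb[:n] - eb[:n].mean()
    cc = correlate(ea, eb, mode="full")
    return float(cc.max() / (np.linalg.norm(ea) * np.linalg.norm(eb)))

A value close to 1 indicates that the two events produce nearly identical arrival structures at HA3, consistent with the observation that the arrivals are controlled by the propagation path rather than by the source parameters.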
The CTBTO verification system exists within the broader context of international organizations, global policy making and international collaboration, as well as public awareness and safety. The CTBT International Monitoring System is almost complete and readiness for On-site Inspection has been demonstrated. This panel explores which lessons learned can be transferred to other control regimes in the domain of weapons of mass destruction, and vice versa. Advances in science and technology can drive progress in advising on policies and solutions based on data and evidence, and can contribute to confidence building. Further, the panel addresses applications of verification technologies and identifies innovative solutions for change within the framework of the CTBT as well as other relevant agreements and arrangements. Specifically, the following questions and issues may be addressed:
- What does verification mean in the different regimes and how is (non-) compliance assessed (e.g., different principal approaches/concepts adopted)?
- How to keep abreast of advances in S&T and apply them for verification purposes.
- Lessons learned from different regimes, including the mechanisms adopted (e.g., TWGs, high-level panels, technology foresight).
- The role and contribution of civil society and industry towards advances in verification capabilities (ref. UNIDIR study).
Opening of the Closing session by Dr Robert Floyd, CTBTO Executive Secretary
Conclusions on SnT2023, by:
1) Ms Anne Lycke, Norwegian Seismic Array (NORSAR)
2) Mr Anders Ringbom, Swedish Defence Research Agency (FOI)
3) Ms Dorice Seif, Tanzania Atomic Energy Commission (TAEC)
Awards Ceremony:
• European Union (EU) Star Award handed over by His Excellency Mr Stephan Klement, Head of Delegation of the European Union to International Organisations in Vienna
• Early Career Scientist award handed over by Ms Deepti Choubey, Director, Knowledge Management and Human Resources Services
• Best oral presentation handed over by Ms Zeinabou Mindaoudou Souley, Director IDC and SnT2023 Project Executive
• Best e-poster presentation handed over by Mr Uday Dayal, Director ADM
• Best e-poster presentation handed over by Ms Xyoli Perez Campos, Director IMS
• Best e-poster presentation handed over by Mr Oleg Rozhkov, Director OSI
Closing of SnT2023 by Ms Mindaoudou Souley, IDC Director and SnT2023 Project Executive
Master of Ceremonies: Mr Martin Kalinowski, SnT2023 Coordinator