Speaker
Description
Due to the large amount of information created with each new event and its relocations, the SeisComP3 (SC3) database grows quickly and the backup process becomes increasingly time-consuming.
At CATAC (Central American Tsunami Advisory Center) we created a data import mechanism that sends the information related to every event from the main database to three other real-time servers: a front-end that disseminates products such as the hypocenter location, focal mechanism, and moment tensor on the web page, and two SC3 processing backups (see the sketch at the end of this description).
We also use this procedure to import solutions from the Earthquake Early Warning (EEW) server into the main server.
Thanks to these good practices, we keep the data protected and available in the main database, the servers, and the backups, while avoiding unnecessary traffic that would overload the network.
On every server, queries are made directly against the local database (localhost), so there is no network traffic and the main server is never saturated. Additionally, if a natural disaster or other event disrupts our data center, there will be no data loss.
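The sketch below is a minimal illustration of how such a per-event transfer could be scripted with the standard SeisComP3 command-line tools scxmldump (export one event to XML) and scdb (write that XML into a database). The abstract does not state which tools or options CATAC actually uses; the flag combination follows the SeisComP documentation examples, and the host names, credentials, and event ID are placeholders.

```python
#!/usr/bin/env python3
"""Hypothetical sketch of the per-event export/import step described above.

Assumptions (not taken from the abstract): connection strings, host names
and the event ID are placeholders; the exact scxmldump/scdb options used
at CATAC may differ.
"""
import subprocess

# Placeholder connection strings -- replace with real credentials/hosts.
MAIN_DB = "mysql://sysop:sysop@localhost/seiscomp3"            # main server (run locally)
TARGET_DBS = [
    "mysql://sysop:sysop@frontend.example.org/seiscomp3",      # web front-end
    "mysql://sysop:sysop@backup1.example.org/seiscomp3",       # SC3 processing backup 1
    "mysql://sysop:sysop@backup2.example.org/seiscomp3",       # SC3 processing backup 2
]


def export_event(event_id: str, xml_path: str) -> None:
    """Dump one event with picks, amplitudes, magnitudes and focal mechanisms to XML."""
    subprocess.run(
        ["scxmldump", "-fPAMF", "-E", event_id, "-o", xml_path, "-d", MAIN_DB],
        check=True,
    )


def import_event(xml_path: str, target_db: str) -> None:
    """Write the event XML directly into a target database with scdb."""
    subprocess.run(["scdb", "-i", xml_path, "-d", target_db], check=True)


if __name__ == "__main__":
    event_id = "catac2024abcd"            # hypothetical event ID
    xml_path = f"/tmp/{event_id}.xml"
    export_event(event_id, xml_path)
    for db in TARGET_DBS:                 # push the same XML to each real-time server
        import_event(xml_path, db)
```

Because each target database receives the event data directly, downstream products can be queried locally on each server without touching the main database over the network.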
Promotional text
Methodology of Good Practices in SeisComP3 Database Management at CATAC to Guarantee Issuance of Earthquake Products in Real Time for Decision-Making, Hazard Assessment and Risk Mitigation.