I am working in statistical seismology, and we keep running into a HIGHLY controversial topic: what can we say, based on data, about the largest possible earthquake that could occur in a given area? We make estimates, but how reliable are they? Both epistemic and aleatory (random) uncertainties are involved. There are many theoretical estimators for this quantity, but many scientists doubt that they have any practical value. I do not believe we seismologists are qualified to do more than "ramble" about the problem, and I think some input from philosophers would be extremely enlightening.
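To make the discussion concrete, here is a minimal Python sketch of one standard approach, a Kijko–Sellevoll-type point estimate of m_max under an assumed doubly truncated Gutenberg–Richter law: m_max_hat = m_obs + ∫ F(m)^n dm, integrated from m_min to the observed maximum m_obs. Everything in it is illustrative rather than authoritative: the b-value is fixed instead of estimated, the truncated-GR model itself is an assumption, and the naive bootstrap at the end is known to behave poorly for endpoint estimation; it is included only to show how wide the sampling uncertainty can be even in a well-specified toy model.

```python
import numpy as np
from scipy.integrate import quad

def gr_cdf(m, m_min, m_max, beta):
    """CDF of the doubly truncated exponential (Gutenberg-Richter) law."""
    return (1.0 - np.exp(-beta * (m - m_min))) / (1.0 - np.exp(-beta * (m_max - m_min)))

def kijko_sellevoll_mmax(mags, m_min, b=1.0):
    """Kijko-Sellevoll-type point estimate:
    m_max_hat = m_obs + integral_{m_min}^{m_obs} F(m)^n dm,
    with the unknown m_max inside F approximated by the observed maximum.
    """
    mags = np.asarray(mags)
    n = len(mags)
    m_obs = mags.max()
    beta = b * np.log(10.0)  # convert GR b-value to exponential rate
    integrand = lambda m: gr_cdf(m, m_min, m_obs, beta) ** n
    delta, _ = quad(integrand, m_min, m_obs)
    return m_obs + delta

# --- illustrative use with synthetic data (all parameters made up) ---------
rng = np.random.default_rng(42)
m_min, m_max_true, b = 2.0, 6.5, 1.0
beta = b * np.log(10.0)

# Draw n magnitudes from the truncated GR law by inverse-transform sampling.
n = 500
u = rng.random(n)
mags = m_min - np.log(1.0 - u * (1.0 - np.exp(-beta * (m_max_true - m_min)))) / beta

print("observed maximum :", mags.max())
print("m_max estimate   :", kijko_sellevoll_mmax(mags, m_min, b=b))

# Crude bootstrap, shown only to illustrate the spread of the estimator;
# the naive bootstrap is unreliable for distribution-endpoint problems.
boot = [kijko_sellevoll_mmax(rng.choice(mags, size=n, replace=True), m_min, b=b)
        for _ in range(200)]
print("bootstrap 5-95%  :", np.percentile(boot, [5, 95]))
```

Running this kind of toy experiment is exactly what fuels the controversy: the point estimate sits barely above the observed maximum, and its stated uncertainty depends entirely on model assumptions that the data cannot verify near the endpoint.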

I refer to the following papers:

Pisarenko, V. F. (1991). Statistical evaluation of maximum possible magnitude. Izvestiya, Earth Physics 27, 757–763.

Zöller, G., & Holschneider, M. (2016). The maximum possible and the maximum expected earthquake magnitude for production-induced earthquakes at the gas field in Groningen, The Netherlands. Bull. Seismol. Soc. Am. 106, 2917–2921.

Zöller, G. (2017). Comment on "Estimation of Earthquake Hazard Parameters from Incomplete Data Files. Part III. Incorporation of Uncertainty of Earthquake-Occurrence Model" by Andrzej Kijko, Ansie Smit, and Markvard A. Sellevoll. Bull. Seismol. Soc. Am. 107, 1975–1978.
