We have three geodynamic events that have displayed signal precursors predictive of destructive behavior: Mount St. Helens, Mount Pinatubo, and Mount Redoubt. What may distinguish the "predictability" of these events from other seismic processes is the distance of the locus of the precursor signals from the surface.
Although magma chambers tend to be deep, the vents leading to the surface produce signals that we can monitor in a pre-event state with a greater density of instrumentation. The harmonic signals so clearly observed at the surface give us some early indication of a violent event. Perhaps there aren't enough clear broadband signals reaching the surface from these deep earthquake sources to provide us with the advance warning we require.
In case you are interested, we interpreted the foreshock sequence as the initial closure of a dilated band formed during the interseismic period above the brittle-ductile transition, a band conjugate to the main fault activated during the mainshock. These are the related articles:
I know very well that in general the problem is not trivial. My question is addressed specifically to authors who worked on the L'Aquila 2009 foreshock sequence. In the recent literature on this specific case, I found some references that define it as a chaotic process, and others (e.g. Mignan, 2012) that describe a well-organized precursory seismicity pattern due to tectonic loading of the overall area.
Before discussing the seismicity pattern from a statistical point of view, which deserves a longer discussion and some refreshing of my memory, I would like to provide some observational evidence.
The L'Aquila foreshocks show a complex pattern and are not all located on the main shock fault. After the March 30 foreshock, seismicity migrated onto a different fault oriented nearly N-S. Seismicity returned to the main fault a few hours before the main shock.
Foreshocks are located in the same patches on the causative fault plane where aftershocks occurred. Repeaters identified within the aftershocks, and the few identified within the foreshock sequence, are located in the same areas.
The main shock nucleation occurred in an area characterized by a small amount of coseismic slip, and most of the energy was radiated nearly 1 s after nucleation and at a distance of roughly 2 km (if I remember well) from the hypocenter. Within this circle of 2 km radius around the hypocenter there is negligible slip and no foreshocks, aftershocks, or repeaters.
Rupture could not immediately propagate along strike to the SE because of a structural barrier, and it propagated only up-dip for nearly 2.5-3.5 seconds.
This demonstrates that the nucleation process and the seismicity evolution were very complex and heterogeneous. Even if we had been able to follow such evolution in real time during the sequence, this heterogeneity would have complicated any interpretation of the seismic sequence in terms of a precursory process.
These are observations and interpretations of scientific results.
Your questions concern statistical analyses of the seismic sequence and the answer(s) can only rely on a rigorous analysis of seismicity evolution through a robust statistical approach.
I will try to comment on this interesting question in a subsequent email.
There is a session on foreshocks at the AGU fall meeting this year. I will be there to hear different opinions on foreshocks. I suggest you be there too.
As Massimo says, addressing this problem is not simple or likely to produce a clear-cut answer.
Almost any earthquake sequence has unique aspects in retrospect, and these differences would be difficult if not impossible to identify in real time. Most ideas on nucleation do involve some accelerating process (including some of my own), but in practice most large earthquakes simply come out of the blue with the techniques available to us now. The L'Aquila event is unusual in that it did turn out to have a precursory sequence.
In the L'Aquila case it has been estimated that the chances of a magnitude 6 that day, using prior observations of spatio-temporal clustering and the elevated event rate at that level, were around 1 in 1000 (only rarely can we get up to 1 in 100 by such statistical analysis, especially in real time - see the ICEF report at http://www.geos.ed.ac.uk/homes/imain/igmpapers/LAquila.pdf). Even then, the lack of precursory strain at the surface for this event restricts the nucleation zone to less than 100 m or so, buried at a depth of around 10 km (see the section on 'negative evidence as a constraint' in http://sp.lyellcollection.org/content/367/1/215.full.pdf?ijkey=Ml1zKSDqZJWOQDj&keytype=finite).
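To make the "1 in 1000 versus 1 in 100" framing above concrete, here is a minimal back-of-the-envelope sketch. The background probability below is a purely illustrative placeholder, not a number from the ICEF report or any catalog:

```python
# Illustrative only: the background probability is an assumed placeholder,
# not a value taken from the ICEF report or any catalog.
background_daily_p = 2e-6  # assumed long-term daily chance of an M>=6 nearby
conditioned_p = 1e-3       # the "~1 in 1000" quoted for the day of the mainshock

# Probability gain: how much the clustering raised the odds relative to background
gain = conditioned_p / background_daily_p
print(f"probability gain: ~{gain:.0f}x over background")
print(f"absolute probability: still only {conditioned_p:.1%}")
```

The point of the sketch is the one made in the ICEF discussion: clustering can raise the relative probability by orders of magnitude while the absolute probability remains far too small to act on deterministically.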
So I think it's hard to explain the complex sequence at L'Aquila as a simple accelerating nucleation process, and clearly inter-event triggering and a pre-existing near-critical stress field, operating over much longer correlation length scales as described by Massimo, played a significant role even before the main event. This kind of complexity, and the fact that we can't see beforehand which parts of the crust are susceptible to nucleation or triggering by the tiny stress perturbations involved, place strong constraints on how we use such models in operational forecasting based on spatio-temporal clustering.
We haven't looked at the L'Aquila sequence here, but pinning down the best model with an element of triggering (with a stationary or non-stationary background rate) is just a very difficult practical problem of inference, even with state-of-the-art statistics, since the likelihood space turns out to be so flat: http://www.bssaonline.org.ezproxy.is.ed.ac.uk/content/104/2/885
According to my measurements and my opinion, the L'Aquila foreshock sequence was directly connected with the beginning and main stage of a so-called "stress wave", which was travelling across the whole of Europe from the Tonga region. This stress wave triggered the L'Aquila foreshock sequence, and a deformation wave after the Tonga EQ (M7.6) triggered the L'Aquila mainshock a week later.
Details of my measurement and Rn anomalies (G.Giuliani) are available here: https://www.researchgate.net/publication/262009135_Testimonies_to_the_L_Aquila_earthquake_%282009%29_and_to_the_L_Aquila_process
P.S. So-called "stress waves" are common before almost all M8 EQs. See:
Before looking at your references (the internet is slow at Frankfurt airport), could you please indicate to me the time window (February or March?) of the arrival of the "stress wave" around L'Aquila? What is the average velocity of the stress wave that propagated from Tonga? Thanks. I will take a look at your references.
Dear Alessandro, the "stress waves" are not elastic waves, but stress waves with periods longer than 12 hours (there is a real non-elastic deformation of the rock mass). Their velocities are at most the same as P-waves, but in fact it depends on the stress state in the rock mass. If the fissures and faults are open (small horizontal stress), then their velocity is small and they have large attenuation. In the case of higher stress in the crust, their attenuation is very small and they can be detected everywhere on the Earth's surface and underground. The stress wave from Tonga arrived in Europe at the beginning of March (2-6 March) and its period was approx. 14 days, i.e. it completed one period by the time of the Tonga EQ (March 19).
I put together a very hasty analysis (see figs attached) using a method I've been working with for a while. Clearly, we've a long way to go before we can start using the "p" word ("pre****tion"), but it appears that the smaller earthquakes on 5 April, and to a lesser extent the events in late March, initiate a period of accelerating seismicity. Similar sequences can be observed for Chi-Chi, Parkfield, Tohoku, Sumatra, El Mayor-Cucapah, and a few others -- I'll have some revised manuscripts posted in the next week or two.
It is very easy after the earthquake occurrence to find that there was some kind of acceleration before it, especially when the earthquake is anticipated by a seismic sequence of small/intermediate shocks. From this perspective a simple answer could be: yes, L'Aquila showed some sort of seismic acceleration, although it appeared to evolve as a chaotic process (see for example my article:
On the other hand, when a sequence is evolving, the problem of understanding what is going on is also related to which data to use (e.g. instantaneous or cumulative strain, moment release, or other real or pseudo-physical quantities) and how to handle them in order to eventually identify an acceleration. The literature is full of great examples, and only very few exceptions were published before the earthquake actually happened.
My opinion agrees with many (e.g. Massimo and Ian above), i.e. that the problem is very complex and difficult to cope with. To what was said previously I would add that we really need to analyse as many kinds of data as possible in order to establish with some confidence that something unusual is happening. (Of course it is not only a question of the amount of data but also of their quality.) For this reason, I would prefer to change the question to: what was the physics underlying the L'Aquila seismic sequence (or any other) that brought it to culminate in the mainshock of 6 April 2009?
Although still controversial, it would also be of great help to identify possible anomalies due to lithosphere-atmosphere-ionosphere coupling, which seems to provide some typical manifestations, for instance in the form of thermal atmospheric anomalies (e.g. see my article about the 2012 Emilia earthquake:
http://www.ann-geophys.net/32/187/2014/angeo-32-187-2014.html about two Chinese earthquakes).
That the problem is complex and difficult can in particular be deduced from the results of the latter article, where we show that ionospheric anomalies do not always appear: the conditions for them to occur are not simple, depending on solar magnetic activity, magnetic latitude, lithospheric conditions and, likely, the kind of fault mechanism, sufficient coupling energy, et cetera.
Although statistical analysis is important to grasp something of the question, I believe we need something more "physics-oriented" to identify those fore-patterns that eventually precede a large earthquake.
The problem with seismic pre-earthquake signals is that, in order to produce any reasonably detectable seismic signal, catastrophic ruptures have to take place in the crust. If we look at the extensive work by Wells and Coppersmith [Wells, D. L., and K. J. Coppersmith (1994), New Empirical Relationships among Magnitude, Rupture Length, Rupture Width, Rupture Area, and Surface Displacement, Bulletin of the Seismological Society of America, 84(4), 974-1002], we can extrapolate the size of the rupture areas necessary to create magnitude 3, 2, and 1 earthquakes. The numbers are impressive: 100,000 m2, 10,000 m2 and 1,000 m2 respectively.
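As a rough check of these numbers, one can extrapolate the Wells & Coppersmith all-slip-type rupture-area regression, log10(RA [km^2]) = -3.49 + 0.91 M, well below its calibration range (roughly M 4.8-7.9), as the argument above does. A minimal sketch, with that out-of-range extrapolation as the stated assumption:

```python
def rupture_area_m2(magnitude):
    """Wells & Coppersmith (1994) all-slip-type regression:
    log10(RA [km^2]) = -3.49 + 0.91 * M.
    Extrapolated here well below its calibration range (~M 4.8-7.9),
    so treat the results as order-of-magnitude estimates only."""
    log_area_km2 = -3.49 + 0.91 * magnitude
    return (10 ** log_area_km2) * 1e6  # km^2 -> m^2

for m in (1, 2, 3):
    print(f"M{m}: ~{rupture_area_m2(m):,.0f} m^2")
```

The extrapolated values come out within a factor of a few of the figures quoted above, which is as much as one can ask of a regression used outside its calibration range; each unit of magnitude multiplies the area by roughly a factor of ten.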
One has to raise the question: how much stress had to be accumulated to produce such large ruptures?
Turning this question around, we may ask: why don't we try to detect signals produced over the wide parameter space from an average stress level in a seismically active region to the stress levels necessary to create 100,000 m2, 10,000 m2 or 1,000 m2 ruptures? Obviously such signals cannot be "seismic" because nothing has ruptured yet, but such non-seismic precursory signals can be crucially important to learn a thing or two about approaching major earthquakes.
One type of signal is stimulated infrared emission from the ground, known as a Thermal Infrared (TIR) anomaly. It is called "thermal" because the emission takes place in the 7-14 micron region, where solid surfaces at temperatures around 300 K emit. However, we have shown that this has nothing to do with "heat" but is due to a different form of IR emission linked to the vibrational de-excitation of excited surface states.
A first lab demonstration has appeared in the (unfortunately defunct) journal eEarth:
Freund, F. T., A. Takeuchi, B. W. S. Lau, A. Al-Manaseer, C. C. Fu, N. A. Bryant, and D. Ouzounov (2007), Stimulated thermal IR emission from rocks: Assessing a stress indicator, eEarth, 2, 1-10.
Recently we have shown that, when applied to a case like the L'Aquila earthquake, the TIR anomaly is clearly and strongly visible days before this deadly event: Piroddi, L., G. Ranieri, F. Freund, and A. Trogu (2014), Geology, tectonics and topography underlined by L'Aquila earthquake TIR precursors, Geophysical Journal International, 197(3), 1532-1536.
Thanks a lot for your references and analysis. I will carefully read your 2014 paper. From the seismicity pattern analysis and from GPS measurements, some researchers reported to me that there is some evidence of a possible transient aseismic slip in mid-February (results under review) that changed the general stress over a large area. Did you start to record TIR anomalies in mid-February? And what about the general trend before February?
For the TIR anomalies from February 2009 to the start of May (and also for October 2008 only), you can look at my thesis (in Italian), available at:
http://veprints.unica.it/550/
Maps from November '08 to January '09 were computed after the thesis and after the first publication in IEEE JSTARS (cited in the last article, of 2014). In these maps other anomalies are present, but I have not yet done the comparison with minor seismicity (M3+). They are currently unpublished.
Dr. Freund, I think, is on the right track. It is my observation that the key to detecting a precursory signal (or in general, to isolating seismicity related to a particular event -- be it in the future or the past) is in the partitioning. This is to say, we can (and should) use micro-seismicity data, TIR, slip, UAVSAR/InSAR observations, and any other data that indicate movement, fracturing, or any other type of seismic activity in the crust. The trick, then, is to invert these observations to accurately model crustal deformation (stress/strain fields, etc.) or otherwise convolve the data into a single combined signal. As Dr. Freund suggests, the results can be interpreted with respect to the spatial (and temporal) extents over which the observation is made -- an anomalous TIR measurement over a region with length L~60 km can be interpreted with respect to an (impending?) m~7 earthquake. TIR appears to be a particularly interesting observable because, at least in principle, it can be observed from remote-sensing platforms.
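As a purely hypothetical illustration of "convolving the data into a single combined signal" (the series and the equal weighting below are made up for illustration, not anyone's actual method), one simple option is to z-score each observable and average the standardized series:

```python
import statistics

def zscores(series):
    """Standardize a series to zero mean and unit variance."""
    mu = statistics.fmean(series)
    sd = statistics.pstdev(series)
    return [(x - mu) / sd for x in series]

# Fake daily series for three hypothetical observables (illustration only):
micro_seismicity = [2, 3, 2, 4, 3, 8, 12, 15]                 # events/day
tir_anomaly      = [0.1, 0.0, 0.2, 0.1, 0.3, 0.9, 1.1, 1.4]   # K above baseline
strain_rate      = [1.0, 1.1, 0.9, 1.0, 1.2, 1.3, 1.6, 1.9]   # arbitrary units

# Combined anomaly index: mean of the three z-scored series, day by day
combined = [statistics.fmean(vals) for vals in
            zip(*map(zscores, (micro_seismicity, tir_anomaly, strain_rate)))]
print(" ".join(f"{c:+.2f}" for c in combined))  # rises toward the end of the window
```

A real inversion for stress/strain fields would of course be far more involved; the sketch only shows the "single combined signal" idea in its simplest form.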
Is L'Aquila 2009 foreshock sequence a non-critical precursory accelerating process or are foreshocks explained better by epidemic type triggering?
Ian G. Main says "... pinning down the best model with an element of triggering is just a very difficult practical problem of inference, even with state of the art statistics, since the likelihood space turns out to be so flat." Dr. Main gives lucid explanations to support his assertion.
I wish to add that there is a fundamental problem in discriminating foreshocks from other earthquakes, for want of a clearly recognizable distinction between them. Foreshocks are generally recognized as such only after a mainshock has occurred.
Dr. Main notes that "In the L'Aquila case it has been estimated that the chances of a magnitude 6.0 that day, using prior observation that spatio-temporal clustering and elevated event rate at this level, were around 1 in 1000 (only rarely can we get up to 1 in 100 by such statistical analysis)." In China, where earthquake records span centuries, the odds (of a magnitude > 5.0 earthquake occurring following a meaningful increase in seismic activity) are estimated to be only marginally better than 1 in 100.
Let me digress by mentioning a recent study I took part in. Launched in 1986 by the USGS, the Parkfield Earthquake Prediction Experiment (PPE) offers the best prospect to date of bagging a multitude of known precursors. The apparent absence of any reported precursors before the 2004 Parkfield earthquake (magnitude 6.0) motivated me, in 2008, to take a look at the PPE seismic data. Luckily, I didn't walk away empty-handed (Chun, Yuan, and Henderson, Bulletin of the Seismological Society of America, Vol. 100, No. 2, pp. 509-521, April 2010, doi:10.1785/0120090104). The precursor - a temporal rise in rupture-zone P-wave attenuation - was expected from published laboratory studies but had been elusive.
There is, I think, a fundamental problem with the purely seismological/geodetic approach to earthquake nucleation and to understanding why some (probably fairly random) fracture event deep in the crust can grow to become a magnitude 6.3 earthquake, as was the case for the L'Aquila disaster. As Ian Main points out, "the lack of precursory strain at the surface for this event restricts the nucleation zone to less than 100 m or so, buried at a depth of around 10 km". However, once the rupture started it grew to a size of about 200 square kilometers (see, for instance, Wells & Coppersmith 1994 for an estimate of rupture size as a function of magnitude). There is obviously a disconnect of some kind. It is quite inconceivable to me that a small nucleation zone of "less than 100 m or so" (probably a volume on the order of a 100 m cube) can turn into a large rupture of about 200 square kilometers, unless the stress necessary to create this large rupture had already accumulated along the much larger area/volume.
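The scale gap described above can be made explicit with trivial arithmetic; a quick sketch, assuming a square 100 m x 100 m nucleation patch as a stand-in for the "less than 100 m or so" zone:

```python
# Assumed, for illustration: a square nucleation patch 100 m on a side.
nucleation_side_m = 100.0
nucleation_area_km2 = (nucleation_side_m / 1000.0) ** 2  # = 0.01 km^2

rupture_area_km2 = 200.0  # ~M6.3 rupture area cited in the post above

ratio = rupture_area_km2 / nucleation_area_km2
print(f"rupture area ~ {ratio:,.0f}x the nucleation patch")
```

Four orders of magnitude separate the patch that could have nucleated undetected from the fault area that ultimately slipped, which is exactly the disconnect the post points to.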
The apparent disconnect points to the possibility that strain accumulating some 10 km deep in the crust is not faithfully transmitted to the surface, where the strain-measuring instruments are located. The most likely reason is that the 10 km of overlying rock does not behave as an ideal elastic body, or even close to it, but absorbs the piled-up stresses in some way that is not yet understood.
This is why we need to look at non-mechanical expressions of the stresses building up deep below and turn to solid-state physics to find other indicators. As Luca Piroddi has beautifully shown in his thesis under the guidance of Gaetano Ranieri, and since then published in several papers, there is anomalous excess infrared emission from the surface of the Earth around the L'Aquila region starting days, in fact weeks, before the M6.3 earthquake. This infrared emission occurs in the spectral window of thermal emission (at 300 K), around 10 micron. It is therefore widely called "thermal infrared" (TIR), and it has misled many researchers into believing that it is actual "heat". In reality there is irrefutable evidence that this so-called TIR emission is due to the radiative de-excitation of vibrationally highly excited states that form at the Earth's surface when electronic charge carriers, "positive holes", flow from the large stressed rock volume at 10 km depth to the surface and recombine. The recombination is exothermal, probably by as much as 2 eV or more, and leads to the emission of bands in the 10 micron region that are spectroscopically distinct from the 300 K blackbody (graybody) emission curve.
We all know that physics has long diversified into different subdisciplines, of which mechanics is one healthy example. However, when it comes to understanding the Earth, in particular something that happens under 10, 20, 30 km of rock, it is good to remember that physics also has subdisciplines covering electricity & magnetism, semiconductors, spectroscopy, and more. I hope that our friends in seismology, who have contributed so much insight through their work in mechanics, will recognize the marvelous opportunities offered by "all of physics" when it comes to unravelling the mysteries of the deep.
Thank you very much for your useful comments, which I will take into account. I am not so sure about the absence of precursory strain at the surface, even though the Gran Sasso strainmeters did not record any meaningful precursory signal. About phenomena at 10-20-30 km depth and the use of electricity & magnetism, semiconductors, etc. to investigate a larger volume, I fully agree with you.
My skeptical remarks concerning strain measurements at or near the surface of the Earth were meant to indicate my general frustration with the fact that the detectable strains are either so small, or spread over so wide an area, that they become very hard to measure. Then the question arises: why bother with strain measurements at all, spending an inordinate amount of time and effort on them?
Of course, I'm moved to say so because I trust our capability to use other physical indicators, which are much more easily accessible, often at zero cost - such as, for instance, the TIR data from MeteoSat that Luca Piroddi has been using. It all depends on understanding the underlying physical processes and how we can use them to learn something about pre-earthquake conditions that strain gauges just can't tell us.
I fully agree on opening up to other observables, but the attempt to investigate seismological and geodetic data in more detail cannot be considered a waste of time. Lowering the magnitude detection threshold and additional research on slow slip transients may not be useful for earthquake prediction or for operational decisions to be taken during a seismic sequence, but I hope they will lead to a better understanding of the complexity of using seismology or geodesy, and probably help us move forward and support integrated methods.
Dear Alessandro. I must admit that I have not studied the L'Aquila 2009 foreshock sequence. But in my book Advances in Earthquake Prediction, Research and Risk Mitigation, published by Springer in 2011, there is in the final chapter this sentence: "The Apennine seismic belt in Italy is another tectonically active place where fluids of deep origin are present in a low permeability crust. Correlation has been found there between seismic activity and CO2 degassing from the mantle. ... The models and methods described here (i.e. in my book) to monitor crustal processes may be applicable there also." If you have not already read my book (see also an article published the same year in BSSA by me, M. Bonafede, and G.B. Gudmundsson), I think it may be helpful for you to do so.

We don't find statistical methods to predict or forecast earthquakes useful, especially if they use only the typical "bulletin information" on seismic activity: time, hypocenter, and magnitude. Statistical methods to find "red spots", i.e. places to be studied better, are of course relevant. But we cannot assume that any two earthquakes have the same "precursors". Most of the big seismic precursors found afterwards with the available seismic networks, with sensitivity down to magnitude 2 or even 1, are shallow secondary effects of a deeper process, and it is very difficult to invert this activity to the deep process without having some other real-time information from down there. Other typical precursory effects originating in the uppermost, much-cracked and water-saturated part of the crust are often very well seen afterwards, but we will not understand before the earthquake what they are telling us about the deeper crustal processes unless we know what long-term process is ongoing there.
In Iceland our basic tool to understand the ongoing pre-earthquake process, long and short term, is a seismic system which is complete in detection down to magnitude 0, which means that we get very significant information from many earthquakes down to magnitude -0.5 - not only origin and size, but semi-continuous information on mechanisms and microcrack development, which helps us understand ongoing processes. The build-up process for an earthquake is very slow, and the signals we get directly from the nucleation zone are usually very weak compared to steadily occurring disturbances. The good thing is that nature's decision about where the next earthquake is going to happen, as well as the fault size, seems to be taken decades before - in the cases that we have studied in Iceland, on faults that were last active more than 300 years ago. High-pressure fluids from below penetrate into the nucleation volume of the impending earthquake to corrode the fault "contacts", and this strain-fluid corrosion process can be observed by microearthquake technology. In the SISZ transform zone (South Iceland, strike-slip earthquakes up to magnitude 7), the brittle crust is only 10 km thick. Our monitoring and modelling imply that transform slip in the deeper part of the crust at each earthquake fault has been ongoing for a very long time, but we have almost no seismicity there to tell us about this; we know of it indirectly, based on microearthquake information in the brittle crust (most intensive just at the brittle/ductile boundary). According to our experience, we don't expect strain measurements by GPS to tell us anything about a short nucleation process (I mean a process of a few weeks or less). However, such monitoring together with microearthquake information is significant for studying the long-term crustal process leading to large earthquakes, and helps to create a constitutive relationship for the pre-earthquake process.
The "biggest precursor" before the initial magnitude 6.6 earthquake of the year 2000 in the SISZ came from a nearby borehole strainmeter and started 19 days before the earthquake. Its origin is in the uppermost 1-2 kilometers of the crust. A very "beautiful precursor", but we did not understand what it meant until afterwards, when we had evaluated the ongoing pre-process from the microearthquake information. The same was true when water in a borehole 10 km from the epicenter sank 5 meters a day before the earthquake: the seismologists did not learn of this until afterwards. But I would not expect it to have led to a warning at that time, even if we had known about it earlier. It was a typical "precursor", i.e. afterwards it was understandable what it was saying. Both of these "precursors" would have been very helpful for providing a useful short-term warning if we had had time early enough to analyse the information brought to the surface by microearthquakes within the two-week period before the earthquake, during which the long-term process in the ductile crust was clearly brought up into the brittle crust. I very much agree with those who say we should analyse a multiplicity of data to model the ongoing pre-earthquake process. We should use all available information to create a constitutive relationship which governs this process and extrapolate this relationship into the near future; this can be done automatically and modelled on a very short-term basis. This is a deterministic approach in earthquake prediction research. Of course we need statistics to qualify the significance of the provided warnings, and those statistics can be based on the long-term experience of possibly similar measured events at the fault that did not lead to earthquakes. As I said, our basic tool is high-level microearthquake technology to study and map the ongoing process through time.
I am sure that more approaches may be possible to study the process inside and around the nucleation core of earthquakes, and especially below it in the ductile crust, but we have not so far tried other monitoring for that in Iceland. Unfortunately, in Iceland we have not yet developed an information and warning system that automatically brings together all evaluated results (mostly the multiplicity of microearthquake information) and does basic modelling in semi-real time. That is what is most urgently needed in Iceland for providing warnings ahead of earthquakes as well as ahead of volcanic eruptions. But we already have much information to put into it on a real-time basis.
I read your analysis of the seismic (foreshock) activity which - in retrospect - appears to have pointed to the approach of the L'Aquila earthquake. However, the answer would be different if we turn this problem on its head and ask: is there a reasonable chance to identify - during routine analysis of the seismicity of, say, the whole of Italy - any local conditions pointing to impending seismic activity? Rephrased differently, I'm wondering whether any seismic pre-earthquake pattern can be expected to stand out sufficiently clearly from the overall seismic chatter for the danger to be recognized.
From my distinctly non-seismologist perspective, and from having read quite a few papers on post-event seismological analyses, I have doubts that seismology can "pull off" a purely seismological alert system. The odds seem to be clearly stacked against the reliability of purely mechanical (seismological and geodetic) processes as a tool to recognize the build-up of monstrous stresses some 10-35 km below the surface of the Earth. That is why I dearly hope that the community will start accepting the notion that earthquake science is more than mechanical physics. Together we can do more.
Your opinions are welcome. Together, we could try to check what happened in mid-February 2009 (starting 12 February 2009) in the L'Aquila region. Foreshocks, if detected, may or may not be only a second-order effect of something monstrous at depth. However, did Luca's work detect something that could be linked with a change in the low-level seismicity in mid-February? Couldn't it be that the hidden monster (shown by non-mechanical methods) is linked directly or indirectly with small seismic perturbations at the surface? I think an effort could be made on this specific case even if you do not believe in the mechanical approach.