This is a question about earthquake amplitude attenuation.

Given your experience observing or simulating earthquake waveforms, I hope you can offer some insight on the matter.

I'm trying to simulate PGD/PGA attenuation along a given azimuth for a magnitude 5.5 earthquake on a reverse fault (strike 215°, dip 50°, rake 84°), using a point-source simulation.

The process is to compute synthetic seismograms at a line of equally spaced stations (~5 km apart), using a 1-D layered velocity model.
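For concreteness, here is a minimal sketch of the receiver geometry described above; the variable names (`azimuth_deg`, `spacing_km`, etc.) are my own assumptions, not part of any particular simulation package:

```python
import numpy as np

# Hypothetical setup: 40 receivers along one azimuth, ~5 km apart.
azimuth_deg = 45.0
n_stations = 40
spacing_km = 5.0

# Epicentral distances of the stations (5, 10, ..., 200 km).
distances_km = spacing_km * np.arange(1, n_stations + 1)

# Local Cartesian offsets from the epicentre (x = east, y = north),
# measuring azimuth clockwise from north as usual.
az = np.radians(azimuth_deg)
x_km = distances_km * np.sin(az)
y_km = distances_km * np.cos(az)
```

The same grid can be rotated to azimuth 125° by changing `azimuth_deg`.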

We consider only the horizontal components (radial and transverse); PGD/PGA is taken as the maximum of the two.
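The measurement step can be sketched as follows; the toy sine traces below simply stand in for simulated radial and transverse seismograms and are not real synthetics:

```python
import numpy as np

def pga_from_horizontals(radial, transverse):
    """PGA taken as the larger peak absolute amplitude of the two
    horizontal components (the convention described above)."""
    return max(np.max(np.abs(radial)), np.max(np.abs(transverse)))

# Toy traces standing in for simulated horizontal-component seismograms.
t = np.linspace(0.0, 10.0, 1001)          # 10 s at 100 Hz
radial = 0.3 * np.sin(2 * np.pi * 1.0 * t)
transverse = 0.5 * np.sin(2 * np.pi * 0.5 * t)

pga = pga_from_horizontals(radial, transverse)  # here, 0.5
```

Plotting `pga` against epicentral distance for each station gives the attenuation curves in question.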

The results I'm getting look strange to me.

The attached figure shows the PGA at 40 stations along two directions: azimuth 45° and azimuth 125°.

I was expecting a smooth, monotonic attenuation from the nearest station to the farthest one. Instead, I got two sudden decreases, each followed by an increase in amplitude (around 56 km and 130 km in the attached figure).

I ran several tests, and this anomaly (if it is one) doesn't seem to be affected by the event depth or by seismic muting.

I would like to know whether, under such conditions (ideal conditions without any amplification factors at the surface), we should expect amplitude amplification due to ordinary wave behaviour such as multiples.

Or is this expected, especially for reverse faults, since the chosen azimuths (45° and 125°) are roughly parallel and perpendicular to the strike?

Or is there another explanation, or an error that I'm missing?

Thank you in advance for your help.