In time-domain electromagnetic (TEM) geophysical exploration systems, the raw data recorded after turn-on or turn-off are averaged over a series of time windows. This averaging is not uniform: I understand that the time windows used for averaging by different systems follow a geometric progression. What I do not understand is why the first window has to open at exactly 0.087 ms and close at 0.109 ms. I understand that 0.100 ms is the time at which we want our first data point, and that it is approximately the average of 0.087 and 0.109, but it could just as well be the average of two different bounds. I have also seen other systems use similar start times.
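For context, here is a minimal Python sketch of one common gating scheme (an assumption for illustration, not tied to any particular instrument): window edges spaced geometrically at a fixed number of gates per decade. With 10 gates per decade the edge ratio is 10^(1/10) ≈ 1.259, so a gate opening at 0.087 ms closes near 0.110 ms with its midpoint near 0.098 ms, close to the numbers quoted above.

```python
import numpy as np

# Assumed scheme: gate edges spaced geometrically, with a constant
# ratio between consecutive edges set by the gates-per-decade count.
gates_per_decade = 10               # assumed value; systems differ
r = 10 ** (1.0 / gates_per_decade)  # edge-to-edge ratio, ~1.2589

t_start = 0.087                     # first gate opening time (ms), from the question
n_gates = 5
edges = t_start * r ** np.arange(n_gates + 1)  # geometric edge sequence

for lo, hi in zip(edges[:-1], edges[1:]):
    print(f"gate: {lo:.3f} - {hi:.3f} ms, center ~ {(lo + hi) / 2:.3f} ms")
# first gate: 0.087 - 0.110 ms, center ~ 0.098 ms (quoted as ~0.100 ms)
```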
