Let's consider a very simple case of linear absorption. Sorry for the ugly mathematical expressions.

Two counter-propagating plane waves, e^{ikz} and e^{-ikz}, meet in a layer of weakly absorbing medium and form a standing wave (SW). Let's assume the natural absorption coefficient alpha is so small that the SW's amplitude is ~2cos(kz). Its intensity is then I = (2cos(kz))^2 = 2 * (1 + cos(2kz)), while the intensity of each traveling wave is 1.
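Just as a quick numerical sanity check of these intensities, here is a small Python sketch (the wavenumber k = 1 and the grid are arbitrary assumed values, and the units are normalized so that each traveling wave has intensity 1):

import numpy as np

k = 1.0
z = np.linspace(0.0, 2 * np.pi / k, 10001)        # one full wavelength
E_sw = np.exp(1j * k * z) + np.exp(-1j * k * z)   # superposition of the two waves
I_sw = np.abs(E_sw) ** 2                          # = (2*cos(kz))^2 = 2*(1 + cos(2kz))

print(I_sw.max())    # ~4 at the antinodes
print(I_sw.mean())   # ~2, the sum of the two traveling-wave intensities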

Now let's calculate the absorbed power density (intensity) dP in a thin layer dz placed at one of the SW's maxima. It should be dP = alpha * I * dz = 4 * alpha * dz, shouldn't it? At the same time, each traveling wave loses alpha * dz when propagating through dz, which sums to 2 * alpha * dz -- two times less than dP. How can this difference be explained?
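To put numbers on the two ways of counting side by side, here is a rough Python sketch (alpha, k and dz are arbitrary assumed values; the wavelength average at the end is only a numerical cross-check of the stated formulas, not an answer to the question):

import numpy as np

alpha, k, dz = 1e-3, 1.0, 1e-4

# Standing-wave bookkeeping: dP = alpha * I(z) * dz, with I(z) = 2*(1 + cos(2kz))
I_antinode = 2 * (1 + np.cos(2 * k * 0.0))        # slice placed at an antinode, z = 0
dP_local = alpha * I_antinode * dz
print(dP_local / (alpha * dz))       # -> 4

# Traveling-wave bookkeeping: each unit-intensity wave loses alpha * dz
dP_traveling = 2 * alpha * dz
print(dP_traveling / (alpha * dz))   # -> 2

# Local absorption averaged over a full wavelength, for comparison
z = np.linspace(0.0, 2 * np.pi / k, 100001)
dP_avg = alpha * np.mean(2 * (1 + np.cos(2 * k * z))) * dz
print(dP_avg / (alpha * dz))         # -> ~2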
