An antenna with 50 dB gain transmits 0 dBm, and a receiver antenna (0 dB gain) at a distance of 10 m receives 10 dBm. How does this happen? The path loss over 10 m is taken as 40 dB, and the frequency of operation is 2400 MHz. This appears to violate the law of conservation of energy. Is there any way to prove this?
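For reference, a minimal Python sketch of the link-budget arithmetic described above, together with the free-space path loss the Friis equation would predict at 10 m and 2400 MHz (the 40 dB figure is the question's own assumption, not the Friis value):

```python
import math

# Link budget with the numbers stated in the question (dB / dBm terms)
tx_power_dbm = 0.0     # transmit power
tx_gain_db   = 50.0    # transmit antenna gain
rx_gain_db   = 0.0     # receive antenna gain
path_loss_db = 40.0    # path loss assumed in the question

rx_power_dbm = tx_power_dbm + tx_gain_db + rx_gain_db - path_loss_db
print(f"Received power (question's numbers): {rx_power_dbm:.1f} dBm")  # 10.0 dBm

# Free-space path loss predicted by the Friis model:
# FSPL(dB) = 20*log10(4 * pi * d * f / c)
d_m  = 10.0            # distance in metres
f_hz = 2400e6          # 2400 MHz
c    = 299_792_458.0   # speed of light, m/s
fspl_db = 20 * math.log10(4 * math.pi * d_m * f_hz / c)
print(f"Friis free-space path loss at 10 m, 2.4 GHz: {fspl_db:.1f} dB")  # ~60 dB
```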
