How do we represent zero interference on the dB scale?
I am using the following formulation to calculate Received_Power_TH (TH stands for threshold), the power of the received signal required for successful decoding.
SINR = Received_Power - Interference - Noise

The Received_Power that satisfies SINR_TH (in other words, the minimum Received_Power required for successful decoding) is then given by

SINR_TH = Received_Power_TH - Interference - Noise

or, rearranged,

Received_Power_TH = SINR_TH + Interference + Noise
However, I get the following results, which seem counterintuitive.
Assuming SINR_TH = 25 dB, Interference = -50 dBm, Noise = -95 dBm:

Received_Power_TH = 25 + (-50) + (-95) = -120 dBm
Now if there is no interference, e.g. when only one node is transmitting, we get

Received_Power_TH = 25 + (-95) = -70 dBm
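The two computations above can be reproduced with this minimal Python sketch, which treats the dB/dBm values as plain numbers exactly as in my formulation (the helper name received_power_th is just for illustration):

```python
def received_power_th(sinr_th_db, noise_dbm, interference_dbm=None):
    """Threshold received power in dBm, adding the dB-domain terms directly.

    interference_dbm=None models the 'no interference' case by simply
    dropping the interference term from the sum.
    """
    if interference_dbm is None:
        return sinr_th_db + noise_dbm
    return sinr_th_db + interference_dbm + noise_dbm

# With interference: 25 + (-50) + (-95) = -120 dBm
with_interference = received_power_th(25, -95, interference_dbm=-50)

# Without interference: 25 + (-95) = -70 dBm
without_interference = received_power_th(25, -95)

print(with_interference, without_interference)
```

This reproduces exactly the numbers in my example, including the counterintuitive jump from -120 to -70 when the interference term is removed.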
This result seems counterintuitive: the Received_Power_TH required for successful reception with no interference should drop below (or at least not exceed) the value we get with interference, yet here Received_Power_TH is -120 dBm with interference and rises to -70 dBm without it.
Many thanks