14 December 2020

We are using an R&S FSV3004 spectrum analyser. Like many modern spectrum analysers, it measures in dBm. To attempt a noise measurement (normalised per 1 Hz of bandwidth) we put a 10 kΩ resistor in front of a gain-1000 LNA. The output noise is expected to be the quadratic sum of the 6 nV/√Hz of our LNA and the 13 nV/√Hz thermal noise of the 10 kΩ resistor (√(4kTR)), multiplied by 1000: so about 14 µV/√Hz.
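The expected output density above can be checked with a short sketch (the 300 K temperature is an assumption on my part; the other values are those quoted in the question):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed temperature, K
R = 10e3            # source resistor, ohm
gain = 1000.0       # LNA voltage gain
e_lna = 6e-9        # LNA input voltage noise, V/sqrt(Hz)

# Johnson-Nyquist thermal noise of the resistor
e_thermal = math.sqrt(4 * k_B * T * R)             # ~13 nV/sqrt(Hz)
# quadratic (RSS) sum of the two sources, referred to the LNA output
e_out = gain * math.sqrt(e_lna**2 + e_thermal**2)  # ~14 uV/sqrt(Hz)
print(f"thermal: {e_thermal*1e9:.1f} nV/rtHz, output: {e_out*1e6:.1f} uV/rtHz")
```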

Using an old HP89410 spectrum analyser we measure 7 µV/√Hz (one half of the 14 µV/√Hz, because of the 50 Ω matching). So this is consistent ...

Using our new FSV3004 spectrum analyser, we measure -92 dBm with a 1 Hz RBW (and -62 dBm with a 1 kHz RBW). We try to recover the √S noise spectral density: P[dBm] = 10·log10(P_measured / 1 mW). Taking into account the 50 Ω matching, we use the formula √Sv = √(10^(-92/10) × 1 mW × 50 Ω) = √(10^(-92/10) × 0.05) ≈ 5.6 µV/√Hz, which is about 25% less than the expected value (and the value measured with the HP89410).
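The dBm-to-density conversion I am using can be written out explicitly (the helper name `dbm_to_vrthz` is mine; the 50 Ω and the RBW values are those from the measurement):

```python
import math

def dbm_to_vrthz(p_dbm, rbw_hz=1.0, r_ohm=50.0):
    """Convert a spectrum-analyser reading in dBm (measured within rbw_hz)
    to a voltage noise density in V/sqrt(Hz), assuming the power is
    dissipated in r_ohm."""
    p_watt = 10 ** (p_dbm / 10) * 1e-3   # dBm -> W
    s_v = p_watt * r_ohm / rbw_hz        # voltage PSD, V^2/Hz
    return math.sqrt(s_v)

print(dbm_to_vrthz(-92.0) * 1e6)             # ~5.6 uV/sqrt(Hz), 1 Hz RBW
print(dbm_to_vrthz(-62.0, rbw_hz=1e3) * 1e6) # same density from the 1 kHz RBW reading
```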

Is there a noise-equivalent-bandwidth factor that we have to take into account? 25% would be compatible with a second-order filter ... but how can we know the shape/order of the RBW filter of such equipment? Or is the RBW already normalised to an ideal rectangular (top-hat) response? I hope so.
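To make the suspected correction concrete: if the RBW filter has an equivalent noise bandwidth ENBW = k × RBW rather than an ideal top-hat of width RBW, a density computed with the nominal RBW should be divided by √k. This is a sketch of that rescaling only; the k values tried below (1.0 for a top-hat, ~1.065 often quoted for Gaussian RBW filters) are assumptions, not FSV3004 specifications:

```python
import math

def enbw_corrected(v_per_rthz, enbw_factor):
    """Rescale a voltage density that was computed with the nominal RBW,
    assuming the filter's ENBW = enbw_factor * RBW."""
    return v_per_rthz / math.sqrt(enbw_factor)

raw = 5.6e-6  # uncorrected density from the -92 dBm / 1 Hz reading
for k in (1.0, 1.065):  # top-hat vs. commonly quoted Gaussian ENBW factor
    print(f"k = {k}: {enbw_corrected(raw, k) * 1e6:.2f} uV/rtHz")
```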

What is wrong in my interpretation of this noise measurement in dBm?
