Hi,

I'm analyzing a system that includes an AD8367 variable gain amplifier (VGA). To characterize the system, I'm interested in the total output RMS voltage noise, which is equal to the standard deviation of the samples taken by an ADC/oscilloscope.

The only noise quantity given in the datasheet is the noise figure NF. So I'm interested in how this quantity is determined, and how the NF can be converted to the output RMS voltage noise.

The VGA input impedance is 200 Ohm. Is the VGA input tied to ground and the output noise measured as a function of gain, with the input noise taken as the thermal noise of the 200 Ohm input resistance? That would give F = N_out / (Gain * N_in), where N_in = sqrt(4 * k * T * 200 Ohm * B) is the RMS thermal noise voltage over the bandwidth B.
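For reference, the RMS thermal noise voltage of the 200 Ohm input resistance can be computed with a few lines of Python (a minimal sketch; the 1 MHz bandwidth is an assumed example value, not from the datasheet):

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
T = 290.0          # standard noise temperature, K
R = 200.0          # AD8367 input resistance, ohms
B = 1e6            # example measurement bandwidth, Hz (assumed)

# RMS thermal noise voltage of the 200 ohm resistor over bandwidth B:
# v_n = sqrt(4 * k * T * R * B)
v_n = math.sqrt(4 * k * T * R * B)
print(f"v_n = {v_n * 1e6:.3f} uV RMS")  # about 1.79 uV RMS for these values
```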

Or is the input terminated as in the test setup shown in Fig. 45, where it is matched by a 50 Ohm resistor, giving an effective 50 Ohm source resistance, and (with the output matching network driving a 50 Ohm load) an additional 50 Ohm resistor contributing noise at the output?

I'm really confused about how to get the RMS voltage noise from a given noise figure; maybe someone can give me a hint.

BR

NF is referenced to 200 Ohms. This is stated in the datasheet in the captions of Figures 6 and 7. Your noise factor equation looks correct, and from it you can calculate the output voltage noise density and, by integrating over the bandwidth, the output RMS voltage noise.
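To illustrate the conversion, here is a minimal Python sketch going from NF (referenced to 200 Ohms) to output noise density and integrated RMS noise. The NF, gain, and bandwidth values are placeholders for illustration, not datasheet numbers:

```python
import math

k = 1.380649e-23      # Boltzmann constant, J/K
T = 290.0             # standard noise temperature, K
R_s = 200.0           # source resistance the NF is referenced to, ohms
NF_dB = 7.5           # example noise figure, dB (assumed value)
gain_dB = 20.0        # example VGA voltage gain, dB (assumed value)
B = 100e6             # example noise bandwidth, Hz (assumed value)

# Thermal noise voltage density of the 200 ohm source: sqrt(4*k*T*R)
e_n_src = math.sqrt(4 * k * T * R_s)      # V/sqrt(Hz), about 1.79 nV/sqrt(Hz)

F = 10 ** (NF_dB / 10)                    # noise factor (power ratio)
G = 10 ** (gain_dB / 20)                  # voltage gain (linear)

# Noise factor relates total output noise power to the amplified source
# noise power, so the output voltage density picks up a factor sqrt(F):
e_n_out = math.sqrt(F) * G * e_n_src      # V/sqrt(Hz)

# Integrated RMS output noise over the noise bandwidth B
v_rms_out = e_n_out * math.sqrt(B)        # V RMS
print(f"output density  : {e_n_out * 1e9:.1f} nV/sqrt(Hz)")
print(f"output RMS noise: {v_rms_out * 1e6:.0f} uV")
```

Note that this assumes a flat gain and NF across B; in practice you would integrate the measured output noise density over the actual noise bandwidth of the signal chain.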