I am hoping for some advice on measuring the mV/dB slope of the AD8307 in our application. We use it in a solar radio spectrometer that receives solar emissions from 45 to 870 MHz and down-converts through two conversion stages to a 10.7 MHz IF. The IF noise bandwidth is about 280 kHz and feeds the AD8307, whose output is digitized by an ADC and collected by the software that controls the receiver. The AD8307 slope is not adjusted.
We use a 15 dB ENR (9460 K equivalent noise temperature) noise source to measure the receiver noise figure, which is between 7 and 8 dB. We are also trying to measure the slope of the AD8307 with a similar setup, as follows:
1. Inject noise, sweep the receiver, and collect ADC values at each frequency.
2. Insert a 10 dB attenuator between the noise source and the receiver, sweep the receiver again, and collect new ADC values at each frequency.
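For reference, the intended calculation from the two sweeps can be sketched as follows (the voltages here are hypothetical, and the ADC readings are assumed to be converted to volts):

```python
def slope_mv_per_db(v_ref, v_atten, atten_db=10.0):
    """mV/dB slope from the detector voltages of the two sweeps,
    assuming the injected power really drops by atten_db."""
    return (v_ref - v_atten) * 1000.0 / atten_db

# The nominal AD8307 slope is about 25 mV/dB, so a true 10 dB step
# should move the output by roughly 250 mV:
slope = slope_mv_per_db(2.25, 2.00)  # hypothetical readings: 25.0 mV/dB
```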
See attached for the noise calculations for our setup.
The above has not worked because the receiver noise temperature (about 1537 K) is higher than the attenuated noise temperature from the noise source (1207 K), so the receiver's own noise dominates the attenuated measurement. Depending on how we set this up, the measured slope comes out anywhere from 5 mV/dB to 30 mV/dB. In other words, a wrong measurement because of a wrong setup.
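To quantify the problem: since the detected power is proportional to the input noise temperature plus the receiver noise temperature, the 10 dB pad produces a much smaller change at the AD8307. A sketch using the numbers quoted above:

```python
import math

T0    = 290.0                       # reference temperature, K
T_src = T0 * (10**(15/10) + 1)      # 15 dB ENR source on: ~9460 K
L     = 10**(10/10)                 # 10 dB attenuator, assumed at T0
T_att = T_src / L + T0 * (1 - 1/L)  # source seen through the pad: ~1207 K
T_rx  = 1537.0                      # receiver noise temperature (~8 dB NF)

# Detected power goes as (T_input + T_rx), so the actual step is:
step_db = 10 * math.log10((T_src + T_rx) / (T_att + T_rx))
print(round(step_db, 1))            # ~6.0 dB, not the nominal 10 dB
```

So dividing the voltage difference by 10 dB instead of the true ~6 dB step would by itself distort the computed slope.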
We then tried a low-noise amplifier (about 1 to 2 dB noise figure, 20 dB gain) in front of the receiver to amplify the noise source and lower the noise figure of the system. However, our measurements show the slope of the AD8307 averaging about 20 mV/dB across the frequency band, which we know is incorrect and which we believe, again, is due to our setup. I think we are overlooking something obvious, but we are out of ideas.
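Applying Friis's cascade formula to the LNA case with the same numbers (the 1.5 dB LNA noise figure is an assumption; the gain and other temperatures are as quoted above) suggests the effective step is still somewhat short of 10 dB:

```python
import math

def db_to_lin(db):
    return 10 ** (db / 10)

T0    = 290.0
T_lna = T0 * (db_to_lin(1.5) - 1)   # assumed 1.5 dB NF LNA: ~120 K
G_lna = db_to_lin(20.0)             # 20 dB LNA gain
T_rx  = 1537.0                      # bare receiver (~8 dB NF)
T_sys = T_lna + T_rx / G_lna        # Friis cascade: ~135 K

T_src = T0 * (db_to_lin(15.0) + 1)  # noise source on: ~9460 K
T_att = T_src / 10 + T0 * 0.9       # through the 10 dB pad: ~1207 K

step_db = 10 * math.log10((T_src + T_sys) / (T_att + T_sys))
print(round(step_db, 1))            # ~8.5 dB rather than 10 dB
```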
Can anyone suggest how to make in-circuit measurements of the slope in this application?