I am relatively inexperienced with ADCs, so my apologies if this question is a bit naive, but I could not find a clear answer here.
I am using an SDR receiver built on the LTC2145-14 clocked at 125 MHz in order to measure the power of weak sinusoidal signals. The analogue front end of the ADC is quite simple; it is the one in the datasheet of the device: a 50 ohm resistor is connected to the high-impedance differential A+ and A- analogue inputs, and a wideband 1:1 balun connects this 50 ohm resistor to the input SMA port of the receiver. There is no analogue op-amp in the way, just a very low-loss anti-aliasing filter with a 52 MHz cut-off frequency to prevent reception of signals above the Nyquist frequency of 62.5 MHz. With this set-up the input impedance of the receiver is a nice and flat 50 ohm from 100 kHz to about 50 MHz.
To measure power, I simply convert the signal received in dBFS into actual real-world dBm. Using an external RF power meter, I have determined empirically that the relationship between dBFS and dBm in my set-up is -10 dB. It does not change much across the whole dynamic range of the SDR receiver, nor across its frequency range. For instance, a signal generator delivering -40 dBm into my 50 ohm input impedance RF power meter will correspond to a reading of -50 dBFS when it is connected to my 50 ohm input impedance SDR receiver.
The trouble is that I am unable to explain this result by looking at the datasheet of the LTC2145-14! The SENSE pin (pin 63) of the chip is set to 0 V, which corresponds to a 1 V peak-to-peak full scale, i.e. +4 dBm into a 50 ohm load. So if 0 dBFS corresponds to +4 dBm, then by the same logic a signal of -40 dBm should correspond to a reading of -44 dBFS, but as explained above, I measure -50 dBFS, not -44 dBFS! There is a 6 dB difference that I cannot explain in any way.
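For reference, the full-scale arithmetic I am using can be written down explicitly. This is just the standard sine-power formula for a resistive load; nothing here is specific to the LTC2145-14 beyond the two range values:

```python
import math

def vpp_to_dbm(vpp, r_ohms=50.0):
    """Convert a sine's peak-to-peak amplitude into power (dBm) in a resistive load."""
    vrms = vpp / (2.0 * math.sqrt(2.0))        # sine: Vrms = Vpp / (2*sqrt(2))
    p_watts = vrms ** 2 / r_ohms               # P = Vrms^2 / R
    return 10.0 * math.log10(p_watts / 1e-3)   # dB relative to 1 mW

# The two full-scale ranges of the LTC2145-14 (SENSE = 0 V and SENSE = VDD):
print(round(vpp_to_dbm(1.0), 1))   # 1 Vpp into 50 ohm -> ~ +4.0 dBm
print(round(vpp_to_dbm(2.0), 1))   # 2 Vpp into 50 ohm -> ~ +10.0 dBm
```

The gap between the two ranges is almost exactly 6 dB, which is the same size as the anomaly I am seeing.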
I concluded that it is the SDR software that is responsible for this difference, but I am not really sure. Can someone explain to me where this 6 dB difference comes from? Is it the SDR software that I use? Is my reasoning erroneous, and if so, can someone explain why?
I'm not familiar with the LTC2145 or the SDR software at all, but I thought I'd bring up a couple of semi-random points I use when doing a similar exercise with a different family of ADCs. A lot of this will be common sense, but we might as well cover it for completeness and as a sanity check.
You are probably beyond this but I thought I'd mention some of these generic things in case they help.
I hope your project goes well.
Thanks for your reply.
Yes, I have factored in all the points except maybe your last bullet point regarding the ADC gain adjustment. As I am not familiar with ADCs, I am not sure whether I factored it in correctly. As you can read below, it may (?) be the explanation.
As I explained, the LTC2145-14, and I think many other Analog Devices ADCs, have a SENSE pin that allows selecting different input ranges for the ADC. When this sense pin is set to ground, the input range is 1 Vpp, corresponding to +4 dBm into 50 ohm; when it is set to VCC, it is 2 Vpp, corresponding to +10 dBm into 50 ohm, and that is precisely a 6 dB difference...
My expectation (but I am not sure) is that if I select the 1 Vpp range, the noise floor of the ADC that I measure in a 500 Hz bandwidth should also rise by 6 dB, i.e. from -123 dBFS (500 Hz bandwidth) to -117 dBFS (500 Hz bandwidth), but it does not happen! The noise floor stays the same whether I select 1 Vpp or 2 Vpp... That may in fact be the problem? Maybe the device or the PCB is somehow damaged and I am actually unable to select the 1 Vpp range for some reason.
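Incidentally, the -123 dBFS figure in a 500 Hz bandwidth is close to what one would expect from spreading the ADC's full-Nyquist SNR over the measurement bandwidth. The SNR value below is an assumed ballpark figure for a 14-bit ADC of this class, not a number taken from the actual datasheet, so treat the result as illustrative:

```python
import math

# Expected noise floor in a narrow measurement bin, assuming the ADC noise
# is white across the first Nyquist zone. SNR_DBFS is an assumed value
# (~73 dB is typical for this class of 14-bit ADC); check the real spec.
FS = 125e6        # sample rate, Hz
BIN_BW = 500.0    # digital measurement bandwidth, Hz
SNR_DBFS = 73.0   # assumed full-Nyquist SNR, dB relative to full scale

processing_gain = 10.0 * math.log10((FS / 2.0) / BIN_BW)  # ~51 dB
noise_floor_dbfs = -SNR_DBFS - processing_gain
print(round(noise_floor_dbfs, 1))   # ~ -124 dBFS in a 500 Hz bin
```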
In the ADCs I'm familiar with, increasing the reference voltage, along with allowing for a greater input amplitude, also increases the noise density. You still get improved SNR but not as much as you might hope. I don't know if this comes into play with what you are looking at.
I'll ask one of my colleagues who knows about the LTC2145 and its sense function to jump in to this conversation.
In an ideal system you are correct: dropping to the one-volt range you will lose 6 dB of SNR. But you are likely dominating the noise of the ADC with other noise sources, so this 6 dB will be reduced. What is the jitter on the clock, what is the input bandwidth you are sampling, and what is the frequency of the input signal? If you have a noisy amplifier that isn't being filtered, you can degrade the noise floor to a point where the sense range won't matter. If you are undersampling a high frequency and you have jitter on the clock, the SNR might be degraded so the sense voltage may not matter.
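As a rough way to see whether clock jitter could matter, the jitter-limited SNR for a full-scale sine can be sketched with the usual formula. The jitter and input-frequency values below are placeholders, not measurements from your board:

```python
import math

def jitter_snr_db(f_in_hz, jitter_s):
    """Jitter-limited SNR for a full-scale sine: -20*log10(2*pi*f_in*t_j)."""
    return -20.0 * math.log10(2.0 * math.pi * f_in_hz * jitter_s)

# Illustrative values: the SNR ceiling drops as the input frequency rises.
print(round(jitter_snr_db(10e6, 1e-12), 1))   # 10 MHz input, 1 ps rms jitter -> ~84 dB
print(round(jitter_snr_db(60e6, 1e-12), 1))   # 60 MHz input, 1 ps rms jitter -> ~68.5 dB
```

If the jitter-limited SNR falls near or below the ADC's own SNR at your input frequency, the full-scale range will stop being the dominant factor.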
If you send a schematic I will have a better picture of where your noise is coming from.
I am actually using two different analogue front ends with two LTC2145-14 circuits. Each behaves differently when I select an input range of 1 V peak-to-peak instead of 2 V peak-to-peak. I am trying to understand whether this difference in behaviour is normal; from your answer I think it is, but I would be grateful if you could confirm or refute this. The two analogue front-end circuits are based on the ones described in figures 3 and 7 of the datasheet.
In the circuit based on figure 3 of the datasheet, the analogue front end is a 1:1 balun designed to operate between 5 and 60 MHz, so within the first Nyquist zone (the sampling rate is 125 MHz). The noise floor measured in a 500 Hz bandwidth is -123 dBFS whatever the input range selected (1 Vpp or 2 Vpp). There is no anti-alias filter connected to the balun; I have just connected a 50 ohm load to it.
In the circuit based on figure 7, I am using an op-amp (LTC6403-1) to interface to the ADC differentially instead of a 1:1 balun, so the output feeds the ADC differentially.
The LTC6403-1 has a voltage gain of 2 (using two input resistors of 402 ohm and two feedback resistors of 804 ohm). The input of the LTC6403-1 is differential too; it is fed by a 1:16 (impedance-ratio) transformer with a voltage gain of 4. The input impedance of the circuit is therefore 50 ohm. The sampling rate is the same (125 MHz). The LTC6403-1 is equipped with an internal low-pass filter that acts as an anti-alias filter.
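The gain and impedance arithmetic for this front end can be sanity-checked as follows, assuming the 1:16 figure is the transformer's impedance ratio (so its voltage gain is the square root, i.e. 4):

```python
import math

# Figure-7 style front end: 1:16 impedance-ratio transformer, then an
# LTC6403-1 with Rin = 402 ohm and Rf = 804 ohm per side (values above).
z_ratio = 16.0                       # transformer impedance ratio
xfmr_v_gain = math.sqrt(z_ratio)     # voltage gain = sqrt(impedance ratio) = 4
amp_v_gain = 804.0 / 402.0           # op-amp voltage gain = Rf / Rin = 2

total_v_gain_db = 20.0 * math.log10(xfmr_v_gain * amp_v_gain)
print(round(total_v_gain_db, 1))     # ~18.1 dB of voltage gain overall

# Differential load at the amp side is 402 + 402 = 804 ohm, reflected
# through the transformer by the impedance ratio:
z_in = (402.0 + 402.0) / z_ratio
print(round(z_in, 2))                # ~50.25 ohm, close to the 50 ohm target
```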
With this second set-up, when I measure the noise floor of the ADC in a bandwidth of 500 Hz, it is -117 dBFS when I select 1 Vpp and -123 dBFS when I select 2 Vpp.
(Note regarding figure 7: I am using an LTC6403-1, the input is differential unlike in the picture above, and it is fed by a 1:16 RF transformer.)
Why this difference between the two configurations?
The difference between the two configurations is that the transformer will not add additional noise, but the amplifier will. The LTC6403 has 2.8 nV/√Hz of noise at the input; if there is gain, the output noise will be higher. Depending on the filtering you have between the amplifier and the ADC, all of that noise will fold into the first Nyquist zone. This noise folding will degrade the SNR and make the difference between the 2 V range and the 1 V range less noticeable.
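As a back-of-envelope sketch, you can estimate how the amplifier's own noise reads in dBFS in a narrow bin. I'm assuming the 2.8 nV/√Hz input-referred figure, a noise gain of 3 for this feedback network (1 + Rf/Rin), and only in-band noise, i.e. ignoring any wideband noise that folds back from higher Nyquist zones; a proper noise analysis would have to account for that folding:

```python
import math

en_in = 2.8e-9          # input-referred noise density, V/rtHz (assumed)
noise_gain = 3.0        # 1 + Rf/Rin for a gain-of-2 feedback network
bin_bw = 500.0          # digital measurement bandwidth, Hz

v_noise_rms = en_in * noise_gain * math.sqrt(bin_bw)  # rms noise in the bin

def dbfs(v_rms, full_scale_vpp):
    """Express an rms voltage relative to a full-scale sine of the given Vpp."""
    fs_rms = full_scale_vpp / (2.0 * math.sqrt(2.0))
    return 20.0 * math.log10(v_rms / fs_rms)

print(round(dbfs(v_noise_rms, 2.0), 1))  # relative to the 2 Vpp range
print(round(dbfs(v_noise_rms, 1.0), 1))  # 6 dB worse on the 1 Vpp range
```

Comparing this estimate against the measured dBFS noise floor shows directly whether the amplifier or the ADC dominates in each range.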
I get the point about the noise added by the amplifier and why it can make the difference between 1 Vpp and 2 Vpp less noticeable if the amp is very noisy and an external noise source swamps the ADC. However, this is a generic consideration: quite useful for my understanding of how ADCs work, thank you, but it does not directly answer my specific question, as I have no external noise source connected (in both configurations the input of the circuit is connected to a 50 ohm load, not to an antenna or another active stage such as an amplifier). Please also note that when I speak about the noise floor, I am referring to the dBFS reading of my SDR receiver, measured in a 500 Hz bandwidth (digital filtering).
Let me recap my conclusions, and please correct me if my interpretation of your explanation is wrong:
1) In the first config, with just a 1:1 transformer and no op-amp, the noise floor in dBFS should increase by 6 dB if I select 1 Vpp instead of 2 Vpp (I am considering the noise generated by the 50 ohm load to be negligible).
=> The fact that I do NOT see the noise floor jumping by 6 dB, and that it stays at the very same level whether I select 1 Vpp or 2 Vpp, indicates a problem with the switching to the 1 Vpp input range, which most likely does not happen.
Does that make sense?
2) In the second config, with an op-amp (6 dB gain) fed by a 1:16 transformer (12 dB gain), the 1:16 transformer does not add noise but the op-amp does. If the op-amp were ideal (no noise generated), the noise floor in dBFS should also jump by 6 dB if I select 1 Vpp instead of 2 Vpp. However, since the op-amp is not ideal and generates noise, the noise floor will jump by a minimum of 6 dB.
-> The fact that I see the noise floor jumping by exactly 6 dB indicates that the noise generated by the op-amp is negligible. That is not surprising, since the power gain of the op-amp is only 6 dB.
3) Going back to the initial question of this thread about converting dBFS into dBm.
=> The 6 dB anomaly that I pointed out at the beginning of this thread (related to the 1:1 balun config) is most likely due to the fact that selecting 1 Vpp does not work (the same issue as 1) above).
Thanks for your help