Converting dBFS to dBm with an SDR receiver based on the LTC2145-14 clocked at 125 MHz

Hello,

I am relatively inexperienced with ADCs, so my apologies if this question is a bit naive, but I could not find a clear answer here.

I am using an SDR receiver built on the LTC2145-14, clocked at 125 MHz, to measure the power of weak sinusoidal signals. The analogue front end of the ADC is quite simple; it is the one from the device datasheet: a 50 ohm resistor is connected across the high-impedance differential A+ and A- analogue inputs, and a wideband 1:1 balun connects this 50 ohm resistor to the input SMA port of the receiver. There is no analogue op amp in the signal path, just a very low-loss anti-aliasing filter with a 52 MHz cut-off frequency to prevent reception of signals above the Nyquist frequency of 62.5 MHz. With this set-up the input impedance of the receiver is a nice, flat 50 ohm from 100 kHz to about 50 MHz.

To measure power, I simply convert the signal level received in dBFS into real-world dBm. Using an external RF power meter, I have determined empirically that the offset between dBFS and dBm in my set-up is -10 dB. It does not change much across the whole dynamic range of the SDR receiver, nor across its frequency range. For instance, a signal generator delivering -40 dBm into my 50 ohm input impedance RF power meter corresponds to a reading of -50 dBFS when connected to my 50 ohm input impedance SDR receiver.

The trouble is that I am unable to explain this result from the datasheet of the LTC2145-14! The SENSE pin (pin 63) of the chip is set to 0 V, which corresponds to a 1 V peak-to-peak full scale, i.e. +4 dBm into a 50 ohm load. So if 0 dBFS corresponds to +4 dBm, then by the same logic a signal of -40 dBm should correspond to -44 dBFS, but as explained above I measure -50 dBFS, not -44 dBFS! There is a 6 dB difference that I cannot explain in any way.
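
For reference, here is the arithmetic I am using (a minimal Python sketch, assuming an ideal, lossless front end):

```python
import math

# Full-scale sine on the 1 Vpp range, into the 50 ohm termination:
v_rms = 1.0 / (2 * math.sqrt(2))                  # 1 Vpp -> ~0.354 V rms
p_fs_dbm = 10 * math.log10(v_rms**2 / 50 / 1e-3)  # ~ +4 dBm
print(p_fs_dbm)
# So a -40 dBm input should read about -40 - 4 = -44 dBFS, not the -50 dBFS I see.
```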

I concluded that the SDR software is responsible for this difference, but I am not really sure. Can someone explain to me where this 6 dB difference comes from? Is it the SDR software that I use? Or is my reasoning erroneous, and if so, can someone explain why?

Thanks
Regards
Peter 



  • Analog Employees on Jun 11, 2019

    Hi Peter,

    I'm not familiar with the LTC2145 or the SDR software at all, but I thought I'd bring up a couple of semi-random points I use when doing a similar exercise with a different family of ADCs. A lot of this will be common sense, but we might as well cover these for completeness and as a sanity check.

    • dBm is a unit of power. In calculating power from voltage and resistance, we need to be sure to use the rms value of the voltage. dBFS is also a relative power measurement, but it is with respect to the full-scale voltage of the ADC. The full-scale voltage range of an ADC is a value that is compared against the peak-to-peak value of an AC signal, so when converting from dBm to dBFS you need to keep the rms to peak-to-peak conversion in mind (see the sketch after this list).
    • Calculation of voltage from power requires a known resistance value. Make sure in your different scenarios that you are representing the load resistances properly. If the input impedance of the ADC is high, the load resistance will be determined by what is on the board.
    • There will be an insertion loss associated with your balun. Might this account for some of the discrepancy?
    • How does your power meter work? Does it introduce any additional load that might produce unexpected results?
    • Is there a gain adjustment in the ADC that might be affecting your results?
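
    To make the first bullet concrete, here is a minimal sketch of the conversion (illustrative names and default values; it assumes a sine-wave input and 0 dBFS defined as a full-scale sine):

    ```python
    import math

    def dbm_to_dbfs(p_dbm, full_scale_vpp=1.0, r_load=50.0):
        """Sine-wave power in dBm (into r_load) relative to a full-scale
        sine of full_scale_vpp peak-to-peak (0 dBFS)."""
        v_rms = math.sqrt(10 ** (p_dbm / 10) * 1e-3 * r_load)  # input rms voltage
        v_fs_rms = full_scale_vpp / (2 * math.sqrt(2))         # full-scale rms voltage
        return 20 * math.log10(v_rms / v_fs_rms)

    print(dbm_to_dbfs(4.0))    # ~0 dBFS: +4 dBm is full scale on the 1 Vpp range
    print(dbm_to_dbfs(-40.0))  # ~-44 dBFS
    ```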

    You are probably beyond this but I thought I'd mention some of these generic things in case they help.

    I hope your project goes well.

    Doug

  • Hi Doug,

    Thanks for your reply.

    Yes, I have factored in all those points, except maybe your last bullet point regarding the ADC gain adjustment. As I am not familiar with ADCs, I am not sure whether I accounted for it correctly. As you can read below, it may (?) be the explanation.

    As I explained, the LTC2145-14, and I think many other Analog Devices ADCs, has a SENSE pin that allows selecting different input ranges for the ADC. When this SENSE pin is set to ground, the input range is 1 Vpp, corresponding to +4 dBm into 50 ohm; when it is set to VCC, it is 2 Vpp, corresponding to +10 dBm into 50 ohm, and that is precisely a 6 dB difference...

    My expectation (but I am not sure) is that if I select the 1 Vpp range, the noise floor of the ADC that I measure in a 500 Hz bandwidth should also rise by 6 dB, i.e. from -123 dBFS (500 Hz bandwidth) to -117 dBFS (500 Hz bandwidth), but it does not happen! The noise floor stays the same after I select 1 Vpp instead of 2 Vpp... Maybe that is in fact the problem? Perhaps the device or the PCB is somehow damaged and I am actually unable to select the 1 Vpp range for some reason.
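
    To make my expectation concrete, a minimal sketch (the -113 dBm figure is a hypothetical fixed front-end noise level in 500 Hz):

    ```python
    # If the input-referred noise power in dBm is fixed, the floor in dBFS
    # moves with the full-scale power of the selected range.
    noise_dbm = -113.0                      # hypothetical fixed noise in 500 Hz
    fs_dbm = {"2Vpp": 10.0, "1Vpp": 4.0}    # full-scale sine power into 50 ohm
    for rng, fs in fs_dbm.items():
        print(rng, noise_dbm - fs, "dBFS")  # -123 vs -117 dBFS: a 6 dB jump
    ```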

    Regards

    Peter

  • Hello Clarence,

    I am actually using two different analogue front ends with two LTC2145-14 circuits. Each behaves differently when I select an input range of 1 V peak-to-peak instead of 2 V peak-to-peak. I am trying to understand whether this difference in behaviour is normal; from your answer I think it is, but I would be grateful if you could confirm or deny. The two analogue front-end circuits are based on the ones described in figures 3 and 7 of the datasheet.

    In the circuit based on figure 3 of the datasheet, the analogue front end is a 1:1 balun designed to operate between 5 and 60 MHz, so within the first Nyquist zone (the sampling rate is 125 MHz). The noise floor measured in a 500 Hz bandwidth is -123 dBFS whatever input range is selected (1 Vpp or 2 Vpp). There is no anti-alias filter connected to the balun; I have just connected a 50 ohm load to it.

    In the circuit based on figure 7, I am using an op amp (LTC6403-1) to interface to the ADC differentially instead of a 1:1 balun, so the output feeds the ADC differentially:

    www.analog.com/.../64031fb.pdf

    The LTC6403-1 has a voltage gain of 2 (it uses two input resistors of 402 ohm and two feedback resistors of 804 ohm). The input of the LTC6403-1 is differential too; it is fed by a 1:16 transformer (voltage gain of 4). The input impedance of the circuit is therefore 50 ohm. The sampling rate is the same (125 MHz). The LTC6403-1 is equipped with an internal low-pass filter that acts as an anti-alias filter.
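
    A quick sanity check of these numbers (a minimal sketch; 1:16 is the impedance ratio, so the turns ratio is 1:4):

    ```python
    import math

    z_ratio = 16.0              # 1:16 impedance-ratio transformer
    n = math.sqrt(z_ratio)      # turns ratio of 4 -> voltage gain of 4 (12 dB)
    r_in_amp = 2 * 402.0        # differential input resistance of the amp stage
    print(r_in_amp / z_ratio)   # ~50.25 ohm seen at the input port
    g_amp = 804.0 / 402.0       # LTC6403-1 voltage gain of 2 (6 dB)
    print(20 * math.log10(n * g_amp))  # ~18 dB total voltage gain
    ```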

    With this second set-up, when I measure the noise floor of the ADC in a 500 Hz bandwidth, it is -117 dBFS when I select 1 Vpp and -123 dBFS when I select 2 Vpp.

    (Note regarding figure 7: I am using an LTC6403-1, the input is differential unlike in the picture above, and it is fed by a 1:16 RF transformer.)

    Why this difference between the two configurations?

    Regards

    Peter

  • Analog Employees on Jun 17, 2019, in reply to on7yi

    The difference between the two configurations is that the transformer will not add additional noise, but the amplifier will. The LTC6403 has 2.8 nV/rtHz of noise at its input; with gain it will be higher. Depending on the filtering you have between the amplifier and the ADC, all of that noise can fold into the first Nyquist zone. This noise folding degrades the SNR and makes the difference between the 2 V range and the 1 V range less noticeable.
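
    For scale, a rough comparison of the amplifier's noise density against the thermal noise of a 50 ohm source (a minimal sketch, room temperature assumed):

    ```python
    import math

    k, T, r = 1.38e-23, 290.0, 50.0
    en_source = math.sqrt(4 * k * T * r)  # ~0.9 nV/rtHz for a 50 ohm source
    en_amp = 2.8e-9                       # LTC6403 input-referred noise, V/rtHz
    print(en_source * 1e9, en_amp / en_source)  # amplifier noise ~3x the source
    ```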

  • Thanks Clarence,

    I get your point about the noise added by the amplifier and why it can make the difference between 1 Vpp and 2 Vpp less noticeable if the amp is very noisy and an external noise source swamps the ADC. However, this is a generic consideration, quite useful for my understanding of how ADCs work, thank you, but it does not directly answer my specific question, as I have no external noise source connected (in both configurations the input of the circuit is connected to a 50 ohm load, not to an antenna or another active stage such as an amplifier). Please also note that when I speak about the noise floor, I am referring to the dBFS reading of my SDR receiver, measured in a 500 Hz bandwidth (digital filtering).
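
    As a quick check that the 50 ohm termination itself contributes negligible noise (a minimal sketch, room temperature assumed):

    ```python
    import math

    k, T, B = 1.38e-23, 290.0, 500.0
    p_dbm = 10 * math.log10(k * T * B / 1e-3)  # available thermal power in 500 Hz
    print(p_dbm)  # ~ -147 dBm, far below the ~ -113 dBm floors discussed here
    ```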

    Let me recap my conclusions, and please correct me if my interpretation of your explanation is wrong:

    1) In the first config, with just a 1:1 transformer and no op amp, the noise floor in dBFS should increase by 6 dB if I select 1 Vpp instead of 2 Vpp (I am considering the noise generated by the 50 ohm load to be negligible, as per the sketch above).

    => The fact that I do NOT see the noise floor jumping by 6 dB, and that it stays at the very same level whether I select 1 Vpp or 2 Vpp, indicates a problem with the switching to the 1 Vpp input range, which most likely does not happen.

     

    Makes sense?

    2) In the second config, with an op amp (6 dB gain) fed by a 1:16 transformer (12 dB gain), the 1:16 transformer does not add noise but the op amp does. If the op amp were ideal (no noise generated), the noise floor in dBFS should also jump by 6 dB if I select 1 Vpp instead of 2 Vpp. However, since the op amp is not ideal and generates noise, the noise floor will jump by a MINIMUM of 6 dB.

    => The fact that I see the noise floor jumping by 6 dB indicates that the noise generated by the op amp is negligible. That is not surprising, since the power gain of the op amp is only 6 dB.

    Makes sense?

    3) Going back to the initial question of this thread, about converting dBFS into dBm:

    => The 6 dB anomaly that I pointed out at the beginning of this thread (related to the 1:1 balun config) is most likely due to the fact that selecting 1 Vpp does not work (the same issue as in point 1 above).

    Makes sense?

    Thanks for your help

    Regards

    Peter

     

  • Analog Employees on Jun 18, 2019, in reply to on7yi

    Can you send raw data? You are correct: if you only have a 50 ohm source connected and you change from the 2 V range to the 1 V range, then you will see the noise floor jump.

    Are you using our demo board or is this on your own board? Can you send a schematic and layout?

  • The two boards are Red Pitaya 125-14 models. I have modified the analogue front end as described above:

    - a 1:1 balun for one board, and

    - an LTC6403-1 op amp fed differentially by a 1:16 transformer for the other board.

    For the analogue front end there is no diagram other than the ones I have already provided, so you have all the info. For the rest, the only publicly available info is this:


    https://dl.dropboxusercontent.com/s/jkdy0p05a2vfcba/Red_Pitaya_Schematics_v1.0.1.pdf

    For the raw data samples, I can do that; I just need a bit of time.

    Anyway, now that I know how it should work, I will inspect the PCB for a possible fault preventing selection of the 1 Vpp range on the 1:1 balun board.

    Regards

    Peter

  • Analog Employees on Jun 20, 2019, in reply to on7yi

    From the schematic it looks like you are operating in the 1 V range.

  • Yes, that is true; that is the default setting, but I modified it... As I said, there is something wrong with the board with the 1:1 balun: whatever the voltage on the SENSE pin, the noise floor stays at -123 dBFS, corresponding to -113 dBm; setting the SENSE pin to 0 or 1 does not change the noise floor in dBFS. Now I understand that this is not normal and will check what the problem is.

    On the other board everything works as expected.

  • OK, I have checked the PCB of the board with the 1:1 balun; there was a fault, and the SENSE pin was stuck at 0 V. Now when I set the SENSE pin to +VCC, the noise floor decreases by 6 dB, which is exactly what is expected.

    I have made some measurements, and my conclusion is that there is not much benefit in setting the SENSE pin to 0 V (1 Vpp) instead of +VCC (2 Vpp). The measured noise floor in dBFS is 6 dB higher when the SENSE pin is set to 1 Vpp (ground), but the noise floor in dBm is exactly the same whether I choose 1 Vpp or 2 Vpp! No change! (Remember there is no external noise; the input is connected to a 50 ohm load.) As a matter of fact, using a range of 1 Vpp instead of 2 Vpp only results in a loss of 6 dB of dynamic range; everything else seems to be the same. So for this application (an SDR receiver), I do not see the point.

    I also compared with the other board: the op amp and the transformer have a theoretical gain of 18 dB (6 dB for the op amp and 12 dB for the transformer), and the noise floor in dBm drops from -113.5 dBm to -125 dBm, so about 11.5 dB better. That is logical, as it corresponds to the noise-free gain of the RF transformer (12 dB). The SNR increases by the same 11.5 dB, of course, since the transformer is noise-free.
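
    A quick check of that arithmetic (a minimal sketch using the figures above):

    ```python
    floor_balun_dbm = -113.5  # 1:1 balun board, 500 Hz bandwidth
    floor_amp_dbm = -125.0    # transformer + LTC6403-1 board
    # ~11.5 dB improvement, close to the 12 dB noise-free transformer gain:
    # the amplifier's own noise roughly cancels its 6 dB of gain.
    print(floor_balun_dbm - floor_amp_dbm)
    ```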

    The LTC6403-1 in the second configuration does not improve the signal-to-noise ratio, as it raises the noise floor by about 5.5 dB for a gain of 6 dB; its value, however, is in providing an optimal input impedance of 804 ohms to the 1:16 (1:4 turns ratio) RF transformer, so that the input impedance stays flat, close to 50 ohms, over the entire first Nyquist zone.

    Thanks for your help; things are conceptually clear now.

  • There is just one mystery that remains unsolved. 

    For both boards, independently of how I set the SENSE pin, the conversion factor between dBFS and dBm is 6 dB too low.

    For instance, on the board with the 1:1 balun, with the SENSE pin set to ground (1 Vpp, so +4 dBm into 50 ohms), I measure a conversion factor of -10 dB; for example, -20 dBm at the input of the ADC corresponds to a reading of -30 dBFS in my receiver software. This makes full scale (a 0 dBFS reading on my receiver) correspond to +10 dBm. The problem is that 1 V peak-to-peak into 50 ohm is +4 dBm, not +10 dBm, so either the receiver software is playing some tricks and a reading of 0 dBFS does not correspond to full scale (+4 dBm), or there is a 1/2 resistive divider at the RF input of the chip that is not documented in the datasheet.
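
    Here is the arithmetic showing why a factor of two in voltage matters (a minimal sketch):

    ```python
    import math

    def sine_dbm(vpp, r=50.0):
        """Power of a sine of vpp peak-to-peak into r, in dBm."""
        v_rms = vpp / (2 * math.sqrt(2))
        return 10 * math.log10(v_rms**2 / r / 1e-3)

    print(sine_dbm(1.0))  # ~ +4 dBm: a 1 Vpp full-scale sine
    print(sine_dbm(2.0))  # ~ +10 dBm: 2 Vpp is exactly 6 dB above 1 Vpp, so a
    # factor-of-two error in the assumed full-scale voltage (e.g. peak vs
    # peak-to-peak) would produce exactly the 6 dB offset I observe.
    ```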

    I do not think the receiver software is playing tricks, because I measure with it a -1 dB compression point (at 10 MHz) around -3 dBFS, which seems totally normal if 0 dBFS corresponds to ADC full scale. That leaves only the possibility of some strange behaviour in the chip itself, or some misinterpretation of the LTC2145-14 specs on my part.

    This behaviour does not change if I set the reference voltage to 2 Vpp instead of 1 Vpp, or if I use the other board with an op amp and a 1:16 RF transformer. The conversion factor from dBFS to dBm is always 6 dB too low, and I cannot explain it, as all the input impedances involved are strictly 50 ohms.

    I am really curious to understand the reason for this behavior and would appreciate your insight on this.

    Regards

    Peter

  • Analog Employees on Jul 8, 2019, in reply to on7yi

    The benefit of using the 1 V range is SFDR if you are using an anemic driver that can't drive 2 Vpp without distorting.