
Very noisy signal at TX output

Hi AD support,

I'm trying to run some preliminary DPD tests with an ADS9-V2EBZ + EVAL-ADRV9029-HB on TES-GUI-6.3.0.5. I'm using profile 50 (link sharing) with a Tx input rate of 122.88 MHz, and therefore 1228800 samples for the 10 ms signal input (see the quick sample-count check at the end of this post). However, when I connect my spectrum analyzer directly to the Tx output, it shows a very noisy constellation:

This is my original LTE signal:


And this is the signal shown in the TES GUI, which seems to be clean with no clipping:

After some initial debugging, I found that the IQ offset shown in the VSA is very high, which could be related to carrier feedthrough. I therefore suspect that I'm missing some configuration or RF chain calibration.

My signal can be found attached: modulated_readback_Tx1.txt

My VSA setup can be found attached: LTE_DL_20MHz_DEMOD_ConstellationView3.zip
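
As a quick cross-check of the sample count quoted above, a minimal sketch (only the Tx input rate and the 10 ms duration from this post are used):

# Sanity check: samples needed for a 10 ms capture at the profile's Tx input rate
tx_input_rate_hz = 122.88e6   # Tx input (IQ sample) rate from profile 50
duration_s = 10e-3            # 10 ms of signal

num_samples = int(round(tx_input_rate_hz * duration_s))
print(num_samples)            # 1228800, matching the waveform file loaded into the TES GUI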



  • The reference clock does not seem to be clean enough; spurs are observed on it, and therefore on the Tx output as well. The requirements for the reference clock are explained in the UG, page 73.

    I believe you are providing the reference clock from an RF signal generator; if not, can you try with one?

    Are you testing on the HB variant of the board? The Tx output power seems to be low. Are you using any external attenuator, or the chip's internal Tx attenuation?

    Page 264 of the UG explains how to run the Python scripts. Go to the IronPython tab in the GUI, load the script, and run it.
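
    For illustration, a minimal sketch of what such a script body can look like; it assumes `link` and `Types` are already provided by the GUI's "New script" template (which imports from the device DLL and connects to the board), and it only uses the PllFrequencyGet read-back that appears later in this thread:

    # Sketch only: 'link' and 'Types' are assumed to come from the "New script"
    # template; the call below is the read-back used later in this thread.
    lo2 = link.platform.board.Adrv9010Device.RadioCtrl.PllFrequencyGet(
        Types.adi_adrv9010_PllName_e.ADI_ADRV9010_LO2_PLL, 0)
    print "LO2 PLL frequency [Hz]: " + str(lo2[1])   # tuple appears to be (status, value)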

  • Yes, the clock is from an RF signal generator. I will look up the requirements.

    I’m using HB variation, with no attenuator except -10 dB attenuation while generating tonetest.

    I understand how to run Python code, but I have not understood how to run the C API code.

  • Are you using your custom board or the EVB? If you are using the EVB, you can directly run the script that I shared earlier, following the UG.

    Otherwise, for a custom board, you need to develop the C API code and test with that.

  • Hi Ramarao,

    Today I conducted 3 tests on this problem:

    1. Tone test on the Tx port. Before that, I changed my clock source to a Keysight N5182B and checked its phase noise with 500 Hz and 2 MHz spans, which show at least an 80 dB difference:

    However, the tone that I get from the Tx port is worse. I checked it with 500 Hz and 2 MHz spans, which show a difference of only 35 dB:

    2. QPSK single-carrier test on the Tx port. Instead of an OFDM signal, I tried a simpler single-carrier test. The constellation of my generated signal is very clean:

    However, the signal from the Tx port is still noisy, in a circular pattern:

    3. QPSK single-carrier test on the Rx port, looped back to Tx. While trying the Rx-Tx loopback test as you suggested, the same problem persists:

    From all these tests and observations, it seems to me that there is phase noise in my testbed. However, as I have checked, it is not coming from my clock source, so I conclude that it comes from either the PLL or the LO. I didn't try the C API to check the PLL locks, as I don't understand how to; the UG is not very clear about it.

    What do you think I should check next? Should I start the board replacement process?

  • My understanding is that you are using the EVB for all these measurements. You can run these scripts directly in the IronPython tab of the GUI. Below are the scripts to check the PLL loop bandwidth and the PLL lock status of all the LOs. Also check once whether all the LO supplies are within specification.

    PLL_Scripts.zip
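
    For orientation only, the lock-status part of such a script would do something along the lines of the sketch below. The method name PllStatusGet, its argument, and the bit ordering are my assumptions, not confirmed against the attached scripts, which remain the authoritative reference:

    # Rough sketch of a per-PLL lock check ('link' comes from the script template;
    # the PllStatusGet name, its argument and the bit layout are ASSUMED here).
    result = link.platform.board.Adrv9010Device.RadioCtrl.PllStatusGet(0)
    lock_bits = result[1]                                      # assumed (status, bitfield) tuple
    pll_names = ["CLK PLL", "LO1 PLL", "LO2 PLL", "Aux PLL"]   # assumed bit order
    for bit, name in enumerate(pll_names):
        if (lock_bits >> bit) & 1:
            print name + " locked"
        else:
            print name + " unlocked"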

  • Yes, it is an EVB-HB. Thanks, I will try to run those and keep you updated.

  • Hi Ramarao,

    This is what I get from the script:

    Connected
    Aux PLL locked
    LO2 PLL locked
    LO1 PLL unlocked
    CLK PLL locked

    Connected
    LO2 FREQUENCY = <adrv9010_dll.Types.adi_adrv9010_PllLoopFilterCfg_t object at 0x000000000000002B [adrv9010_dll.Types.adi_adrv9010_PllLoopFilterCfg_t]> HZ loop filter BW = 100 Phase margin = 60 Power scale = 10
    (0, <adrv9010_dll.Types.adi_adrv9010_PllLoopFilterCfg_t object at 0x000000000000002B [adrv9010_dll.Types.adi_adrv9010_PllLoopFilterCfg_t]>)

    The output from the "PLL_Loop_filter_Status" file seems to be buggy (it prints the config object where the frequency should be), so I added another print(z) command. Is 0 the correct return value?

    --------

    One thing I noticed is that both your files and the files created with "New script" import from adrv9010_dll, while UG-1727, page 265, imports from adrv9025_dll. I tried importing from adrv9025_dll, but it cannot be found. Am I using an obsolete version? I'm using version 6.3.0.5 specifically.
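
    In case it helps anyone hitting the same mismatch between the UG and this TES build, a defensive import along these lines keeps a script loading against either module name (a sketch only; it assumes the two DLLs expose a compatible Types namespace, which may not hold across versions):

    # Try the module name from UG-1727 first, then fall back to the one this
    # TES build (6.3.0.5) actually ships; enum prefixes may still differ.
    try:
        from adrv9025_dll import Types
    except ImportError:
        from adrv9010_dll import Types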

    Also, I added another 2 lines:

    # PllFrequencyGet appears to return a (status, frequency_Hz) tuple, so [1] is the LO frequency
    lo2 = link.platform.board.Adrv9010Device.RadioCtrl.PllFrequencyGet(Types.adi_adrv9010_PllName_e.ADI_ADRV9010_LO2_PLL, 0)
    print "LO2 set to :" + str(lo2[1])

    which show the frequency to be 3.5 GHz, as I set it.

    I tried playing around with the PLL loop filter bandwidth, from 50 kHz to 200 kHz, and it changes the generated tone a bit, as in the following figure (lowest to highest: 50 kHz to 200 kHz; see the sweep sketch at the end of this reply):

    I can see some improvement in the tone; however, it's still not good enough for the single-carrier QPSK test.

    I have also included my clock source signal, which shows around a 60 dB difference between the fundamental and the first-order harmonic, which seems good enough in my opinion.
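
    For completeness, the loop-bandwidth sweep referred to above could be scripted roughly as in the sketch below. PllLoopFilterGet/PllLoopFilterSet and the config field name are assumptions inferred from the adi_adrv9010_PllLoopFilterCfg_t object printed earlier, so treat this purely as a sketch:

    # Sketch of a loop-filter bandwidth sweep (method and field names ASSUMED;
    # 'link' and 'Types' come from the script template as before).
    radio = link.platform.board.Adrv9010Device.RadioCtrl
    lo2 = Types.adi_adrv9010_PllName_e.ADI_ADRV9010_LO2_PLL
    for bw_khz in [50, 100, 150, 200]:
        cfg = radio.PllLoopFilterGet(lo2, 0)[1]    # read back the current config
        cfg.loopBandwidth_kHz = bw_khz             # field name assumed
        radio.PllLoopFilterSet(lo2, cfg)           # re-apply with the new bandwidth
        print "Loop BW set to " + str(bw_khz) + " kHz, re-measure the tone"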

  • I guess you are using LO2; can you check with LO1 once?

    Is this issue observed on all the channels? Can you test with a different UC?

  • The same result with LO1, and the same result across all channels. What do you mean by a different UC?

  • After replacing the ADP5056 power-supply IC, the problem vanished.