
Very noisy signal at TX output

Hi AD support,

I'm trying to run some preliminary DPD tests with the ADS9-V2EBZ + EVAL-ADRV9029-HB on TES-GUI-6.3.0.5. I'm using profile 50 with link sharing and a Tx input rate of 122.88 MHz, hence 1,228,800 samples for a 10 ms input signal. However, when I connect my spectrum analyzer directly to the Tx output, it shows a very noisy constellation:

This is my original LTE signal:


And this is the signal shown in the TES GUI, which seems to be clean with no clipping:

After some initial debugging, I found that the IQ offset shown in the VSA is very high, which could be related to carrier feedthrough. Therefore, I suspect that I'm missing some configuration or RF chain calibration.

My signal can be found attached: modulated_readback_Tx1.txt

My VSA setup can be found attached: LTE_DL_20MHz_DEMOD_ConstellationView3.zip
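For reference, a minimal NumPy sketch of the sanity check I can run on the baseband file itself (it assumes modulated_readback_Tx1.txt holds two columns of I and Q per line; adjust the loader and scaling to the actual format):

```python
# Hedged sketch: sanity-check the baseband file that feeds the Tx.
import numpy as np

fs = 122.88e6                      # Tx input rate from the profile
data = np.loadtxt("modulated_readback_Tx1.txt")
x = data[:, 0] + 1j * data[:, 1]   # complex baseband samples

# 10 ms at 122.88 MSPS should give 1,228,800 samples
print("samples:", len(x), "expected:", int(fs * 10e-3))

# DC content of the baseband relative to its RMS power -- a rough proxy for
# the IQ-offset / carrier-feedthrough number the VSA reports (the actual LO
# leakage also depends on the chip's LOL calibration, not only on the file).
dc = np.mean(x)
rms = np.sqrt(np.mean(np.abs(x) ** 2))
print("baseband DC offset: %.1f dBc" % (20 * np.log10(abs(dc) / rms + 1e-20)))
```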



  • Is the EVM plot you show above measured at the Tx output of the chip or at the PA output? At what output power are you measuring the EVM? If you are measuring at the PA output, how much ACLR are you getting after DPD?

    If the ACLR is good, try to test at 3dB backoff power and check the EVM.

    Also, please share the DPD statistics with us for further analysis.

    Can you measure the EVM and ACLR at the Tx output of the chip as well, if not done already?

    From the calibration perspective, can you check whether you have enabled the LO leakage init calibration? Can you also check the LO leakage performance by sending a tone at a 10 MHz offset?
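    A minimal sketch of generating such an offset-tone file for playback, assuming the 122.88 MSPS profile rate and a two-column I/Q text format (adapt to whatever file layout TES expects):

```python
# Hedged sketch: single complex tone at +10 MHz baseband offset, 10 ms long,
# so LO leakage at the carrier is visible separately from the tone.
import numpy as np

fs = 122.88e6
f_tone = 10e6
n = int(fs * 10e-3)                            # 1,228,800 samples for 10 ms
t = np.arange(n) / fs
tone = 0.5 * np.exp(2j * np.pi * f_tone * t)   # backed off from full scale

np.savetxt("tone_10MHz.txt", np.column_stack([tone.real, tone.imag]))
```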

  • Hi Ramarao,

    I'm measuring the output right at the ADRV9029 Tx1 port, with no PA or DPD involved. The channel power is -10 dBm and the PAPR is 11.3 dB. The ACPR is mediocre, at -44 dBc, as shown in the images.

    In my setup, I only use Tx1. Does that affect the calibration? As far as I remember, the LO leakage calibration is enabled by default; I will check again to make sure.

  •  I’m measuring output right at ADRV9029 Tx1 port, not involving any PA or DPD. The channel power is -35 dBm, PAPR 11.3 dB. ACPR is mediocre, at -44dBC as in the images.

    Can you check the ACPR and EVM at around -10dBm or so?

    We can get an ACPR like the one above at the specified maximum output power. Can you check with a different waveform? For the LO leakage tracking cal, Tx should be fed back to ORx. To rule out the LO leakage dependency, send the signal at some offset from the LO, so that the LO leakage does not fall inside the signal, and check the EVM.

  • My bad, the Tx signal is -10 dBm in power and 11.3 dB in PAPR.

    Currently, I'm testing a 3.5 GHz signal, so I set the LO2 PLL to 3.5 GHz and the NCO mixer to 0. Did you mean that I should change the LO to another frequency and set the mixer so that the output lands at 3.5 GHz?

  • Please input a signal waveform that has a frequency shift applied in baseband itself; there is no need to change the NCO mixer.
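    As a rough sketch of that baseband shift, assuming the waveform is a two-column I/Q text file at 122.88 MSPS (the +10 MHz shift value is only an example; keep the shifted signal inside the profile's synthesis bandwidth):

```python
# Hedged sketch: shift the existing LTE baseband away from the LO by
# multiplying with a complex exponential, then save it back for playback.
import numpy as np

fs = 122.88e6
shift = 10e6
data = np.loadtxt("modulated_readback_Tx1.txt")   # two-column I Q assumed
x = data[:, 0] + 1j * data[:, 1]

t = np.arange(len(x)) / fs
x_shifted = x * np.exp(2j * np.pi * shift * t)

np.savetxt("lte_shifted_10MHz.txt",
           np.column_stack([x_shifted.real, x_shifted.imag]))
```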

  • Hi Ramarao,

    This is a two-tone test at ±10 MHz. It seems OK: the tone positions are correct, and there is only a small component in the middle.

     

    Also, I tried the 20 MHz signal from LTE waveform files for AD9371/5 (122.88MSPS) - Documents - Design Support AD9371/AD9375 - EngineerZone (analog.com). Still no difference from the previous results in terms of EVM and constellation shape. However, as I checked again, the ACPR is actually good. The VSA may not show the full range, but to my eye it should be around 60 dBc with averaging.
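    For a cross-check of that ACPR reading, a hedged sketch that integrates a Welch PSD of an IQ capture over the main and adjacent 20 MHz LTE channels (the capture file name, the 18 MHz integration bandwidth, and the ±20 MHz offsets are assumptions to adapt to the actual measurement):

```python
# Hedged sketch: ACPR estimate from a captured IQ recording.
import numpy as np
from scipy.signal import welch

fs = 122.88e6
data = np.loadtxt("tx_capture.txt")          # hypothetical IQ export
x = data[:, 0] + 1j * data[:, 1]

f, pxx = welch(x, fs=fs, nperseg=8192, return_onesided=False)

def band_power(f_lo, f_hi):
    m = (f >= f_lo) & (f <= f_hi)
    return np.sum(pxx[m])

main = band_power(-9e6, 9e6)                   # occupied BW of the carrier
adj_hi = band_power(20e6 - 9e6, 20e6 + 9e6)    # upper adjacent channel
adj_lo = band_power(-20e6 - 9e6, -20e6 + 9e6)  # lower adjacent channel
print("ACPR upper: %.1f dBc" % (10 * np.log10(adj_hi / main)))
print("ACPR lower: %.1f dBc" % (10 * np.log10(adj_lo / main)))
```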

  • In the EVM plot, I can observe an OV1 error, which is an overload indication. Can you increase the attenuation of the spectrum analyzer and check?

    Also, the frequency error is on the order of 4 kHz, which seems incorrect.

    As you are measuring DL, you don't need any sync for the spectrum analyzer either.

    How did you measure the EVM of your original signal and at what point?

    I hope the reference clock and LO are locked. Please share more details of your test setup, such as which use case (UC), which waveform, etc.

  • The OV1 shows up while I play with the attenuation; normally it is not there.

    From my original post, there are 2 VSA captures:
    1. The noisy version is from VSA live capture from ADRV9029 TX port.
    2. The clean version is from VSA "read recording", captured back from ADRV9029 GUI using the "Save" button on "Tx Tab".

    It seems to me that my signal is correct, but the signal after the DAC is noisy. I followed the procedure very carefully, so I think it could be due to the clock setup. Just to keep you updated, I set the clock to a 122.88 MHz, 7 dBm sine wave, using two different signal generators, but so far I get the same results.

    Also, where is the LO lock indicator on the board or in the GUI? I want to double-check, as I am not sure whether it is locked.

    I'm using the signal provided at LTE waveform files for AD9371/5 (122.88MSPS) - Documents - Design Support AD9371/AD9375 - EngineerZone (analog.com) for testing. For my own signal, I'm using SystemVue to generate 64QAM, 100 RB, 15 kHz sub-carrier spacing, a 2048-point FFT, and 4× oversampling to match the 122.88 MHz sampling rate (a quick check of these numbers is sketched at the end of this post).

    Also, is it possible to feed my own clock into EXT_LO1 and EXT_LO2 to replace the internal LO? From the datasheet: "Differential External LO Input/Output 1. If used for the external LO input, the input frequency must be 2× the desired carrier frequency. Do not connect if unused. External LO functionality not currently supported."
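    A small sanity check of the SystemVue parameters mentioned above and of the file's PAPR (two-column I/Q text assumed):

```python
# Hedged sketch: verify that the OFDM parameters give the profile rate and
# that the file's PAPR matches the ~11.3 dB reported earlier.
import numpy as np

fft_size, scs, osr = 2048, 15e3, 4
fs = fft_size * scs * osr
print("sample rate: %.2f MHz" % (fs / 1e6))      # 2048 x 15 kHz x 4 = 122.88 MHz

data = np.loadtxt("modulated_readback_Tx1.txt")
x = data[:, 0] + 1j * data[:, 1]
p = np.abs(x) ** 2
print("PAPR: %.1f dB" % (10 * np.log10(p.max() / p.mean())))
```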

  • Please check the PLL lock status by following the procedure below:

     Monitoring PLL lock status on ADRV9026 

    Can you test with Rx to Tx loopback using the attached script?

    Provide the required signal from the signal generator at the Rx input (-30 dBm), check the Tx output on the spectrum analyzer, and use IQ swap while measuring the EVM.

    1423.ADRV9025_LoopRxDataToTx_Enableset.zip

    Ext. LO option is not supported.

    Can you check at a lower output power, maybe at 6 dB backoff from what you are measuring right now?

    Can you enable all Init cals and Tracking cals (TX QEC and LO Leakage) and check once if not done already?

  • Can you check at a lower output power, maybe at 6 dB backoff from what you are measuring right now?

    I tried 10 dB of additional backoff, and I still get the same result. According to the tracking on the Tx tab, it's clear that I'm not saturating the DAC.

    Can you enable all Init cals and Tracking cals (TX QEC and LO Leakage) and check once if not done already?

    I enabled all of them for every measurement, except "External Path Delay", "Rx Gain Delay/Phase" and "Tx Attenuation Delay/Table".

    Can you test with Rx to Tx loopback using the attached script?

    It's not clear to me how to run this script. Is this procedure correct: connect the signal generator to Rx1, connect the spectrum analyzer to Tx1, enable the signal generator, then run the Python script in the TES IronPython tab?

    Please check the PLL lock status by following the below procedure

    I read several posts and answers, but it is still not clear to me how to run the C API to check the PLL. Can you please link a tutorial, or is there a version that can be run from TES IronPython?

    Meanwhile, I rechecked the tone test from the Tx output and compared it with one from my clock generator. This is the tone from the clock generator, which shows a 60 dB difference to the secondary peaks within a 2 kHz span:

    This is the signal from the Tx port, which shows only 30 dB to the secondary peaks:

    Is that a normal value? I don't see it in the datasheet. I also added a measurement with a 2 MHz span.
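    A hedged sketch of how the same carrier-to-second-peak number can be read from an IQ capture of the tone, if one is available (tone_capture.txt is a hypothetical file name; adjust the format and guard width as needed):

```python
# Hedged sketch: level of the largest non-carrier component relative to the
# carrier, from a windowed FFT of a captured tone.
import numpy as np

data = np.loadtxt("tone_capture.txt")        # hypothetical IQ capture
x = data[:, 0] + 1j * data[:, 1]

win = np.hanning(len(x))
spec = 20 * np.log10(np.abs(np.fft.fftshift(np.fft.fft(x * win))) + 1e-20)

k = np.argmax(spec)                          # carrier bin
guard = 50                                   # bins excluded around the carrier
mask = np.ones_like(spec, dtype=bool)
mask[max(0, k - guard):k + guard] = False
print("carrier to largest spur: %.1f dB" % (spec[k] - spec[mask].max()))
```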

  • The reference clock does not seem to be clean enough; spurs are observed there and, consequently, at the Tx output. The requirements for the reference clock are explained in the UG, page 73.

    I believe you are providing the reference clock from an RF signal generator; if not, can you try that?

    Are you testing on the HB variant of the board? The Tx output power seems low. Are you using any external attenuator or the internal Tx attenuation of the chip?

    Page 264 of the UG explains how to run the Python scripts: go to the IronPython tab in the GUI, load the script, and run it.

  • Yes, the clock is from an RF signal generator. I will look up the requirements.

    I'm using the HB variant, with no attenuator except -10 dB of attenuation while generating the tone test.

    I understand how to run Python code, but I have not understood how to run the C API code.

  • Are you using your custom board or the EVB? If you are using the EVB, you can directly run the script that I shared earlier by following the UG.

    For the C API, you need to develop the code yourself and test with it.

  • Hi Ramarao,

    Today I conducted 3 tests on this problem:

    1. Tone test on the Tx port. Before that, I changed my clock source to a Keysight N5182B and checked its phase noise with 500 Hz and 2 MHz spans, which show at least an 80 dB difference:

    However, the tone test from the Tx port is worse. I checked the tone with 500 Hz and 2 MHz spans, which show a difference of only 35 dB:

    2. QPSK single-carrier test on the Tx port. Instead of an OFDM signal, I tried a simpler single-carrier test. The constellation of my generated signal is very clean:

    However, the signal from the Tx port is still noisy, spread in a circular pattern:

    3. QPSK single-carrier test on the Rx port, looped back to Tx. While trying the Rx-to-Tx loopback test as you suggested, the same problem persists:

    From all these tests and observations, it seems to me that there is phase noise in my testbed. However, as far as I can tell, it is not coming from my clock source, so I conclude that it comes from either the PLL or the LO. I didn't try the C API to check the PLL locks, as I don't understand how to; the UG is not very clear about it.

    What do you think I should check next? Should I start the board replacement process?