
AD9371 LO Leakage

Question asked by Exray on May 15, 2017
Latest reply on May 19, 2017 by sripad

We are bringing up our bare metal implementation with the ZC706 and ADRV9371, using the DDS to transmit test tones that we view on a signal analyzer.  Our LO leakage is much worse than when RadioVerse is used to transmit the same test tones.

 

In our bare metal implementation, we get the least amount of LO leakage when TX_LO_LEAKAGE_INTERNAL calibration is not enabled.  That seems wrong.

uint32_t initCalMask = 0 |
TX_BB_FILTER |
ADC_TUNER |
TIA_3DB_CORNER |
DC_OFFSET |
TX_ATTENUATION_DELAY |
RX_GAIN_DELAY |
FLASH_CAL |
PATH_DELAY |
LOOPBACK_RX_LO_DELAY |
LOOPBACK_RX_RX_QEC_INIT |
// TX_LO_LEAKAGE_INTERNAL |
// TX_LO_LEAKAGE_EXTERNAL |
TX_QEC_INIT |
RX_LO_DELAY |
RX_QEC_INIT;

uint32_t trackingCalMask = 0 |
TRACK_RX1_QEC |
// TRACK_TX1_LOL |
TRACK_TX1_QEC |
TRACK_ORX1_QEC |
TRACK_ORX2_QEC;

 

With TX_LO_LEAKAGE_INTERNAL enabled:

[screenshot: spectrum with elevated LO leakage]

With TX_LO_LEAKAGE_INTERNAL disabled:

[screenshot: spectrum with lower LO leakage]

So I checked in RadioVerse (simple 3 MHz tone at a 2500 MHz LO frequency) what the output looks like with different calibration settings enabled or disabled, and I found that when the Tx1 LOL tracking calibration is enabled, the LO leakage increases dramatically.

TRACK_TX1_LOL enabled results in poor LO leakage:

[screenshot]

Here is what the output looks like with the Tx1 LOL tracking calibration unchecked:

TRACK_TX1_LOL disabled results in good LO leakage:

[screenshot]

Is there any reason for this?  Our bare metal implementation and RadioVerse do not seem to agree on the optimal calibration settings. It looks like we're not getting the performance the part is capable of in our bare metal setup, so we're still missing something in our recipe.

 

Thanks in advance
