We are bringing up our bare metal implementation on a ZC706 with the ADRV9371, using the DDS to transmit test tones that we view on a signal analyzer. Our LO leakage is much worse than when RadioVerse is used to transmit the same test tones.
In our bare metal implementation we get the least LO leakage when the TX_LO_LEAKAGE_INTERNAL calibration is not enabled, which seems backwards.
uint32_t initCalMask = 0
    // | TX_LO_LEAKAGE_INTERNAL
    // | TX_LO_LEAKAGE_EXTERNAL
    ;

uint32_t trackingCalMask = 0
    // | TRACK_TX1_LOL
    ;
With TX_LO_LEAKAGE_INTERNAL enabled:
With TX_LO_LEAKAGE_INTERNAL disabled:
So I checked in RadioVerse (a simple 3 MHz tone at a 2500 MHz LO frequency) what the output looks like with different calibrations enabled/disabled, and I found that when the Tx1 LOL tracking calibration is enabled, the LO leakage increases dramatically.
Here is what the output looks like with the Tx1 LOL tracking calibration unchecked:
Is there any reason for this? Our bare metal implementation and the RadioVerse output do not seem to agree on the optimal calibration settings. It looks like we are not getting the performance the part is capable of in our bare metal build, so we are still missing something in our recipe.
Thanks in advance