With the ADC analog inputs shorted to AC ground, I see a low-level spur at 0 Hz (about -120 dBFS) in the FFT of the baseband I/Q at the DDC output.
I suspect the spur comes either from the NCO leaking back into the ADC analog input, or from some I/Q imbalance caused by bit truncation in the NCO's cos/sin lookup table or in the NCO's digital multiplier.
Setup:
- Clock in: 800 MHz; input clock divider: 2; ADC core sample rate: 400 MHz
- Analog input: 325 MHz; DDC0 in use; NCO frequency: 75 MHz
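For reference, here's the frequency plan as I understand it (quick sketch; `alias_freq` is just a helper I wrote for this post, not part of any device API):

```python
def alias_freq(f_in, fs):
    """Fold an input frequency into the first Nyquist zone [0, fs/2]."""
    f = f_in % fs
    return fs - f if f > fs / 2 else f

fs = 400e6      # ADC core sample rate
f_in = 325e6    # analog input, second Nyquist zone
f_nco = 75e6    # DDC0 NCO frequency

f_alias = alias_freq(f_in, fs)  # 325 MHz folds to 75 MHz
f_bb = f_alias - f_nco          # 0 Hz: the carrier tunes to DC,
                                # right where the spur sits
```

So the intended carrier lands exactly on top of the 0 Hz spur, which is why I care about it even at -120 dBFS.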
Changing the input clock divider phase offset to 0.5 or 1.5 cycles reduces the spur on some devices but not on others.
Does anyone know why the input clock divider phase offset would affect the spur level? Is there a defined phase relationship between the NCO clock and the ADC sample clock?
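For what it's worth, a quick numpy sanity check of the leakage hypothesis (all numbers are my assumptions, not from any datasheet): a small tone at the NCO frequency appearing on the ADC input lands exactly at DC after the complex mix, which is consistent with what I'm measuring:

```python
import numpy as np

fs, f_nco, N = 400e6, 75e6, 4096
n = np.arange(N)

# hypothetical -120 dBFS leakage tone at the NCO frequency on the ADC input
leak = 10 ** (-120 / 20) * np.cos(2 * np.pi * f_nco / fs * n)

# ideal complex NCO and mix-down (a real DDC quantizes the cos/sin values)
bb = leak * np.exp(-2j * np.pi * f_nco / fs * n)

win = np.hanning(N)
spec = np.fft.fft(bb * win) / win.sum()  # normalize: full-scale DC term = 0 dB
dc_dbfs = 20 * np.log10(np.abs(spec[0]))
# cos = (e^{+j} + e^{-j})/2, so half the tone mixes to DC at about -126 dBFS;
# the other half lands at -2*f_nco, far from DC
```

This doesn't explain the clock divider phase dependence, though, which is why I'm asking about the NCO clock vs. sample clock phase relationship.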