I am capturing data from the receive channel of FMCOMMS2, performing an FFT, sending the result over GbE, and finally plotting it on a PC. I set rx_lo_freq and input a pure sinusoidal signal from a signal generator. The figures below show example outputs where rx_lo_freq is set to 2000 MHz and sinusoids of different frequencies are input. (Here the signal is the leakage from the TX LO into the RX path, which is why the spectrum in these figures is so spurious. I tried it with a proper signal generator and the problem I am describing in this post still exists.)
As you can see, in the first and last images the peak is sharp, whereas in the second and third images the peak is spread, as if the signal had phase noise. When I increase the input sinusoid frequency in steps of 1 MHz, I observe that every third signal gives a good sharp peak (2002, 2005, 2008 MHz, and so on).
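For what it's worth, a spread peak like this can also be produced by plain FFT spectral leakage whenever the tone does not land exactly on a bin center, with no phase noise involved at all. A quick NumPy sketch (the sample rate and FFT length here are made-up illustrative values, not your actual settings) shows the effect:

```python
import numpy as np

fs = 30.72e6          # assumed sample rate (illustrative only)
n = 4096              # FFT length
t = np.arange(n) / fs

def peak_spread(f_tone):
    """Fraction of total power falling in the strongest FFT bin."""
    x = np.exp(2j * np.pi * f_tone * t)   # ideal complex tone, no noise
    spec = np.abs(np.fft.fft(x)) ** 2
    return spec.max() / spec.sum()

f_on_bin  = 100 * fs / n      # lands exactly on bin 100
f_off_bin = 100.5 * fs / n    # lands halfway between two bins

print(peak_spread(f_on_bin))   # ~1.0: all power in one bin, sharp peak
print(peak_spread(f_off_bin))  # ~0.4: power leaks into neighbouring bins
```

If your good/bad pattern repeats with the tone frequency in a fixed cycle, it may be worth checking whether the "sharp" frequencies are exactly the ones that fall on bin centers for your FFT length and sample rate.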
First question: do you have any idea why this happens?
Second question. My suspicion is that the ADC interface calibration is not good (based on my experience with a similar system). By ADC calibration I mean aligning the data coming from the ADC to the clock supplied by the ADC, using the IDELAY primitives. How can I set the IDELAY tap values? I hope they are programmatically accessible.
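As a sketch of what "programmatically accessible" could look like: on Xilinx 7-series parts each IDELAYE2 has a 5-bit tap value (0 to 31) that can be reloaded at run time, and ADI's axi_ad9361 HDL core exposes the receive-lane delays through AXI-mapped registers. The base address and the register offset below are hypothetical placeholders, not the real map; check the register documentation of the HDL core version you built.

```python
import mmap
import os
import struct

IDELAY_TAPS = 32  # IDELAYE2 provides 32 delay taps (5-bit tap value)

def delay_reg_word(tap):
    """Validate a tap value and return the 32-bit word to write.
    (A real core may pack extra control bits; this is a bare sketch.)"""
    if not 0 <= tap < IDELAY_TAPS:
        raise ValueError("IDELAYE2 tap must be in 0..31")
    return tap & 0x1F

def set_rx_delay(base_addr, lane, tap, page=4096):
    """Write one RX lane's delay tap via /dev/mem.
    base_addr and the 0x200 + 4*lane offset are HYPOTHETICAL --
    look up the actual delay-control offsets of your axi_ad9361 build."""
    offset = 0x200 + 4 * lane  # placeholder offset, not the real map
    fd = os.open("/dev/mem", os.O_RDWR | os.O_SYNC)
    try:
        m = mmap.mmap(fd, page, offset=base_addr)
        m[offset:offset + 4] = struct.pack("<I", delay_reg_word(tap))
        m.close()
    finally:
        os.close(fd)
```

In practice the usual approach is to sweep the tap value while receiving a known pattern (e.g. the AD9361's built-in PRBS/BIST mode), record which taps give error-free capture, and then park the delay in the centre of that working window.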