I have a custom PCB with an AD9266 ADC used in pin-configuration mode (CSB tied high, disabling the SPI controls). The ADC data output pins are connected directly to Altera FPGA input pins. Both share the same 3.3 V supply (the ADC digital/driver side), while the ADC analog supply is a separate 1.8 V rail. The supplies are clean, with no spikes. The clock input to the ADC is a 1.8 V, 10 MHz clock with 50% duty cycle.
The input signal to the ADC comes from a differential amplifier (buffer), and for this test I'm driving it with a sine wave of a few hundred kHz from a signal generator. The analog input at the ADC is a clean, noise-free differential sine wave. The problem is glitches on the ADC data output pins, clearly visible on D15 and D14 while the sine wave is being converted. As you can see in the attached picture, the sudden drop to a low level near the top of the sine wave is not supposed to be there. The DCO output is glitch-free; only the data pins exhibit this behaviour. As a result, the sine wave reconstructed in the FPGA has random "spikes" to 0000h or FFFFh (or close to those limits, since the MSBs glitch to "0" or "1"). The timing of these spikes is not deterministic (not correlated with any clock).
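For reference, this is roughly how I flag the corrupted samples when post-processing captured data on the PC side (just a sketch; the sample values and jump threshold below are illustrative, not a real capture):

```python
# Sketch: flag isolated spikes in captured 16-bit ADC samples.
# The capture data and the threshold are illustrative values only.

def find_spikes(samples, threshold=8000):
    """Return indices of samples that jump away from BOTH neighbours
    by more than `threshold` codes -- i.e. isolated single-sample glitches,
    as opposed to the smooth code-to-code steps of a few-hundred-kHz sine."""
    spikes = []
    for i in range(1, len(samples) - 1):
        prev_jump = abs(samples[i] - samples[i - 1])
        next_jump = abs(samples[i] - samples[i + 1])
        if prev_jump > threshold and next_jump > threshold:
            spikes.append(i)
    return spikes

# Illustrative capture: a slow ramp with one glitch to 0000h and one to FFFFh
capture = [0x7F00, 0x7F80, 0x0000, 0x8080, 0x8100, 0xFFFF, 0x8200]
print(find_spikes(capture))  # -> [2, 5]
```

Analyzed this way, the glitched samples always land near full-scale or zero, which matches the MSBs flipping rather than random low-order bit noise.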
I'm not using a buffer between the ADC and the FPGA (the trace lengths between the ICs are 40-50 mm at most, so I didn't think one was needed). Series termination resistors don't help at all, which suggests this is not a signal-reflection problem. Note: I did not provide an on-board option for SPI communication; the datasheet suggests it isn't necessary when pin-configuration mode is sufficient for the application.
Any help is appreciated!