I am using the DC1525A-A evaluation board (LTC2175-14) with a Xilinx SP605 FPGA board for direct digital phase noise measurement using the cross-correlation averaging method:
I feed a DUT oscillator into two channels via a power splitter, and two independent high-quality reference oscillators into the other two channels. I then measure the phase difference between the DUT and each REF, thereby cancelling the common clock jitter, and perform FFT, cross-correlation, and long-term averaging.
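For anyone unfamiliar with the method, here is a minimal numerical sketch (hypothetical signal levels, not my actual data) of why cross-spectrum averaging works: the DUT phase noise is common to both measurement channels, while each reference contributes independent noise, so the uncorrelated part averages down roughly as 1/sqrt(N):

```python
import numpy as np

rng = np.random.default_rng(0)
n_avg, n_fft = 200, 1024

# Accumulate the cross-spectrum of two channels that share the DUT
# phase noise but have independent reference-channel noise.
cross_acc = np.zeros(n_fft // 2 + 1, dtype=complex)
for _ in range(n_avg):
    dut = rng.normal(0.0, 1.0, n_fft)         # common DUT phase noise
    ch_a = dut + rng.normal(0.0, 3.0, n_fft)  # REF1 noise (independent)
    ch_b = dut + rng.normal(0.0, 3.0, n_fft)  # REF2 noise (independent)
    A = np.fft.rfft(ch_a)
    B = np.fft.rfft(ch_b)
    cross_acc += A * np.conj(B)
cross_acc /= n_avg

# The real part of the averaged cross-spectrum converges toward the
# DUT spectrum alone; here its per-bin power is about n_fft * var(dut).
dut_psd_est = np.real(cross_acc).mean()
```

With these toy numbers the per-channel noise is ten times the DUT power, yet the averaged cross-spectrum settles near the DUT level, which is exactly what makes the technique sensitive to tiny residual correlated effects like the bias I describe below.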
I originally used the 16-bit serialization mode but recently switched to 14-bit serialization in order to reduce the high-speed data rate.
My setup detects a very small systematic measurement bias, which until recently appeared as anti-correlation between the two independent measurements. Since switching to 14-bit mode, however, the bias has decreased in magnitude and reversed polarity!
I wonder whether the clock duty cycle stabilizer (DCS) could be the cause. Could you share more detail about how it works? Is it a digital scheme that operates at the DCO frequency? My DCO rate has changed by a factor of 7/8.
I drive ENC with a high-purity sine wave and have modified the DC1525A-A board as per the ADC datasheet to accept sine wave drive.
I have searched this forum for information about the DCS and found only two statements, both relating to other parts: one saying it adds jitter, and the other saying that a small pseudo-random charge placed on the sampling capacitor is not cancelled. Any pseudo-random process that is common (in phase or anti-phase) between two channels will be detected by cross-correlation averaging rather than averaged away.
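To make the last point concrete, here is a small sketch (again with made-up signal levels) showing that a pseudo-random term injected in phase into both channels survives cross-spectrum averaging, while purely independent noise averages toward zero:

```python
import numpy as np

rng = np.random.default_rng(1)
n_avg, n_fft = 500, 256

def avg_cross_power(common_scale):
    """Mean real cross-spectrum when a pseudo-random process of the
    given amplitude is shared (in phase) by both channels."""
    acc = np.zeros(n_fft // 2 + 1)
    for _ in range(n_avg):
        shared = common_scale * rng.normal(size=n_fft)  # common to both
        ch_a = shared + rng.normal(size=n_fft)          # independent noise
        ch_b = shared + rng.normal(size=n_fft)
        acc += np.real(np.fft.rfft(ch_a) * np.conj(np.fft.rfft(ch_b)))
    return acc.mean() / n_avg

floor = avg_cross_power(0.0)   # no shared term: averages toward zero
biased = avg_cross_power(0.5)  # shared term survives the averaging
```

An anti-phase common process would survive in the same way but with negative sign in the real cross-spectrum, which is consistent with the anti-correlated bias I observed before changing serialization modes.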