
FPGA <=> DAC AD9122 digital interface timing validation (VC707+FMCOMMS1)

Question asked by aismekalov on Mar 6, 2015
Latest reply on Mar 25, 2015 by aismekalov

Hello,

 

I am working with the Xilinx VC707 and Analog Devices FMCOMMS1-EBZ boards.

I am having trouble understanding the digital interface timing validation for the FPGA <=> TX DAC AD9122 link (LVDS DDR, 491.52 MHz, 16-bit interface).

http://wiki.analog.com/resources/eval/user-guides/ad-fmcomms1-ebz/interface_timing_validation

 

According to the DAC AD9122 datasheet, the digital interface timing can be verified using the sample error detection (SED) circuitry.

The SED scheme has two modes:

1) Normal (latched) mode: any error is latched into the flags (Reg 0x67 "SED control" bit 5 "Sample error detected", Registers 0x70-0x73 "SED I/Q LSBs/MSBs"), and the flags remain set until manually cleared.

2) Autoclear mode (AED, auto sample error detection): the flags (Reg 0x67 "SED control" bits 5, 1, 0 "Sample error detected", "Compare fail", "Compare pass", Registers 0x70-0x73 "SED I/Q LSBs/MSBs") are automatically cleared by the reception of 8 consecutive error-free comparisons.
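
For reference, here is a minimal sketch of how I set up mode 1 in my own code. ad9122_spi_write()/ad9122_spi_read() are my own SPI helpers; the compare-enable and autoclear bit positions, the compare pattern register addresses, and the way I clear the latched flags are my assumptions from the register map, so please correct me if any of them are wrong:

    #include <stdbool.h>
    #include <stdint.h>

    /* Register/bit names follow my reading of the AD9122 register map.
     * Bits 5/1/0 are as described above; bit 7 (compare enable) and
     * bit 3 (autoclear enable) are my assumption. */
    #define AD9122_REG_SED_CTRL        0x67
    #define AD9122_SED_COMPARE_EN      (1 << 7)
    #define AD9122_SED_SAMPLE_ERR      (1 << 5)
    #define AD9122_SED_AUTOCLEAR_EN    (1 << 3)
    #define AD9122_SED_COMPARE_FAIL    (1 << 1)
    #define AD9122_SED_COMPARE_PASS    (1 << 0)

    #define AD9122_REG_COMPARE_I0_LSBS 0x68  /* 0x68..0x6F: expected I0/Q0/I1/Q1 pattern (my assumption) */
    #define AD9122_REG_SED_I_LSBS      0x70  /* 0x70..0x73: per-bit error flags */

    /* My own SPI helpers, implemented elsewhere. */
    extern void    ad9122_spi_write(uint8_t reg, uint8_t val);
    extern uint8_t ad9122_spi_read(uint8_t reg);

    /* Mode 1: latched compare, no autoclear. */
    static void sed_setup_latched(const uint8_t pattern[8])
    {
        int i;

        /* Load the expected I0/Q0/I1/Q1 sample pair; the FPGA transmits
         * the same pattern continuously. */
        for (i = 0; i < 8; i++)
            ad9122_spi_write(AD9122_REG_COMPARE_I0_LSBS + i, pattern[i]);

        /* My assumption: briefly disabling and re-enabling the compare
         * clears any previously latched flags and the 0x70..0x73 bits. */
        ad9122_spi_write(AD9122_REG_SED_CTRL, 0x00);
        ad9122_spi_write(AD9122_REG_SED_CTRL, AD9122_SED_COMPARE_EN);
    }

    /* Mode 2 would simply add AD9122_SED_AUTOCLEAR_EN to the last write. */

    static bool sed_has_errors(void)
    {
        uint8_t ctrl = ad9122_spi_read(AD9122_REG_SED_CTRL);
        uint8_t bit_errors = 0;
        int i;

        /* In latched mode the per-bit error flags accumulate in 0x70..0x73. */
        for (i = 0; i < 4; i++)
            bit_errors |= ad9122_spi_read(AD9122_REG_SED_I_LSBS + i);

        return (ctrl & (AD9122_SED_SAMPLE_ERR | AD9122_SED_COMPARE_FAIL)) ||
               bit_errors;
    }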

 

It seems to me that the first mode (latched, without autoclear) is preferable for interface timing validation if we want a digital interface with no errors at all, so I use the first mode of the SED circuitry in my C function for DCI delay calibration.

The interface calibration reports NO errors for any DCI delay value, regardless of how long each data pattern is tested (I tried testing times of several minutes).
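
Roughly, my calibration sweep is structured like the sketch below (it reuses the helpers sketched above; register 0x16 is where I believe the 2-bit DCI delay code lives, and delay_ms() is just my own wait helper):

    #include <stdint.h>
    #include <stdio.h>

    extern void delay_ms(unsigned ms);  /* my own wait helper */

    /* Sweep all four DCI delay codes while the FPGA sends the SED test
     * pattern and report which codes latch errors. */
    static void dci_delay_sweep(const uint8_t pattern[8], unsigned dwell_ms)
    {
        unsigned dci;

        for (dci = 0; dci < 4; dci++) {
            ad9122_spi_write(0x16, dci);  /* "DCI delay" register, my assumption */
            sed_setup_latched(pattern);
            delay_ms(dwell_ms);           /* I let each code run up to several minutes */
            printf("DCI delay %u: %s\n", dci,
                   sed_has_errors() ? "ERRORS latched" : "no errors");
        }
    }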

But when I turn off the SED calibration and drive the DAC AD9122 with my own signal (QPSK, 15.36 MHz bandwidth), I see on the spectrum analyzer (after the up-mixer to a 350 MHz IF) that the signal is corrupted at DCI delay code 3 (sweep time ~500 ms, so there are many errors) and is not corrupted at DCI delay codes 0, 1, 2. I checked the warning flags and the FIFO level: there are no warnings and the level is OK for every DCI delay code.

dci_calibr.png

 

So I have questions:

1) Why does the SED scheme not detect errors at any DCI delay code, even though the data is corrupted at DCI delay code 3 in normal operation? Where might the signal be getting corrupted at DCI delay code 3?

 

2) Why does the DCI delay calibration function int32_t ad9122_tune_dci(struct cf_axi_converter *conv) in the reference project use the second SED mode (with flag autoclear)? I think it is not suitable for this purpose:

    ad9122_write(AD9122_REG_SED_CTRL,
                 AD9122_SED_CTRL_SED_COMPARE_EN |
                 AD9122_SED_CTRL_AUTOCLEAR_EN);      /* <---------- autoclear enabled */
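
For a strict "no errors allowed" check I would have expected the latched variant instead, i.e. the same write without the autoclear bit, something like:

    ad9122_write(AD9122_REG_SED_CTRL,
                 AD9122_SED_CTRL_SED_COMPARE_EN);    /* latched: errors stay set until cleared */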


3) How does the SED circuitry work?

    Could you give a block diagram and any details?

    On which clock (DCI or DACCLK) does the SED run?

 

Thank you for any help.
