Dear all (especially danf),
We're working on an AWG board consisting of two AD9129 DACs and a Xilinx Kintex FPGA. We developed the board ourselves. The board is working quite well, but now I'm facing an issue and I'm not sure what to do about it:
The DAC output is supposed to generate RF pulses (for radar applications): a trigger signal is generated by the FPGA, and at the same time data is transferred from the FPGA to the DAC to generate an RF sweep with a duration ranging from a few us to several ms.
The delta time between trigger and signal output at the DAC must be constant, but each time I re-configure the DAC (with the exact values from the datasheet, pg. 54) the time difference randomly jumps back and forth by one DCO/DCI cycle: sometimes the delta from trigger to DAC output start is 60 ns, sometimes it's 62.13 ns. The difference of 2.13 ns is exactly one DCI period (the DAC is clocked at 1875 MHz => DCI = 1875/4 = 468.75 MHz).
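As a quick sanity check, the 2.13 ns figure follows directly from the clock numbers above (nothing here beyond the rates already stated):

```python
# Sanity check: one DCI period at the stated clock rates.
dac_clk_hz = 1.875e9           # DAC sample clock: 1875 MHz
dci_hz = dac_clk_hz / 4        # DCI = DACCLK/4 = 468.75 MHz
dci_period_ns = 1e9 / dci_hz   # one DCI cycle in ns

print(f"DCI period: {dci_period_ns:.2f} ns")                    # 2.13 ns
print(f"60 ns + one DCI cycle = {60 + dci_period_ns:.2f} ns")   # 62.13 ns
```

So the two observed latencies differ by exactly one DCI cycle, which points at a one-cycle ambiguity somewhere in the DCI/DCO capture path rather than analog drift.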
Now I'm not sure how to find out where the problem lies: is it in the FPGA logic, or must I tune something within the DAC? Here is what I have so far:
1) Data and trigger are generated in a state machine. DATA and DCI are driven to the DAC via OSERDES cores (two OSERDES for ports A and B and one OSERDES for the DCI clock). The DCO from the DAC to the FPGA is used to clock the OSERDES elements. I got this idea from a Xilinx app note that suggests doing it this way.
2) The OSERDES blocks are specified by Xilinx to have the same timing characteristics (delay to output pins) when used in parallel. Trace lengths on the PCB from FPGA to DAC are also matched to exactly the same length. Referring to figure 131 on pg. 40, this should meet the DAC spec, which demands that the DCI and DATA edges be aligned.
3) When I tune the DLL phase offset in register 0x0A, I can see that it has an impact on the behavior: 0x00 has a higher tendency to produce DCI-cycle jumps between trigger and data output than 0xCC, and negative DLL phase values work better than positive ones.
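To make this observation quantitative, I've been thinking of a host-side test loop roughly like the sketch below (just an outline, not working code: `reconfigure`, `write_reg`, and `measure_ns` are hypothetical callbacks standing in for the actual SPI interface and the scope/counter measurement):

```python
DCI_PERIOD_NS = 2.1333  # one DCI cycle at 1875 MHz / 4 = 468.75 MHz

def characterize_dll_phase(phase_codes, reconfigure, write_reg, measure_ns,
                           trials=50):
    """For each DLL phase offset code (register 0x0A), re-run the DAC
    start-up sequence repeatedly and record how often the trigger-to-output
    latency lands one DCI cycle late.

    reconfigure(), write_reg(addr, val) and measure_ns() are hypothetical
    callbacks wrapping the real SPI access and delay measurement."""
    results = {}
    for code in phase_codes:
        delays = []
        for _ in range(trials):
            reconfigure()                 # full DAC start-up sequence
            write_reg(0x0A, code)         # DLL phase offset under test
            delays.append(measure_ns())   # trigger -> RF output start
        baseline = min(delays)
        # A "jump" = latency more than half a DCI cycle above the baseline
        jump_rate = sum(d - baseline > DCI_PERIOD_NS / 2
                        for d in delays) / trials
        results[code] = jump_rate
    return results
```

A jump-rate-vs-phase-code plot from something like this would at least show whether the 0x0A setting moves the problem around or only changes its probability.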
Can anyone suggest a good way to investigate this problem further? Can the FIFOs in the DAC cause this behavior? Must I generate the DCI clock in a different way?
a) Can anyone please explain the functionality of the DLL in a little more detail? What are the exact differences between adjusting the DLL phase offset (register 0x0A) and adjusting the delay line middle set (register 0x0B)?
b) According to the start-up sequence settings, the minimum delay is set to dec 4 and the maximum delay is also set to dec 4, but the actual delay line middle set is set to 9? Somehow this doesn't make sense to me unless I'm missing something... which I most likely am.
Best Regards and thanks for any help!