The output amplitude of my AD9913 changes over frequency (for the same DAC code the amplitude is not the same).
The REF_CLK is a 20MHz TCXO and is multiplied by 12, giving a 240MHz system clock.
The level seems very small; is that still correct?
Why is the amplitude changing, and what can I do about it? Is there a function describing it, so that I can calibrate it out myself?
All measured with DAC code 1023 (which corresponds to the highest output current).
I checked the transformer you are using; its optimum operating bandwidth is 0.3-200MHz. That led me to suspect the cut-off frequency of your filter. When you designed the filter, what cut-off frequency did you use? It might be too low, attenuating the higher frequencies.
Thank you for the answer. The cut-off frequency was indeed too low. I measured again without any filter, and these were the results:
-14dBm at 30MHz
-15dBm at 60MHz
-16.2dBm at 90MHz
This is a lot better, but the level still changes over frequency: the signal is attenuated as the output frequency increases. Is this caused by the DAC's sin(x)/x attenuation profile in the frequency domain?
And to my other question: -14dBm to -16dBm still seems very small. According to the DAC code of 1023, which corresponds to a DAC output current of 4.57mA and an amplitude of 228mV into 50Ω, there should be about -3dBm at the output. Am I right? If so, what could be the problem?
Given a 240MHz sample rate, the sinc roll-off from 30MHz to 90MHz is ~1.9dB, which agrees well with your measurements without the filter.
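The ~1.9dB figure follows directly from the DAC's sin(x)/x envelope at the 240MHz sample rate. A minimal sketch of that calculation (plain Python, using the frequencies reported in this thread):

```python
import math

def sinc_rolloff_db(f_out, f_s):
    """DAC sin(x)/x attenuation in dB at output frequency f_out, sample rate f_s."""
    x = math.pi * f_out / f_s
    return 20 * math.log10(math.sin(x) / x)

f_s = 240e6
for f in (30e6, 60e6, 90e6):
    print(f"{f/1e6:.0f} MHz: {sinc_rolloff_db(f, f_s):.2f} dB")
# 30 MHz: ≈ -0.22 dB, 60 MHz: ≈ -0.91 dB, 90 MHz: ≈ -2.11 dB

# Net roll-off from 30 MHz to 90 MHz: ≈ 1.9 dB
delta = sinc_rolloff_db(30e6, f_s) - sinc_rolloff_db(90e6, f_s)
```

Note that this models only the ideal zero-order-hold DAC response; board-level losses come on top of it.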
By the way, assuming my math is correct, your filter response (below) is not really a low-pass response. The purpose of the filter is to suppress Nyquist images by constraining the pass band to ~40% of the sample rate (~96MHz in your case). The filter you are using exhibits poor stop-band performance.
Elliptic filters are usually the best choice, and they are typically what we provide on our evaluation boards. A 5th-order elliptic filter with a cutoff frequency of ~90MHz is a reasonable choice, but a 7th-order filter is even better.
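To see why a ~96MHz cutoff with a steep stop band matters, the sketch below (an assumption-laden simplification: ideal sampling, images listed only to second order) computes where the Nyquist images of an output tone land:

```python
def nyquist_images(f_out, f_s, n_max=2):
    """Frequencies of the DDS Nyquist images at |n*f_s - f_out| and n*f_s + f_out."""
    images = []
    for n in range(1, n_max + 1):
        images.append(n * f_s - f_out)  # lower image of the n-th alias zone
        images.append(n * f_s + f_out)  # upper image of the n-th alias zone
    return images

# For a 90 MHz output at a 240 MHz sample rate, the nearest image is at 150 MHz,
# only 60 MHz above the fundamental, so the reconstruction filter needs a steep
# transition band -- which is where elliptic filters excel.
print([f / 1e6 for f in nyquist_images(90e6, 240e6)])
```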
After further analysis, it turns out my math was a little off. Below is the accurate response (validated through simulation). Note that the above plot was normalized so that the maximum level is 0dB. The actual maximum level was approximately -9dB (about 7dB higher than the new "accurate" plot below).
Thank you very much. This is very helpful. I am going to redo my filter design.
To my other question: the output level is about -15dBm without the filter. Is this a typical value for the AD9913, and did I miscalculate with my -3dBm figure?
Using dBm to determine the output level is not very useful.
The device puts out a current. The current develops a voltage on whatever load is connected to the output pin. Assuming nothing but the 50Ω load (R) is connected to the output pin and you measure the pk-pk amplitude (Vpp) with an oscilloscope (1MΩ input), then the output level (power delivered into the load) in dBm is:
P(dBm) = 30 + 10*log10[Vpp^2 / (8R)]
Hence, with nothing connected to the output pin other than a 50Ω load and 632mVpp measured across the load, the above formula yields 0dBm.
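A quick sanity check of this formula, and of the earlier -3dBm expectation (a 228mV amplitude is 456mVpp), as a Python sketch:

```python
import math

def dbm_from_vpp(vpp, r_load=50.0):
    """Power in dBm delivered to r_load by a sinusoid with peak-to-peak voltage vpp."""
    return 30 + 10 * math.log10(vpp**2 / (8 * r_load))

print(dbm_from_vpp(0.632))  # ≈ 0 dBm, matching the worked example above
print(dbm_from_vpp(0.456))  # ≈ -2.8 dBm, close to the earlier -3 dBm estimate
```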