I am using the AD9954 to generate a continuous 1 kHz sine wave from an external 10 MHz clock. I chose the DAC output network based on the Excel spreadsheet accompanying the TI app note
"Interfacing op amps to high-speed DACs, Part 1: Current-sinking DACs" (linked here). I set the output current to 5 mA, AVDD to 1.8 V, and 1 Vpp for both Vdac (on the IOUT pin) and Vout (after the op amp). Below is a screenshot of the circuit.
Unfortunately, I measured poor THD on the scope, around 3% or -30 dB, which I assume results from the design of the filter circuit. I suspect the voltages are out of compliance for the IOUT pins, but I am not sure.
I have included a screenshot from the scope: yellow is Vout (after the op amp), and red and blue are IOUT and -IOUT. The clipping in the IOUT signals is clearly visible.
Any help you could offer would be greatly appreciated. I was expecting to get at least -60 dB THD!
The problem is the dc biasing, which you must ensure is correct. Note that the MT-019 diagram will not work as shown, because it is for a current-sourcing DAC (the AD9954 uses a current-sinking DAC).
My initial post was incorrect: I used 5 mA instead of 2.5 mA to calculate the dc bias. My bad. A recalculation yields a 1.8 V dc bias at each IOUT pin (in agreement with your simulation).
The IOUT/~IOUT pins are VDD (1.8 V) referenced (that is, the internal current sources sink current from an external VDD source). Based on your schematic, I calculate the dc-bias point at the IOUT and ~IOUT pins to be about 1.2 V each, which is 600 mV below VDD.
The voltage compliance limits are VDD ± 500 mV, but the dc-bias point is VDD − 600 mV. That puts the pins 100 mV outside the compliance range, hence the excessive distortion.
You will need to redesign the output circuit to provide a 1.8 V dc bias and constrain the swing to 500 mV peak on each pin (IOUT and ~IOUT).
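As a quick sanity check of that bias calculation, here is a hedged sketch: the 240 Ω equivalent load is an illustrative value chosen to reproduce the 1.2 V figure, not a value read from your schematic.

```python
# Sketch: dc operating point of one IOUT pin, modelled as a current
# sink pulling through an equivalent load resistance to VDD.
# R_LOAD is an illustrative assumption, not taken from the schematic.
VDD = 1.8          # supply, volts
I_FS = 5e-3        # full-scale sink current, amps
I_MID = I_FS / 2   # midscale (dc average) current, 2.5 mA
R_LOAD = 240       # assumed equivalent load resistance to VDD, ohms

v_pin = VDD - I_MID * R_LOAD        # dc-bias point at the pin
compliance_lo = VDD - 0.5           # lower compliance limit, 1.3 V
in_compliance = v_pin >= compliance_lo

print(f"dc bias = {v_pin:.2f} V, in compliance: {in_compliance}")
# prints: dc bias = 1.20 V, in compliance: False
```

The point is simply that any load network whose midscale drop exceeds 500 mV pushes the pin out of compliance.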
Hey, thanks very much! It's weird, because the calculator in that spreadsheet from the TI app note takes the VDAC dc-bias point as an input, which I set to VDD, so it should have produced values that give the correct bias point. Reading more around the subject in other Q&As here, I noticed a difference in the recommended circuits for DAC interfacing. I also checked the TINA/SPICE model they provide, and I get a 1.25 V dc bias.
The one I used is this one:
but MT-019 suggests the one below, with slightly simpler biasing and a capacitor filter:
Is there any reason to suggest one might perform better than the other?
Further, do you think I would see a noticeable improvement if I switched to a 10 mA output instead of 5 mA?
Thanks again for your help
10 mA provides the best spurious performance, but 5 mA should be fine. Even with 5 mA you should see harmonics at levels below -50 dBc.
A slight correction to my earlier post...
Because you have an external 3.3 V source available, you can get away with dc-biasing the IOUT and ~IOUT pins at 2.3 V (for VDD = 1.8 V). In this way, when an output pin is sinking 0 mA, the output sits at 2.3 V. However, make sure the equivalent load resistance is such that the output is 1.3 V when the full-scale current is flowing. This provides a 1 Vpp swing while still satisfying the voltage compliance.
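One way to realize that bias is a two-resistor Thevenin divider from the 3.3 V rail. This is a sketch under that assumption; the divider topology and resistor names are mine, not from the datasheet.

```python
# Size a two-resistor divider from the 3.3 V rail so each IOUT pin
# sits at 2.3 V with 0 mA sunk and 1.3 V at full-scale (5 mA).
# The divider topology and names are illustrative assumptions.
V_EXT = 3.3    # external rail, volts
V_ZERO = 2.3   # pin voltage at 0 mA
V_FS = 1.3     # pin voltage at full-scale current
I_FS = 5e-3    # full-scale sink current, amps

R_TH = (V_ZERO - V_FS) / I_FS               # Thevenin resistance, 200 ohms
k = V_ZERO / V_EXT                          # required divider ratio
R_TOP = R_TH / k                            # 3.3 V rail -> pin
R_BOT = R_TOP * V_ZERO / (V_EXT - V_ZERO)   # pin -> ground

print(f"R_TOP = {R_TOP:.0f} ohm, R_BOT = {R_BOT:.0f} ohm")
# prints: R_TOP = 287 ohm, R_BOT = 660 ohm
```

You would then pick the nearest standard values and re-check the two endpoint voltages.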
silly mistake about the source/sink differences!
Thanks for clarifying the point about the DC biasing. One point I still don't quite understand is how to calculate the value of the DC bias. I made an LTspice model in which, for my values, the two VDAC+/- nodes are centred around 1.8 V (link to ASC file), but you mentioned 1.2 V earlier based on this circuit.
which gives the following output:
The output here is as you describe it should be: VDAC (blue line) is 2.3 V when 0 mA is being sunk (light blue line) and 1.3 V when 5 mA is being sunk. In reality, though, the scope trace in my original post sits between 1.2 and 2.2 V (clipped).
Could you give me a hint as to what I'm missing?
That said, I would expect your original circuit to yield much better results.
Also, to be clear about my second post: the 2.3 V bias applies when the DAC sinks 0 mA (it is not the bias at the 2.5 mA center point).
As an experiment, try recalculating the resistor bias network to yield a 1.8 V center point but a peak swing of 250 mV (instead of 500 mV). If that solves the problem, it means you were simply operating too close to the voltage compliance limit.
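For what it's worth, the arithmetic behind that experiment can be sketched as follows (using the thread's numbers; the equivalent-load figure is derived, everything else is as stated above):

```python
# Check the suggested experiment: 1.8 V center point, 250 mV peak swing,
# 5 mA full-scale current (so the current swings +/-2.5 mA about midscale).
VDD = 1.8      # supply, volts
I_FS = 5e-3    # full-scale sink current, amps
V_PK = 0.25    # desired peak swing per pin, volts

R_EQ = V_PK / (I_FS / 2)         # equivalent load per pin: 100 ohms
v_min = VDD - V_PK               # most negative pin excursion, 1.55 V
margin = v_min - (VDD - 0.5)     # headroom above the compliance limit

print(f"R_EQ = {R_EQ:.0f} ohm, worst-case margin = {margin*1000:.0f} mV")
# prints: R_EQ = 100 ohm, worst-case margin = 250 mV
```

With 250 mV of headroom on each side, any residual distortion should no longer be a compliance artifact.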
Thanks for the clarification! I understand it now and feel a lot better
I had a similar thought, and reducing the swing to 250 mV on IOUT gives much better results! I also changed the gain to bring Vout back up to 500 mV as before.
I now get 0.126% or -58 dB THD as measured on the oscilloscope, about 20 times better. Thanks a lot for your continued help!