A few questions about this DAC:
1) Based on the timing characteristics of the digital interface for Vdd = 3.3V, it should be feasible to update the LTC2757 at 16.66 MHz. However, if the output settling time specification in the datasheet is 2.1 us, does this mean that I should not be updating any faster than 476 kHz (or 555 kHz if optimizing the settling time to 1.8 us per Note 7)?
2) Is the settling time dependent on the step size? The datasheet states conditions for a span code of 000, which is 0-5V, but also specifies a 10V step. In my experiments on the eval board, the size of the step didn't seem to affect the settling time.
3) I am having trouble understanding the graph of multiplying frequency response vs digital code. Why is there an individual line per bit rather than a single line for attenuation vs frequency?
1. You are talking about two different things. You can update the DAC's registers at 16.66 MHz because that is how fast the SPI lines and registers can run, but the output takes 1.8 us to settle after a code change. If the output hasn't fully settled before you write the next code, you will see an error at the output.
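To make the distinction concrete, here is the arithmetic from the question above, assuming the 2.1 us and 1.8 us settling figures (a quick Python sanity check, not a timing model of the part):

```python
# Convert settling time to the maximum update rate at which every step
# fully settles before the next code is latched. The 2.1 us / 1.8 us
# figures are the ones quoted in the question, not re-verified here.

def max_settled_update_rate_hz(settling_time_s: float) -> float:
    """Highest update rate at which each step settles before the next write."""
    return 1.0 / settling_time_s

print(int(max_settled_update_rate_hz(2.1e-6) / 1e3), "kHz")  # 476 kHz
print(int(max_settled_update_rate_hz(1.8e-6) / 1e3), "kHz")  # 555 kHz
```

So the SPI clock (16.66 MHz) bounds how fast you can move data, while the settling time bounds how fast the analog output is actually valid.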
2. Yes, the settling time is code dependent. A full-scale step takes longer to settle than a small code change, so a 0-5V step will be quicker than a 0-10V step. This comes down to the slew rate of the DAC's internal output amplifier.
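As a rough illustration of the slew-rate effect, a crude model treats settling as a slew-limited ramp plus a fixed linear-settling tail. Both numbers below (20 V/us slew rate, 0.8 us tail) are hypothetical placeholders for illustration, not LTC2757 datasheet values:

```python
# Crude settling model: slew-limited ramp + fixed linear-settling tail.
# SLEW_RATE and LINEAR_TAIL are hypothetical, NOT datasheet numbers.

SLEW_RATE_V_PER_US = 20.0  # assumed output-amplifier slew rate
LINEAR_TAIL_US = 0.8       # assumed time to settle once slewing ends

def estimated_settling_us(step_v: float) -> float:
    """Estimated settling time for a given output step, in microseconds."""
    return abs(step_v) / SLEW_RATE_V_PER_US + LINEAR_TAIL_US

for step in (0.1, 5.0, 10.0):
    print(f"{step:5.1f} V step -> ~{estimated_settling_us(step):.2f} us")
```

Note that for very small steps the fixed tail dominates, which may be why the eval-board experiments showed little dependence on step size over the range tested.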
3. If you want the overall frequency response, use the "all bits on" plot. The other curves are there to show the response at different multiplying factors (i.e., different digital codes).
1,2) If the settling time is code dependent, wouldn't it be feasible to increase the update rate by limiting the step size of each code change?
3) Is it correct to interpret the "all bits on" plot as showing approximately 3 dB of attenuation with a 1 MHz code input? If so, what update rate could even be used to gather this data? What about the 10 MHz data point?
I have since had the test concept of the multiplying frequency response vs. digital code explained to me, so you can ignore my questions on point 3.
The update rate is limited by the digital circuitry in the DAC. Going code by code, or in smaller steps, may improve the per-step settling time, but if you need a full-scale step you would have to walk through so many codes that you would lose the advantage of the decreased settling time. Once again, there is no free lunch.
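A back-of-envelope comparison makes the trade-off explicit. The 0.5 us small-step settling time below is a hypothetical placeholder; even with it, walking full scale in 100 sub-steps is far slower than one full-scale step:

```python
# Compare one full-scale step against a staircase of many small steps.
# SMALL_STEP_SETTLE_US is a hypothetical placeholder; the point is the
# bookkeeping, not the exact numbers.

FULL_SCALE_SETTLE_US = 2.1    # one 10 V step, per the figure quoted above
SMALL_STEP_SETTLE_US = 0.5    # assumed settle time for one small sub-step
UPDATE_PERIOD_US = 1 / 16.66  # ~60 ns per write at the 16.66 MHz SPI limit

def ramp_time_us(n_steps: int) -> float:
    """Time to traverse full scale in n_steps equal sub-steps, where each
    sub-step must be both clocked in and allowed to settle."""
    per_step = max(SMALL_STEP_SETTLE_US, UPDATE_PERIOD_US)
    return n_steps * per_step

print(f"single step  : {FULL_SCALE_SETTLE_US:.1f} us")
print(f"100 sub-steps: {ramp_time_us(100):.1f} us")  # 50.0 us vs 2.1 us
```

The staircase only wins if the per-step settling time shrinks faster than the step count grows, which the slew-limited behavior described above works against.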
My reasoning for trying to increase the update rate by restricting each update to smaller code steps is to move the spectrum of digital noise and glitches from the hundreds-of-kHz range up to the several-MHz range, where it can be filtered more aggressively. Would this not be a reasonable thing to do?
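For intuition on the filtering argument: with a first-order RC low-pass, each decade of frequency buys roughly 20 dB of extra attenuation, so moving glitch energy from 300 kHz to 3 MHz does help. The cutoff frequency and glitch frequencies below are illustrative assumptions, not values from the datasheet or the thread:

```python
# First-order RC low-pass attenuation at a given frequency:
# |H(f)| = 1 / sqrt(1 + (f/fc)^2), expressed in dB.
# FC and the glitch frequencies are illustrative, not datasheet values.
import math

def rc_attenuation_db(f_hz: float, fc_hz: float) -> float:
    """Attenuation (negative dB) of a first-order RC low-pass at f_hz."""
    return -20 * math.log10(math.sqrt(1 + (f_hz / fc_hz) ** 2))

FC = 50e3  # assumed reconstruction-filter cutoff, 50 kHz
for f in (300e3, 3e6):
    print(f"{f / 1e6:4.1f} MHz glitch: {rc_attenuation_db(f, FC):6.1f} dB")
```

With these assumed numbers the 300 kHz glitch is attenuated by about 16 dB and the 3 MHz glitch by about 36 dB, i.e. roughly 20 dB more filtering for the decade of frequency gained.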