I am looking at a few of the LTC/AD SAR ADCs for a new project. I haven't decided which particular parts I will order and experiment with just yet. The LTC2378, LTC2387, LTC2380 and AD7960 look interesting. But I am wondering if the distortion and noise performance of these newer SARs (in general) might be improved by applying a large(ish) external subtractive dither.
I presume the ADC outputs are already dithered internally. I am guessing this is a pseudo-random sequence that is 'non-subtractively' added to the output? If I were able to determine/obtain the internal sequence used, would it then be possible to remove/cancel it and return the LSBs to their original undithered values? I would then inject the larger dither upstream, 'chopped' in between the acquisition periods (at fs/2), and subtract it from the output.
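For what it's worth, here is a quick numpy sketch of the basic subtractive scheme (known dither added before the quantizer, subtracted from the output codes afterwards). It's only an ideal quantizer model — no S/H, no chopping, and the bit depth, signal level and dither span are arbitrary choices — but it shows the effect on harmonic spurs for a low-level sine:

```python
import numpy as np

rng = np.random.default_rng(0)
n_bits = 8                      # toy resolution so the effect is obvious
lsb = 2.0 / (1 << n_bits)       # full scale is [-1, +1)

def quantize(x):
    # ideal mid-tread quantizer, clipped to the converter's range
    q = np.round(x / lsb) * lsb
    return np.clip(q, -1.0, 1.0 - lsb)

N = 1 << 14
t = np.arange(N)
# a sine of only ~1.3 LSB amplitude: plain quantization of this is
# grossly nonlinear and throws up strong harmonic spurs
k_fund = 127
x = 0.01 * np.sin(2 * np.pi * k_fund / N * t)

plain = quantize(x)

# large(ish) subtractive dither: a known uniform sequence spanning
# many LSBs, added before the quantizer and subtracted digitally after
d = rng.uniform(-32 * lsb, 32 * lsb, N)
dithered = quantize(x + d) - d

def spur_dbc(y):
    # worst non-fundamental FFT bin relative to the fundamental, in dB
    s = np.abs(np.fft.rfft(y * np.hanning(N)))
    fund = s[k_fund - 2:k_fund + 3].max()
    s[k_fund - 3:k_fund + 4] = 0.0   # blank the fundamental (Hann leakage)
    s[:3] = 0.0                      # and DC leakage
    return 20 * np.log10(s.max() / fund)

print(f"plain:    worst spur {spur_dbc(plain):6.1f} dBc")
print(f"dithered: worst spur {spur_dbc(dithered):6.1f} dBc")
```

With the subtraction, the residual error is bounded by the quantizer's own step (here ±LSB/2) but is decorrelated from the signal, so the harmonic spurs collapse into a flat noise floor. Of course this ideal model says nothing about whether dither averages out a real SAR's CDAC/INL errors, which is the question at hand.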
From my reading, I understand this sort of dithering approach cannot improve the distortion in the driver/buffer at the front end, and that makes sense. But it can clean up errors in the S+H if I understand correctly. So, assuming very linear amps are used and the ADC itself is the limiting distortion element, might there be gains to be had?
A good implementation might be a bit of work. But if it could kill more of the spurious rubbish (particularly as input amplitudes head towards 0 dBFS) and also gain a bit of extra SNR from the subtraction, it might be worth the extra design effort?
Dither appears to be magic (/witchcraft)... especially subtractive dither, which seems like cheating. SARs look like they might be good candidates for this approach? Am I missing something? What are your thoughts?
Thanks and happy end of 2020,