
Dither in ADC

Question asked by DRS-des on Dec 15, 2013
Latest reply on Dec 19, 2013 by TomMac

To improve ADC SFDR, dither can be applied in two ways: 1) low-frequency noise is added at the analog input, or 2) wideband noise from an internal DAC is summed with the signal and the digital output is corrected after sampling.
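To make sure I understand method 2, here is how I would sketch subtractive dither in NumPy. The 12-bit mid-tread quantizer and the ~1 LSB uniform dither level are my assumptions for illustration, not any particular ADC:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1 << 14
t = np.arange(n)
sig = 0.5 * np.sin(2 * np.pi * 0.0917 * t)     # test tone, full scale = +/-1.0

def quantize(x, bits=12):
    """Ideal mid-tread quantizer over +/-1 full scale."""
    lsb = 2.0 / (1 << bits)
    return np.clip(np.round(x / lsb) * lsb, -1.0, 1.0 - lsb)

lsb = 2.0 / (1 << 12)
# Known wideband dither is summed ahead of the quantizer...
dither = (rng.random(n) - 0.5) * lsb
# ...and, since the converter knows the DAC codes, subtracted digitally after.
out = quantize(sig + dither) - dither
```

Because the dither is subtracted exactly, the residual error of `out` stays within half an LSB of the undithered signal while the quantization error is decorrelated from the tone.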


When I try the low-frequency analog input noise approach with various ADCs, it seems to degrade the close-in noise around the main signal, as if the ADC sampling clock were much worse. The ADC sample clock is 128 Msps; the input signal is a -1 dBFS tone between 4 and 40 MHz; I add low-frequency noise (DC to < 100 kHz) with a peak < -20 dBFS. The noise does not cause clipping of the ADC. The ADC samples are processed by FFT: the intermod spurs from the 2nd and 3rd harmonics (H2, H3, etc.) are reduced, but the noise close to the main carrier has increased. It acts as if the system LO and/or the ADC sampling clock have higher close-in phase noise.


If there is non-linearity in the ADC front end and quantization process, then low-frequency noise in the dither signal could cross-modulate onto the main signal. That would defeat the usual assumption that the added low-frequency noise can be filtered easily out of the desired passband. Is this correct?
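Here is a quick NumPy model of that suspicion, using my test conditions (128 Msps, a tone in the 4-40 MHz band, noise lowpassed below 100 kHz). The memoryless third-order term `a3` is an assumed stand-in for the front-end nonlinearity, not a real ADC model; its `3*a3*carrier*lf**2` cross-product is exactly what deposits energy right beside the carrier:

```python
import numpy as np

rng = np.random.default_rng(1)
fs, n = 128e6, 1 << 16
t = np.arange(n) / fs
fc = 20e6                                   # tone in the 4-40 MHz band, exact FFT bin
carrier = 0.9 * np.sin(2 * np.pi * fc * t)

# Low-frequency dither: white noise brickwall-lowpassed to 100 kHz.
white = rng.standard_normal(n)
W = np.fft.rfft(white)
f = np.fft.rfftfreq(n, 1 / fs)
W[f > 100e3] = 0.0
lf = np.fft.irfft(W, n)
lf *= 0.1 / np.std(lf)                      # roughly -20 dBFS rms (full scale = 1.0)

a3 = 0.1                                    # assumed 3rd-order front-end coefficient

def front_end(x):
    """Memoryless nonlinearity standing in for the ADC front end."""
    return x + a3 * x ** 3

def close_in_power(x):
    """FFT power in a 30-300 kHz offset band around the carrier."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    fr = np.fft.rfftfreq(len(x), 1 / fs)
    band = (np.abs(fr - fc) > 30e3) & (np.abs(fr - fc) < 300e3)
    return spec[band].sum()

clean = front_end(carrier)                  # energy only at fc and 3*fc (exact bins)
dithered = front_end(carrier + lf)          # 3*a3*carrier*lf**2 lands beside fc
```

In this model the close-in band around the carrier is essentially empty without dither and picks up noise skirts with it, which matches the phase-noise-like degradation I measured.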


How would I model adding low-frequency dither to a main signal in VisualAnalog? There is a noise generator and a filter component, but I do not see how to combine them.
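In case it helps clarify what I am after, the chain I want to build (noise generator, lowpass filter, gain, summer) looks like this in NumPy; the brickwall filter and the scaling to a -20 dBFS peak are my choices, not VisualAnalog behavior:

```python
import numpy as np

rng = np.random.default_rng(2)
fs, n = 128e6, 1 << 16
t = np.arange(n) / fs

# Noise generator -> lowpass filter -> scale -> summer.
white = rng.standard_normal(n)
X = np.fft.rfft(white)
f = np.fft.rfftfreq(n, 1 / fs)
X[f > 100e3] = 0.0                             # brickwall LPF at 100 kHz
lf = np.fft.irfft(X, n)
lf *= 10 ** (-20 / 20) / np.max(np.abs(lf))    # peak at -20 dBFS (full scale = 1.0)

tone = 10 ** (-1 / 20) * np.sin(2 * np.pi * 20e6 * t)   # -1 dBFS tone
adc_in = tone + lf                             # combined ADC input, no clipping
```

With the -1 dBFS tone plus a -20 dBFS noise peak, the composite stays below full scale, matching the no-clipping condition in my measurements.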