Effect of buffer amp on ADC distortion?

Does the buffer amplifier affect the distortion of the ADC? If so, by how much? Which amplifier specifications must I consider?

  • The input source usually needs a buffer amplifier with low output impedance to isolate it from the ADC's input impedance. The buffer's output impedance affects the AC performance of the ADC, especially the level of total harmonic distortion (THD). A high source impedance increases THD because the input of an ADC with signals swinging 2.5 V typically presents a significant, nonlinear input capacitance.

    THD degrades in proportion to the source impedance, so the maximum allowable source impedance in series with the ADC's input depends on how much THD can be tolerated. The ADC driver itself must also have very low inherent THD, well below that of the ADC (i.e., better than 16-bit accuracy). For example, the combined THD of the AD7671/AD8021 pair is typically –100 dB at both 20 kHz and 250 kHz. A sketch of how such contributions combine follows this list.

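As a minimal numerical sketch of the point above: uncorrelated distortion contributions from the driver and the converter combine roughly as a root sum of squares of their linear ratios, which is why a driver with THD well below the ADC's barely moves the combined figure. The –100 dB and –115 dB values here are illustrative assumptions, not datasheet numbers.

```python
import math

def thd_db_to_ratio(thd_db: float) -> float:
    """Convert a THD figure in dB relative to the fundamental (e.g. -100)
    to a linear amplitude ratio."""
    return 10 ** (thd_db / 20)

def combined_thd_db(*stages_db: float) -> float:
    """Root-sum-square combination of uncorrelated distortion
    contributions, each given in dB relative to the fundamental."""
    total = math.sqrt(sum(thd_db_to_ratio(d) ** 2 for d in stages_db))
    return 20 * math.log10(total)

adc_thd_db = -100     # assumed ADC-only THD, dB
driver_thd_db = -115  # assumed driver THD, kept well below the ADC's

print(f"Combined THD: {combined_thd_db(adc_thd_db, driver_thd_db):.1f} dB")
# -> about -99.9 dB: a driver 15 dB better than the ADC adds
#    only ~0.1 dB to the overall distortion
```

The same calculation run with a driver whose THD is comparable to the ADC's (say –100 dB for both) gives about –97 dB combined, which is why the answer above stresses choosing a driver well below the converter's own distortion floor.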