
High bit count ADCs

I am delving into the world of higher-resolution ADCs. Anything up to 16 bits is straightforward and easy to implement in practice.

Using the same breadboard (I understand the noise and other issues involved), I did a cursory evaluation of, for instance, the LTC2420 20-bit ADC and its pin-for-pin replacement, the LTC2400 24-bit chip.

I seem to get the same practical resolution of approximately 18-19 bits from both chips (used at the slowest sample mode available).

Is this reasonable? Why would I use the 24-bit version if the 20-bit one gives me the same apparent results?

I am trying to understand the differences because I would like more than 19 bits of actual resolution if possible.

Any insight would be helpful.

Thanks in advance.


  • Hi,

    Apologies for the delay. May I know if you are looking for a specific ADC, or do you just want to understand why you would choose a higher-bit version when you only need 1-2 bits more than you are currently getting? Is that correct?

    I would say it really depends on your application and requirements. Low-bandwidth, high-resolution ADCs may have a resolution of 16 bits or 24 bits, for example, but the effective number of bits of a device is limited by noise. We usually specify peak-to-peak resolution, which is the number of flicker-free bits, since most applications do not want to see code flicker at the system output. Some ADCs have a configurable output data rate (ODR), gain, filter type, etc., all of which affect the effective resolution. So you are correct: the lower the speed, the better the noise performance, and therefore the better the noise-free bit resolution. Noise-free resolution is usually detailed in the datasheet for each ODR, gain, and filter-type selection.

    So if the 20-bit ADC suits your requirements at the lower speed, then that would be fine. We usually use a much higher-resolution part when we want to cover the entire operating ODR range, or when we do not want to see code flicker at the target resolution.
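    As a rough illustration of how RMS noise maps to effective and noise-free (peak-to-peak) bits: the 2.5 V span and 1.2 µV RMS noise figure below are made-up example numbers, not taken from either datasheet, and the 6.6 crest factor is the common rule of thumb for converting RMS to peak-to-peak noise.

    ```python
    import math

    def effective_resolution(full_scale_v, rms_noise_v):
        # Effective resolution in bits: log2(full-scale span / RMS noise)
        return math.log2(full_scale_v / rms_noise_v)

    def noise_free_resolution(full_scale_v, rms_noise_v, crest_factor=6.6):
        # Peak-to-peak ("flicker-free") resolution: RMS noise is scaled by
        # ~6.6 to estimate peak-to-peak noise, costing about 2.7 bits.
        return math.log2(full_scale_v / (crest_factor * rms_noise_v))

    # Hypothetical example: 2.5 V span, 1.2 uV RMS input-referred noise
    fs = 2.5
    noise = 1.2e-6
    print(f"effective:  {effective_resolution(fs, noise):.1f} bits")
    print(f"noise-free: {noise_free_resolution(fs, noise):.1f} bits")
    ```

    This is why a "24-bit" and a "20-bit" part with similar input-referred noise can both land around 18-19 flicker-free bits on the same board: the noise floor, not the nominal bit count, sets the usable resolution.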