I want to ask which ADC (20-24 bit) is considered the most accurate?
Also, is a 24-bit ADC significantly more accurate than the AD7763?
My apologies for getting back to you late.
Before I answer your query, there are a few things I'd like to know first. Since you are asking about accuracy, what do you really mean? Is it DC accuracy?
I'd also appreciate it if you could provide some details on the following:
I hope to hear from you.
Better accuracy is not necessarily achieved with more bits of ADC resolution. Some of the things that affect accuracy are the design of the ADC and how it is used in a circuit (including sample rate, reference accuracy and stability, how it is driven by the input signal, PCB layout, etc.), among others. Also, some people take "accuracy" to mean something different than other people do. By "accuracy" I am assuming you mean how far the actual transfer function of the ADC deviates from the ideal.
Typically, the two largest accuracy errors can be adjusted out. They are offset and gain errors. However, both of these have a temperature coefficient, meaning the errors change when temperature changes. The change is small, but when you are looking to maximize accuracy, those changes may or may not be a problem. Linearity error (INL) is another thing contributing to accuracy error, and it also has a temperature coefficient. The accuracy and temperature coefficient of the ADC reference voltage, and of any amplifier or other components between the signal source and the ADC input, will also affect circuit accuracy.
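To illustrate how offset and gain errors can be adjusted out, here is a minimal sketch of a two-point calibration. The function name and the example codes/voltages are hypothetical, not from any specific part's datasheet; it assumes the ADC transfer function is well modeled as linear between the two calibration points (INL and temperature drift are not corrected by this).

```python
def two_point_cal(code_lo, v_lo, code_hi, v_hi):
    """Solve for (gain, offset) so that v = gain * code + offset
    maps the two measured raw codes onto the two known input voltages."""
    gain = (v_hi - v_lo) / (code_hi - code_lo)
    offset = v_lo - gain * code_lo
    return gain, offset

# Hypothetical example: an ADC reads 40 counts with 0 V applied
# and 65400 counts with 2.5 V applied.
gain, offset = two_point_cal(40, 0.0, 65400, 2.5)

# Correct a raw mid-scale reading using the calibration.
corrected_voltage = gain * 32720 + offset
```

In practice the two calibration points would be measured with a source at least as accurate as the target accuracy, and the calibration repeated over temperature if the drift matters.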
It is not possible to provide complete information in this forum about designing for best performance, but you should realize that going beyond 18 to 20 bits USUALLY provides very little in the way of increased accuracy. The most common reason for going to higher resolutions is an increase in dynamic range. Realize that, in the IDEAL case, 16 bits is one in 65,536 counts, or an ideal maximum error of just 0.0015%; 20 bits provides an ideal maximum error of just 0.000095%; and 24 bits provides an ideal maximum error of just 0.000006%. Those are pretty small errors and, in the grand scheme of things, there is no significant difference between 20 and 24 bits except for the dynamic range. Other things in the circuit will swamp out the accuracy differences between 20 and 24 bits.
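The percentages above come straight from the resolution arithmetic: the ideal maximum quantization error of an N-bit converter is one part in 2^N of full scale. A few lines of Python (names are my own, for illustration) reproduce the figures quoted:

```python
def ideal_max_error_percent(bits):
    """Ideal maximum error (1 LSB as a fraction of full scale) in percent
    for an N-bit ADC: 100 / 2^N."""
    return 100.0 / (2 ** bits)

for bits in (16, 20, 24):
    print(f"{bits} bits: 1 in {2 ** bits:,} counts, "
          f"ideal max error ~{ideal_max_error_percent(bits):.6f}%")
```

This is the ideal case only; real offset, gain, INL, and reference errors sit on top of these numbers, which is why they swamp the difference between 20 and 24 bits.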