We in the analog world tend to think of the analog-to-digital converter (ADC) as the most technically challenging and bizarre animal to deal with. This is true to a certain extent. Because the ADC "straddles" the analog and digital worlds, design trade-offs and decisions must be made to keep the digital gremlins from making a mess of the analog meal (the signal).
Despite these complexities and intricacies, I still have to say that ADCs have become pretty easy and "plug and play" to implement. There are situations, however, where the ADC stops working or behaves erratically. In these situations, it is quite normal to expect the worst-case scenario and start worrying about the more complicated circuitry inside the ADC.
In these scenarios, however, I have come to agree with the principle of Occam's razor: the simplest explanation is often the one that accounts for the ADC's behavior. This RAQ discusses exactly that.