We designed a data acquisition system and noticed an overshoot with a ~3 us time constant in the ADC data. It is a small effect, but it biases our data quality. I spent some time identifying the source of the time constant.
The source of the time constant seems to be the AD6645 ADC itself.
An AD8138 configured as a single-ended-to-differential amplifier drives the AD6645, which samples at 100 MSPS.
The signal source is a pulse generator with a 50 ns rise time (our application is in the time domain).
I have two plots showing the signal traces: one is an overview and the second is zoomed in on the overshoot.
x axis: time in 200 ps/step (scope time base). The AD6645 data is stretched to match the scope time base.
y axis: volts, scaled and offset to the AD6645 input.
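For reference, the stretching is just a unit conversion: at 100 MSPS the AD6645 produces one sample every 10 ns, i.e. 50 scope ticks at 200 ps/step. A minimal sketch (function and offset names are my own, for illustration only):

```python
# Sketch: mapping AD6645 sample indices onto the scope time base.
SCOPE_STEP_S = 200e-12                 # 200 ps per scope tick
ADC_RATE_HZ = 100e6                    # AD6645 sampling at 100 MSPS
ADC_PERIOD_S = 1.0 / ADC_RATE_HZ       # 10 ns between ADC samples
TICKS_PER_SAMPLE = ADC_PERIOD_S / SCOPE_STEP_S   # 50 scope ticks per sample

def adc_index_to_scope_tick(n, offset_ticks=0.0):
    """Scope-tick position of ADC sample n; offset_ticks aligns the traces."""
    return offset_ticks + n * TICKS_PER_SAMPLE
```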
The traces are:
1. single ended input to the AD8138 with a differential probe between the signal and ground.
2. input to the AD6645 with a differential probe between AIN and AINbar.
3. data read from the AD6645.
As the plots show, the AD6645 data has an overshoot that is not present at the analog input. The time constant seems pretty long for an ADC (10000 x-axis ticks = 2 us).
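For what it's worth, I estimated the time constant with a simple log-linear exponential fit on the settling tail of the ADC record. A sketch of the method (shown here on synthetic data with an assumed 2 us decay, not my actual record):

```python
# Sketch: estimating an overshoot time constant by fitting
# y(t) = y_final + A * exp(-t / tau) to the settling tail.
import numpy as np

def fit_time_constant(t, y, y_final):
    """Log-linear least-squares fit of the decay; returns tau in seconds."""
    resid = y - y_final                       # must be positive on the tail
    slope, _ = np.polyfit(t, np.log(resid), 1)  # slope = -1/tau
    return -1.0 / slope

# Synthetic check: a 2 us decay sampled at the AD6645's 10 ns period.
t = np.arange(0.0, 10e-6, 10e-9)
y = 1.0 + 0.05 * np.exp(-t / 2e-6)            # settles to 1.0 V
tau = fit_time_constant(t, y, 1.0)
```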
Is there something we might be doing wrong in the operation of the ADC, or is this intrinsic to the AD6645?