Let me first give a quick overview of the system. I am using an AD7983 ADC to read 20 multiplexed signals, running the ADC at 1MSPS. A diode clamp to ground sits ahead of the buffer driving the ADC; it holds the ADC input at about -0.6V during a negative overvoltage further up the signal chain. That is beyond the -0.3V absolute maximum rating of the AD7983, but I am limiting the current into the ADC to less than the 150mA specified on page 14 of the datasheet.
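For reference, here is the back-of-the-envelope fault-current calculation behind that last claim. The resistor value and fault voltage below are hypothetical placeholders (my actual values differ), just to show the sizing logic:

```python
# Illustrative clamp-current estimate; R_SERIES and V_FAULT are
# assumed example values, not the actual circuit values.
V_FAULT = -15.0      # assumed worst-case negative overvoltage (V)
V_CLAMP = -0.6       # clamp diode forward drop below ground (V)
R_SERIES = 1000.0    # assumed series resistor ahead of the clamp (ohms)

# Once the clamp conducts, the series resistor sets the fault current.
i_fault = (V_CLAMP - V_FAULT) / R_SERIES   # amps
print(f"fault current ~ {i_fault * 1e3:.1f} mA")  # 14.4 mA, well under 150 mA
```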
When I have a negative overvoltage on one channel (let's call it CH1) and 0V (ground) on another (let's call it CH2), I see an odd phenomenon. If I switch in CH2 (0V), wait for it to settle, bring the convert (CNV) pulse high, wait the required 10ns for the ADC to enter conversion mode, and then switch the mux to CH1 (-0.6V), an offset of around 6.5mV appears on my sample. It is as if the negative overvoltage is influencing the conversion of the previous channel.
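To make the ordering unambiguous, here is the sequence as a timeline. The 500ns settling allowance is an assumed placeholder; only the 10ns CNV-to-conversion delay comes from the datasheet:

```python
# Timeline of the per-sample sequence described above.
# Times in ns; the 500ns settle figure is an assumed example value.
timeline = [
    (0,   "switch mux to CH2 (0V)"),
    (500, "CH2 settled (assumed 500ns allowance); bring CNV high"),
    (510, "conversion underway (10ns after CNV rising edge)"),
    (510, "switch mux to CH1 (-0.6V); offset appears on the CH2 sample"),
]
for t, event in timeline:
    print(f"t = {t:3d} ns : {event}")
```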
The datasheet and app note AN-931 specifically state that the inputs are disconnected from the comparator during the "Conversion" phase.
The best I can figure is that some odd forward biasing is going on: despite the inputs being disconnected from the comparator, the negative overvoltage is still influencing the previous sample.
I notice that if I increase the delay between bringing CNV high and switching the mux, the offset on the previous sample starts to shrink; by around 120ns of delay it is gone. Since I step through all 20 channels at 1MSPS (a 50kHz full-scan rate), each channel gets only about 1µs, so 120ns of wait time is a good portion of my settling budget (12%).
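The timing budget above works out as follows (assuming my interpretation that the 20 channels are stepped at the full 1MSPS rate):

```python
# Timing budget: 20 channels stepped at 1MSPS -> 50kHz full-scan rate,
# 1us per channel slot, of which the 120ns wait consumes 12%.
f_sample = 1e6                    # ADC sample rate (Hz)
n_channels = 20
t_channel = 1.0 / f_sample        # time per channel slot (s)
f_frame = f_sample / n_channels   # full-scan (frame) rate (Hz)
t_wait = 120e-9                   # delay that makes the offset vanish (s)

print(f"frame rate   : {f_frame / 1e3:.0f} kHz")    # 50 kHz
print(f"channel slot : {t_channel * 1e9:.0f} ns")   # 1000 ns
print(f"wait fraction: {t_wait / t_channel:.0%}")   # 12%
```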
Can you please explain what is going on here, how I can predict what will happen across different lots and over temperature, and whether there is a solution (besides waiting longer)?