I am using the AD8237 in a design to measure very sensitive differential sensor responses; it is the first stage of my analog front end. The part works great with very small signals, but I'm now noticing that as I increase the input frequency OR the input signal magnitude, the AD8237's output goes from looking perfectly clean to oscillating between 0 V and Vcc, which is 3.3 V.
Here is a basic schematic picture of the circuit:
The entire system, including the AD8237, is powered from Vcc = 3.3 V. The reference is set to 0.5 × Vcc with a 10 kΩ resistor divider. The AD8237 drives another stage, a 10 kHz Sallen-Key low-pass filter. My goal is to AC-couple through as much signal content in the ~0 to 10 kHz range as possible. I chose this device because it was really the only 3.3 V-capable single-supply instrumentation amp I could find, and it was recommended by ADI a while ago when I asked for recommendations.
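Just to confirm the reference numbers, here is the quick sanity check I did on the divider (nothing exotic, just mid-supply and the source impedance the REF pin sees):

```python
# Sanity check of the REF divider (values from my schematic).
VCC = 3.3     # supply voltage, volts
R_TOP = 10e3  # upper divider resistor, ohms
R_BOT = 10e3  # lower divider resistor, ohms

v_ref = VCC * R_BOT / (R_TOP + R_BOT)           # divider output voltage
r_thevenin = (R_TOP * R_BOT) / (R_TOP + R_BOT)  # impedance seen by REF

print(f"REF = {v_ref:.2f} V, source impedance = {r_thevenin / 1e3:.1f} kOhm")
```

So the REF pin sits at 1.65 V with a 5 kΩ Thevenin source impedance.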
The feedback loop is closed by a digital pot, which goes where the schematic says "U4 Pot Here". It is an AD5272-100; the output of the AD8237 connects to pin 2 of the AD5272, and pin 3 of the AD5272 connects to the FB pin of the AD8237. I just change the pot's value to set my gain. Hopefully that makes sense even though it isn't shown in the schematic.
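To make the gain setup concrete, this is how I'm modeling it: the pot acts as the top feedback resistor R2 (OUT to FB), and the gain follows the usual G = 1 + R2/R1. The R1 value below (FB to REF, not shown in the schematic) is an assumed placeholder for illustration; with R1 ≈ 1.2 kΩ the numbers roughly match the gains I'm seeing:

```python
# Gain model with the AD5272-100 as the feedback resistor R2 (OUT -> FB).
# R1 is a HYPOTHETICAL value chosen for illustration, not a measured one.
R1 = 1.2e3          # assumed FB-to-REF resistor, ohms (placeholder)
R_POT_FULL = 100e3  # AD5272-100 end-to-end resistance, ohms
N_STEPS = 1024      # AD5272 resolution (1024 positions)

def gain(code):
    """Gain with the pot as R2: G = 1 + R2/R1."""
    r2 = R_POT_FULL * code / (N_STEPS - 1)
    return 1 + r2 / R1

print(gain(0))     # pot at minimum -> gain of 1 (plus wiper resistance effects)
print(gain(1023))  # pot at full scale -> gain around 84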
Here are some screenshots to illustrate what I'm talking about:
This screenshot shows the input (blue) and the output (yellow). The blue trace is the IN+ channel; the IN− channel of the AD8237 is grounded. I'm just looking at a sine wave from a signal generator, which is single-ended rather than differential, which is why I grounded the negative input pin. I think this is OK since the inputs are AC coupled. This is what I see from 1 kHz to 8 kHz: everything looks good.
When I bump the frequency up to 15 kHz I see this. I start seeing this pretty much right after 8 kHz.
Here is a zoomed in picture showing the large oscillation.
Now, I've also noticed that if I attenuate the input signal and leave the frequency unchanged (still 15 kHz here), the problem goes away once the input is small enough.
If I then increase the input even the smallest bit from the last screenshot, I see this.
I have the amp in low bandwidth mode, which the datasheet says can handle a gain of 1, and the pot in the feedback is set to its minimum value in all of this, so the gain should be fairly low (between 1 and 2). Unfortunately, I've spent so much time testing very small input signals that I missed this problem.
I've tried putting a resistor (100 Ω up to 1.4 kΩ) in series with the output of the AD8237, because I've previously had an amp oscillate while driving the capacitance of a pot.
I've also noticed that increasing the gain by increasing the pot's value lets me output a larger undistorted signal. The distortion seems to depend mostly on input frequency and input magnitude.
For example: I can input a small signal, amplify it with a large gain (~80), and everything looks OK. If I instead try to pass a signal of the same output magnitude by increasing the input amplitude (with the gain set to roughly 1), it becomes distorted well before I approach the same Vpp I got using the gain. Said differently: as I increase the pot's resistance I can apply a gain that gives me a fairly large output signal, but I can't pass anything more than a few millivolts at the input.
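For what it's worth, the amplitude-dependent part of this made me run a standard large-signal (slew-rate) sanity check: the largest sine a stage can follow without distortion is V_pk = SR / (2πf). The slew-rate value below is a placeholder I picked for illustration, NOT a number I've confirmed from the AD8237 datasheet:

```python
import math

# Back-of-the-envelope slew-rate (large-signal bandwidth) check.
# SLEW_RATE is a PLACEHOLDER, not the AD8237 spec -- substitute the real
# datasheet value for the chosen bandwidth mode.
SLEW_RATE = 0.05e6  # assumed slew rate, V/s (i.e., 0.05 V/us, hypothetical)

def max_sine_vpk(freq_hz, slew_rate=SLEW_RATE):
    """Largest undistorted sine peak amplitude: V_pk = SR / (2*pi*f)."""
    return slew_rate / (2 * math.pi * freq_hz)

for f in (1e3, 8e3, 15e3):
    print(f"{f / 1e3:>4.0f} kHz -> max clean peak ~ {max_sine_vpk(f):.3f} V")
```

With that assumed slew rate, the allowable swing drops to roughly 1 V peak by 8 kHz and about 0.5 V peak at 15 kHz, which would match the pattern of small signals passing cleanly while larger ones fall apart as frequency rises.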
Does anyone have any insight into what is going on? It's probably something obvious to an analog designer. Ideally I'd like customers to be able to input a sine wave just to test out the channel and have it behave. I thought the datasheet said it had a gain-bandwidth product of 10 × 20 kHz = 200 kHz? Any help is greatly appreciated!