I have an AD8421 instrumentation amplifier running from a +5 V rail and ground. The input signal has a common-mode value of 2.929 V with a small differential signal: the inputs sit at 2.928 V and 2.931 V, a differential of 3 mV. The gain is 100 and Vref is set to 2.500 V. The output sits at 2.817 V (with respect to ground), which is reasonable (317 mV above the 2.500 V reference).
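As a quick sanity check on those numbers, here is a minimal sketch of the standard in-amp transfer function, Vout = G * Vdiff + Vref (the values are taken from the figures above):

```python
# Sanity check on the expected output of the in-amp stage.
# Standard instrumentation-amplifier transfer function: Vout = G * Vdiff + Vref
G = 100                      # configured gain
v_in_p = 2.931               # non-inverting input, volts
v_in_n = 2.928               # inverting input, volts
v_ref = 2.500                # reference pin voltage, volts

v_diff = v_in_p - v_in_n     # 3 mV differential
v_out_expected = G * v_diff + v_ref
print(f"expected Vout = {v_out_expected:.3f} V")  # ~2.800 V vs. 2.817 V measured
```

The small difference between the ideal 2.800 V and the measured 2.817 V is plausibly offset and the effect described below, so the DC operating point itself looks sane.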
Looking at Figure 13 of the data sheet (and shifting the axes to represent a 0 to 5 V supply rather than ±2.5 V), the operating point is well within the red hexagon drawn.
The problem is this: the inputs are fed from a pair of RC low-pass filters (the whole application is DC), each comprising a 1 kΩ series resistor and a 100 nF capacitor to ground. Measuring across each 1 kΩ resistor shows a drop of 6 mV, i.e. a bias current of 6 µA flowing into both the inverting and non-inverting inputs. This is the problem: the bias current for these devices should be a few nanoamperes, and this voltage drop across the resistors is spoiling the DC accuracy of the circuit.
Furthermore, if I increase the common-mode voltage to 3.5 V with the same 3 mV differential, the bias current (measured in the same way) increases to 190 µA. That is a real problem! Again, according to Figure 13 of the data sheet, this operating point should be fine.
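To put numbers on it, the input currents implied by the measured resistor drops are just Ohm's law across the 1 kΩ series resistors (the 190 µA figure implies roughly 190 mV dropped across each resistor):

```python
# Input current implied by the measured drop across each 1 kOhm series resistor.
R_SERIES = 1_000  # series resistor of the RC input filter, ohms

# (common-mode voltage, measured drop across the resistor)
measurements = [(2.929, 6e-3), (3.5, 190e-3)]

for v_cm, v_drop in measurements:
    i_in = v_drop / R_SERIES  # Ohm's law: I = V / R
    print(f"Vcm = {v_cm} V: drop = {v_drop * 1e3:.0f} mV -> I = {i_in * 1e6:.0f} uA")
```

Both figures are three to five orders of magnitude above the nanoampere-level bias current the data sheet leads one to expect, which is why the drop is so damaging to DC accuracy.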
Any ideas why this happens? I have done the obvious things like replacing the silicon. When I remove the part, the voltage across the 1 kΩ resistors drops to zero, so it is not the capacitors leaking.
Looking forward to any help!