I'm measuring current with a Hall-effect sensor (LEM HAIS-100P), with an AD820 as a buffer amplifier to give a low-impedance reference to the AD8422. I built my circuit as seen in the ADIsim pic below. The output (where the 1k resistor is) goes to an Arduino Mega.
At 0 A, the Hall-effect sensor is putting out around 2.54 V, which goes into the non-inverting pin of the AD8422. The sensor is also putting out 2.66 V on its reference port (ideally 2.5 V). So my thinking was, at 0 A,
Vout = 1 × (2.54 − 2.66) + 2.66 = 2.54 V, which it does in the simulation.
The sensor is run on a single 5 V supply, and maps 1.875 V to −100 A and 3.125 V to +100 A. With a gain of 4 (as close as I can get):
Vout @ -100 A = 4(1.875 - 2.5) + 2.5 = 0 V
Vout @ 100 A = 4(3.125 - 2.5) + 2.5 = 5 V
Right now I am just using a G of 1, and after moving the circuit to a breadboard, I'm only getting 1.94 V at 0 A on the output of the AD8422. The output of the AD820 looks fine. I've tried pulling the voltages down and adding separate return paths to ground in case the current sensor didn't have one, but nothing changes.
What am I doing wrong? I'm using SOIC-to-breadboard adapters for both ICs and loose hookup wire I had around the house, but continuity checks out. What can I do to get the error below 25 mV (what I need in order to get 1 A accuracy)?