I have a question regarding the power supply of the instrumentation amplifier AD8422. I'd like to power the device with ±12 V generated by a DC/DC converter. Due to the DC/DC conversion, the power supply contains voltage spikes at a frequency of 300 kHz (Figure 2). As a result of this disturbance on the supply, the DC input voltage (approx. 700 mV) of the in-amp, which should be amplified by a factor of 10, shows the same disturbance pattern at the output (Figure 3). So my question is: is it normal for this kind of disturbance to have such a severe effect on the output voltage of the amplifier?
Description of the Schematic:
The goal of the schematic in Figure 1 is to amplify the input voltage, which is proportional to temperature and ranges from 500 mV to 700 mV, by a factor of 10. I'd like to achieve an accuracy of U < 100 µV at the output. The input signal is differential and has a maximum frequency of approx. 25 kHz. The source of the input voltage has low impedance.
Thank you for your help!