I am measuring the output of two pressure sensors (0-5 V output, divided down to ~0-1 V using a 2k/10k resistor divider) on ADC3 through the primary ADC and ADC4 through the auxiliary ADC, both in single-ended (SE) mode referenced to ADC5. Both ADCs use a 2.048 V precision external reference, with the high reference bit set. I am getting significantly different results from the two converters: with an input voltage of ~550 mV on both ADC3 and ADC4, the output code from the primary ADC is ~6845000, while the auxiliary ADC gives ~11698000. Why are these so different?
Here is a code snippet for the ADC set up and conversion:
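I can't paste the exact firmware, but the setup boils down to something like the sketch below. Note that the register names, addresses, and bit positions here are placeholders for illustration only, not the real register map, and the SPI write is stubbed out:

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical register map -- placeholders, not from any datasheet. */
#define REG_CONFIG      0x01u
#define REG_CH_SELECT   0x02u

#define CFG_SE_MODE     (1u << 0)  /* single-ended input mode */
#define CFG_REF_HI      (1u << 1)  /* high reference range bit */

/* Stub for the real SPI register write; prints instead of touching hardware. */
static void spi_write_reg(uint8_t reg, uint32_t value)
{
    printf("write reg 0x%02X = 0x%06X\n", reg, value);
}

/* CONFIG word: SE mode with the high reference bit set. */
uint32_t adc_config_word(void)
{
    return CFG_SE_MODE | CFG_REF_HI;
}

/* Configure one converter and select the input channel (e.g. ADC3 or ADC4,
 * referenced to ADC5), after which a conversion is started and the 24-bit
 * result is read back over SPI. */
void adc_setup(uint8_t channel)
{
    spi_write_reg(REG_CONFIG, adc_config_word());
    spi_write_reg(REG_CH_SELECT, channel);
}
```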
Could you post your schematic? Both ADC results look incorrect.
Could you also tell me what common-mode voltage you are using?
Just to add - your code setup looks fine.
I do not have an easy way of posting the schematic, but to describe it (and this may be my problem): ADC5 is tied to analog ground, and ADC3 and ADC4 each have a 2k resistor to ground and a 10k resistor to the sensor output.
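To spell out the divider math: with 10k in series from the sensor and 2k to ground, the ADC pin sees one sixth of the sensor output, so a 0-5 V sensor swing maps to roughly 0-0.83 V at the pin, and ~550 mV at the pin corresponds to ~3.3 V from the sensor:

```c
/* Attenuation of the 10k (series) / 2k (to ground) divider:
 * Vadc = Vsensor * 2k / (10k + 2k) = Vsensor / 6. */
double divider_out(double vsensor)
{
    const double r_top = 10000.0;    /* sensor output to ADC pin */
    const double r_bottom = 2000.0;  /* ADC pin to analog ground */
    return vsensor * r_bottom / (r_top + r_bottom);
}
```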
I noticed in the data sheet that, for the auxiliary ADC in SE mode, ADC5 has to be biased above 0.1 V. I can try that, but does that imply I cannot measure input voltages below 0.1 V? Is the same true for the primary ADC?
Thanks for the help!
Quickly draw a picture in Paint, take a screenshot of your diagram, or scan a hand drawing.
Attached are the schematics of our prototype. The analog reference voltage is supplied by an ADR440 at 2.048 V, and ADC5 is grounded on the detector board.
If this turns out to be a hardware issue, can you give some guidance on the correct way to implement single-ended inputs on the primary and auxiliary ADCs?