
AD7730 internal calibration problem

Question asked by drdailey on May 15, 2014
Latest reply on May 27, 2014 by JohnnyG

I am using the AD7730 bridge transducer ADC as the excitation/measurement device for a 100 Ohm RTD sensor application. I am following the reference design published in Analog Dialogue 34-5 (2000), "Transducer/Sensor Excitation and Measurement Techniques" by Albert O'Grady, and (for now) I am using the exact schematic depicted in Figure 9.

 

http://www.analog.com/library/analogDialogue/archives/34-05/sensor/index.html

 

After some wrestling with the device setup, I have the circuit working much as the article describes.  However, I am running into an issue with the internal self-calibration.

 

My device registers are configured for AC excitation, the low-voltage Vref setting (nominal 2.5 V), and a unipolar 20 mV input range. I am also using the internal offset DAC, set to subtract 10 offset counts (-9 mV).

 

On start-up, my code sets up the FILTER, DAC, and MODE registers as described above, and then initiates internal calibration. As per the datasheet, I run an internal full-scale calibration on the 80 mV input range, then an internal zero-scale calibration on the 20 mV input range (which is used for subsequent measurements). After calibration, I set the chip to do continuous conversions on the analog input.
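
Here is roughly what that start-up sequence looks like in C. The spi_write_mode() and wait_for_rdy() helpers are simplified stand-ins for my actual SPI driver calls, and the MD/RN codes are my reading of the MODE register table in the datasheet (MD2..MD0 in bits 15..13, RN1..RN0 in bits 5..4), so please check them against your datasheet revision:

    #include <stdint.h>

    #define MD_CONT_CONV   0x1   /* 001: continuous conversion        */
    #define MD_INT_ZS_CAL  0x4   /* 100: internal zero-scale cal      */
    #define MD_INT_FS_CAL  0x5   /* 101: internal full-scale cal      */

    #define RN_20MV        0x1   /* 01: 20 mV input range             */
    #define RN_80MV        0x3   /* 11: 80 mV input range             */

    extern void spi_write_mode(uint8_t md, uint8_t rn); /* write MODE register  */
    extern void wait_for_rdy(void);                     /* wait for RDY to fall */

    void ad7730_start(void)
    {
        /* FILTER and DAC registers are written before this point */

        spi_write_mode(MD_INT_FS_CAL, RN_80MV);  /* full-scale cal on 80 mV range */
        wait_for_rdy();                          /* RDY signals cal complete      */

        spi_write_mode(MD_INT_ZS_CAL, RN_20MV);  /* zero-scale cal on 20 mV range */
        wait_for_rdy();

        spi_write_mode(MD_CONT_CONV, RN_20MV);   /* start continuous conversions  */
    }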

 

My equation to convert the ADC result into the RTD resistance is:

Rrtd = Rref * (N / FS / G + D / 2000)

where Rref = 18k (the resistor that generates the Vref input), N = the 24-bit AD7730 conversion result, FS = the full-scale count = 2^24, G = the PGA gain, and D = the offset DAC magnitude (10 counts = 9 mV).
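
In C, that conversion is just the following (the function and constant names are mine; the numbers are as above, with the gain assumption explained next):

    #include <stdint.h>

    #define RREF   18000.0     /* resistor that generates the Vref input, ohms */
    #define FS     16777216.0  /* full-scale count, 2^24                       */
    #define G_PGA  125.0       /* assumed PGA gain (see below)                 */
    #define D_DAC  10.0        /* offset DAC counts subtracted (9 mV)          */

    /* Convert a 24-bit AD7730 result n to RTD resistance in ohms. */
    double rtd_resistance(uint32_t n)
    {
        return RREF * ((double)n / FS / G_PGA + D_DAC / 2000.0);
    }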

 

The AD7730 datasheet does not state exactly what the gain of the PGA in its analog front end is, but I assume the gain is G = Vref(nominal) / input range = 2.5 / 0.020 = 125.

 

Using the above setup and an external decade box in place of the RTD, the measured values are quite linear, but they follow a slope corresponding to a PGA gain (G) of about 136 rather than the expected 125. However, if I reset the part and skip the internal calibration routines, my results do indeed follow the expected slope, with a PGA gain of 125.

 

So, somehow running the internal span calibration is changing my slope by roughly 9% (136 vs. 125). My numbers without the calibration actually look pretty good, although there is still an unexpected offset in the results.

 

Has anyone run into an issue like this? Could the fact that the actual Vref is 1.8 V instead of the nominal 2.5 V affect the span calibration? I would expect the internal calibration to simply divide down the Vref input and adjust the output span to correct the gain, so the actual value of Vref shouldn't matter. In any case, since this reference design comes from an Analog Devices publication, I am hoping someone can shed some light on how this circuit should be configured and calibrated.

 

Thanks,

Doug
