I already read this note: http://ez.analog.com/message/20550
But I still have some questions about the temperature sensor of the AD7195.
For the one-point calibration, I understand I should do the following:
Do a conversion (temperature channel) at ambient temperature (Ta). This conversion result is Cr (real conversion).
Ta leads to a nominal conversion value that is calculated using the nominal sensitivity of 2815 codes/°C.
Cn = (Ta + 273) * 2815 + 0x800000.
I should now use a new real sensitivity (Sr) to convert the conversion codes to °C.
Sr = 2815 * (Cr - 0x800000)/(Cn - 0x800000)
Question: Is my understanding of the procedure correct?
My second question: when is this calibration required? At each start-up of the ADC? Or is it OK to do this calibration once in the factory during product calibration (the product being an ECU using the AD7195)?
Another question: how can we know what Ta the sensor actually measures during calibration? Since the sensor is deep inside the chip, possibly near a warm source, the temperature it measures can be very different from the temperature next to the chip. I think it also depends on the time since start-up: right after the AD7195 starts, the die is still close to the surrounding temperature, but after 2 hours of operation that is no longer the case.
Do you have tables showing how the difference between the internal and external temperature evolves over time? Or an estimate of this difference at runtime?
One last, easier question: is the output rate of the temperature conversion the one selected by bits FS9 to FS0 of the mode register?
I know the gain is switched to 1 and the internal reference is used, but I think the output rate parameter stays the same as the one used for the input channels, right?
And what is the conversion time? Is it Tsettle each time?
Thanks a lot