I'm trying to reconcile some documentation on the AD9361 temperature sensor.
The AD936x Temperature Sensor app note gives an example with an offset (Register 0x00B) = 0 and a temperature word (assuming this comes from Register 0x00E) = 61 decimal. The calculated temperature varies; Example 2 gives a measurement of 46.5 deg C (worst case).
The AD9361 driver sets the offset to 0xCE, which is -50 decimal as a signed byte. With that offset applied, the temperature word from the above example would read 11 decimal instead of 61. The ad9361_get_temp function divides that value by 1.14 and reports the result as the temperature: 11 / 1.14 ≈ 9.65 deg C.
The AUXADC CODE vs. Temperature graph in UG-570 provides a third interpretation. With 0x00B = 0 and an AUXADC CODE of 61, the temperature from the chart would be -3 deg C. Is the AUXADC CODE in the chart the value read from 0x00E?
Which of these is the correct interpretation of a temperature word of 61 with an offset of 0? (These discrepancies are larger than the uncertainty in the temperature sensor, so I'm obviously missing something.)
Is ad9361_get_temp intended to return the die (junction) temperature, the case temperature, or the ambient temperature?