Part selection of a matched transistor pair for a log amp - current range 100 pA
to 10 mA
Is the MAT01 or the MAT02 better for this application? How does the MAT02 behave
at low current levels, and how does its accuracy vary with temperature?
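Some background on the temperature part of the question: in an ideal matched pair, the difference in base-emitter voltages follows delta-V_BE = (kT/q) * ln(I1/I2), so the log amp's scale factor is directly proportional to absolute temperature. A rough sketch of that dependence over the full 100 pA to 10 mA span (ideal Ebers-Moll behaviour assumed, no bulk-resistance or leakage effects):

```python
import math

K_B = 1.380649e-23     # Boltzmann constant, J/K
Q_E = 1.602176634e-19  # electron charge, C

def delta_vbe(i1, i2, temp_k):
    """delta-V_BE of an ideal matched pair carrying currents i1 and i2."""
    return (K_B * temp_k / Q_E) * math.log(i1 / i2)

# The kT/q scale factor tracks absolute temperature: from 0 degC to 70 degC
# it grows by 343/273 - 1, about 26%, so the volts-per-decade gain of the
# log amp drifts unless it is compensated (e.g. with a tempco resistor).
for t_k in (273.15, 298.15, 343.15):
    v = delta_vbe(10e-3, 100e-12, t_k)  # 8 decades: 100 pA to 10 mA
    print(f"{t_k - 273.15:5.1f} degC: delta-V_BE = {v * 1e3:6.1f} mV")
```

This is only the scale-factor drift of an ideal pair; at the 100 pA end, leakage and beta roll-off add further temperature-dependent errors that the datasheet's log-conformance curves cover.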
Let's put the temperature drift of 5 to 10 LSBs in perspective by looking at a
few of the ADuC812 specs (pages 2-4 of the datasheet).
Under the heading CALIBRATED ENDPOINT ERRORS,
the maximum offset error = +/- 5 LSB
the maximum gain error = +/- 6 LSB
These specifications are production tested at 25 deg C with a supply voltage of
+5V. The test is carried out after a software calibration routine of the ADC has
been executed at this temperature and with this power supply.
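To translate those LSB figures into voltage, here is a quick back-of-the-envelope calculation, assuming the 12-bit ADC is used with the on-chip 2.5 V reference (adjust VREF if you use an external reference):

```python
VREF = 2.5    # volts; on-chip reference assumed here
N_BITS = 12   # ADuC812 ADC resolution

lsb = VREF / 2**N_BITS  # one code step
print(f"1 LSB = {lsb * 1e3:.3f} mV")                   # about 0.610 mV
print(f"+/-5 LSB offset = +/-{5 * lsb * 1e3:.2f} mV")  # about 3.05 mV
print(f"+/-6 LSB gain   = +/-{6 * lsb * 1e3:.2f} mV")  # about 3.66 mV
```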
There are typical specifications indicating the offset and gain errors with a
3 V power supply, but there is no indication of how the gain and offset vary
with changes in temperature!
The official advice is that if you want to maintain the guaranteed gain and
offset spec across variations in temperature, you should carry out a software
calibration when the temperature changes significantly. The on-board temperature
sensor is untrimmed and not production tested but you can do a two point
calibration (gain and offset) on this temperature sensor for each unit and do a
periodic check on the temperature. When you detect a significant change in
temperature, you can carry out a software calibration to ensure that the offset
(+/-5 LSB) and gain (+/-6 LSB) errors stay within spec.
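The advice above amounts to a simple supervisory loop. A minimal sketch, where read_temp_c() and recalibrate() are hypothetical stand-ins for your two-point-corrected temperature reading and your ADC software calibration routine:

```python
RECAL_THRESHOLD_C = 5.0  # what counts as "significant"; set per your error budget

def make_monitor(read_temp_c, recalibrate):
    """Return a poll() function that re-runs the ADC calibration whenever the
    temperature has moved more than RECAL_THRESHOLD_C since the last cal."""
    last_cal_temp = read_temp_c()
    recalibrate()  # calibrate once at start-up
    def poll():
        nonlocal last_cal_temp
        t = read_temp_c()
        if abs(t - last_cal_temp) > RECAL_THRESHOLD_C:
            recalibrate()          # re-run the software calibration
            last_cal_temp = t      # remember the temperature we calibrated at
        return t
    return poll
```

Call poll() from your main loop or a timer tick; the threshold is referenced to the temperature at the last calibration, not the previous reading, so slow drift still triggers a recal once it accumulates.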