Q
For an automotive application I just need a confirmation.
The "0 g Voltage at Xout,Yout" in the "Zero g BIAS LEVEL" specifications on the
ADXL322 data sheet, have a value from 1.3V to 1,7V.
Is this value also the maximum possible difference between zero g Xout and zero g
Yout, or does the "0 g Voltage at Xout, Yout" range apply to zero g Xout and zero
g Yout independently?
In the second case, the maximum difference at zero g between Xout and Yout would
only be the "0 g Voltage at Xout, Yout" plus the "Initial 0 g Bias Deviation from
Ideal". Is that correct?
A
The two axes are independent (and vary from part to part).
Our test method checks the min/max output voltage over all temperatures, so it
encompasses the initial zero g error as well as the temperature coefficient.
The zero g bias level is the output voltage of the ADXL322 when there is no
acceleration (or gravity) acting upon the axis of sensitivity.
The output offset is the difference between the actual zero g bias level and
(VS/2).
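As a quick illustration of that definition (the supply and measured values here
are assumptions, not data sheet numbers; I am assuming VS = 3 V, the supply at
which the ADXL322 is specified):

    /* Hypothetical example: output offset from a measured zero g bias level. */
    #include <stdio.h>

    int main(void)
    {
        const double vs       = 3.0;   /* supply voltage, assumed 3 V        */
        const double zero_g_v = 1.56;  /* measured 0 g output, made-up value */

        /* output offset = actual zero g bias level - VS/2 */
        double offset = zero_g_v - vs / 2.0;
        printf("output offset = %.3f V\n", offset);  /* prints 0.060 V */
        return 0;
    }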
The sensitivity temperature drift is the change in the gradient of the transfer
function over temperature.
This is like a gain error in a linear amplifier. It is independent of offset.
An offset will not change the sensitivity drift, because they are two separate
quantities being measured.
If the 0 g output at 25 °C is

    output = offset

then the 0 g output at 85 °C is

    output = offset + (offset drift x (85 - 25))
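For example, with made-up numbers (a 1.56 V zero g output at 25 °C and a
0.5 mV/°C offset drift, neither taken from the data sheet):

    output at 85 °C = 1.56 V + (0.0005 V/°C x (85 - 25) °C) = 1.59 V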
The full scale output (at a 5 g input) at 25 °C will be:

    output = 5 g x (sensitivity + sensitivity error) + offset

The full scale output at 85 °C will be:

    output = 5 g x (sensitivity + sensitivity error) x (1 + sensitivity drift x (85 - 25))
             + offset + (offset drift x (85 - 25))
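A minimal sketch of these two formulas in C. Every parameter value below is made
up for illustration, except the 420 mV/g nominal sensitivity (the ADXL322
typical at VS = 3 V); I use a 2 g input since the ADXL322 is a ±2 g part:

    /* Sketch of the output formulas above. All values are hypothetical
       except the 420 mV/g nominal sensitivity. */
    #include <stdio.h>

    static double predicted_output(double accel_g,      /* applied acceleration, g        */
                                   double temp_c,       /* temperature, degC              */
                                   double sensitivity,  /* nominal sensitivity, V/g       */
                                   double sens_error,   /* initial sensitivity error, V/g */
                                   double sens_drift,   /* fractional drift per degC      */
                                   double offset,       /* 0 g output at 25 degC, V       */
                                   double offset_drift) /* offset drift, V per degC       */
    {
        double dt = temp_c - 25.0;  /* deviation from the 25 degC reference */
        return accel_g * (sensitivity + sens_error) * (1.0 + sens_drift * dt)
               + offset + offset_drift * dt;
    }

    int main(void)
    {
        /* 2 g input, 420 mV/g nominal, and made-up error/drift numbers */
        double v25 = predicted_output(2.0, 25.0, 0.420, 0.004, 0.0001, 1.56, 0.0005);
        double v85 = predicted_output(2.0, 85.0, 0.420, 0.004, 0.0001, 1.56, 0.0005);
        printf("output at 25 degC: %.3f V\n", v25);  /* 2.408 V */
        printf("output at 85 degC: %.3f V\n", v85);  /* 2.443 V */
        return 0;
    }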
The sensitivity drift of 0.01% per °C is much smaller than the offset drift over
temperature, so by compensating for the offset temperature drift you take care of
the dominant temperature error source.
If you calibrate out the offset and sensitivity errors at 25 °C and then
implement an offset temperature compensation scheme, you will have eliminated
the dominant error sources.
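A sketch of what such a scheme could look like in C, assuming a one-time bench
calibration at 25 °C and a temperature reading available with each sample (all
names and values below are hypothetical, not from the data sheet):

    /* Hypothetical offset-temperature-compensation sketch. The calibration
       values would come from a one-time 25 degC bench measurement plus a
       temperature sweep for the offset tempco. */
    #include <stdio.h>

    typedef struct {
        double offset_25c;    /* measured 0 g output at 25 degC, V    */
        double sensitivity;   /* measured sensitivity at 25 degC, V/g */
        double offset_tempco; /* measured offset drift, V per degC    */
    } axis_cal_t;

    /* Convert a raw output voltage to acceleration in g, removing the
       calibrated offset and its temperature drift. */
    static double compensated_accel_g(double vout, double temp_c,
                                      const axis_cal_t *cal)
    {
        double offset = cal->offset_25c + cal->offset_tempco * (temp_c - 25.0);
        return (vout - offset) / cal->sensitivity;
    }

    int main(void)
    {
        axis_cal_t x_cal = { 1.56, 0.424, 0.0005 };  /* made-up bench values */
        /* 2.443 V at 85 degC (from the sketch above) maps back to ~2.01 g;
           the small residual is the uncompensated sensitivity drift. */
        printf("accel = %.3f g\n", compensated_accel_g(2.443, 85.0, &x_cal));
        return 0;
    }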