The ADXL354 datasheet specifies the 0 g offset vs. temperature as ±0.1 mg/°C typical and ±0.15 mg/°C maximum, so the typical offset shift between 25 °C and 125 °C should be 100 °C × 0.1 mg/°C = 10 mg.
However, in the characterization figure of the datasheet you provided, the maximum XYZ offset shift is only about 3.12 mg, which would imply an actual typical coefficient of 3.12 mg / (125 °C − 25 °C) ≈ 0.031 mg/°C. Why is that, and is my reading not correct?
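To make the two numbers I am comparing explicit, here is a minimal sketch of both calculations (Python; the variable names are my own, and the 0.1 mg/°C and 3.12 mg figures are the values I read from the spec table and the plot, not confirmed by ADI):

```python
# Spec-based estimate: typical 0 g offset vs. temperature coefficient
# from the ADXL354 datasheet table (assumed to be 0.1 mg/°C typical).
tempco_typ_mg_per_c = 0.1           # mg/°C, typical (assumed)
delta_t_c = 125.0 - 25.0            # temperature span in °C

drift_from_spec_mg = tempco_typ_mg_per_c * delta_t_c
print(f"Spec-based drift over {delta_t_c:.0f} °C: {drift_from_spec_mg:.1f} mg")   # 10.0 mg

# Plot-based estimate: working backwards from the ~3.12 mg maximum XYZ
# offset shift I read off the datasheet's characterization figure.
max_offset_shift_mg = 3.12          # mg, read from the figure (assumed)
tempco_from_plot = max_offset_shift_mg / delta_t_c
print(f"Coefficient implied by the plot: {tempco_from_plot:.4f} mg/°C")           # ~0.0312 mg/°C
```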
Could you explain how the typical, maximum, and minimum values are derived from the experimental data? Thank you, and have a nice day.
Datasheet: www.analog.com/.../adxl354_adxl355.pdf