In production we calibrate the ADXL362 by taking static +/-1 g and 0 g readings on each axis, then run a calibration check to confirm the results are within the expected range for the part. The datasheet specifies a nominal sensitivity of 250 LSB/g for the +/-8 g range, with a sensitivity calibration range of +/-10%. Our per-axis limits are set to 220-280 LSB/g, which is +/-12% of the nominal 250 LSB/g. We are getting some units with sensitivities as low as 215 LSB/g (-14% from nominal). I want to verify whether this is expected, and understand why we might see this behavior.
To get the sensitivity for each axis, we take the +1 g reading minus the -1 g reading and divide by 2.
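For reference, here is a minimal sketch of that per-axis calculation and pass/fail check. The raw count values are hypothetical examples, not data from our units:

```python
NOMINAL = 250        # LSB/g, datasheet typical for the +/-8 g range
LO, HI = 220, 280    # our production limits (+/-12% of nominal)

def sensitivity(pos_1g, neg_1g):
    """Sensitivity in LSB/g from static +1 g and -1 g axis readings."""
    return (pos_1g - neg_1g) / 2

def offset(pos_1g, neg_1g):
    """0 g offset in LSB; should agree with the static 0 g reading."""
    return (pos_1g + neg_1g) / 2

# Hypothetical unit: +248 LSB at +1 g, -252 LSB at -1 g
s = sensitivity(248, -252)   # (248 - (-252)) / 2 = 250.0 LSB/g
print(s, LO <= s <= HI)      # 250.0 True
```

The offset term is the same pair of readings averaged instead of differenced, which is why the two static measurements are enough to separate gain error from zero-g offset.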