ADXL355 sample rate tolerance

I recently completed a 23-day study comparing the ADXL355 to a very high-end accelerometer from another company.  The ADXL performed very well in terms of magnitude and reliability.  In order to match the other accelerometer we had to adjust the set sample rate of the unit by +1.2%, which seemed very high.  I looked through the spec for a tolerance on the internal clock, but I didn't see anything I could convert to an allowable tolerance on the output data rate.  The rate was very consistent over the test, so I don't think there is internal drift.  I am curious what the tolerance of the frequency setting is.

  • Thank you for your post.  I am sorry for the delay in our response.  We are on holiday this week, but I can say that I was able to find some original design documents, which projected a tolerance of +/-2%, with a temperature variation of less than 3% over -40°C to +125°C.  I have not been able to find any long-term drift data for this parameter, but 1.2% sounds fairly high, assuming the 23 days were under "normal use" conditions rather than an accelerated aging profile. 

     - When you return, can you let us know if you have any additional information on this parameter? Thank you! 

  • Thanks for the reply.  The 23 days were continuous use in a temperature- and humidity-controlled environment.  Over the test there was no sample rate drift; the offset was constant at roughly +1.2% of the set Fs.
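
A quick numeric check of that constant offset against the +/-2% projection quoted above, assuming a 125 Hz nominal ODR purely for illustration:

```python
# Compare the observed ODR offset with the projected +/-2% clock tolerance.
nominal_odr_hz = 125.0    # nominal ODR, assumed here for illustration
observed_offset = 0.012   # +1.2% offset reported over the 23-day test
tolerance = 0.02          # +/-2% initial tolerance from the design documents

observed_odr_hz = nominal_odr_hz * (1 + observed_offset)   # 126.5 Hz
low = nominal_odr_hz * (1 - tolerance)                      # 122.5 Hz
high = nominal_odr_hz * (1 + tolerance)                     # 127.5 Hz

print(f"observed ODR: {observed_odr_hz:.1f} Hz")
print(f"allowed range at +/-2%: {low:.1f} to {high:.1f} Hz")
print("within projected tolerance:", low <= observed_odr_hz <= high)
```

So a constant +1.2% offset sits inside the projected +/-2% band, though toward the high side of it.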

  • Was this deviation from the nominal ODR constant over the sampling period you tested?

  • I believe the answer to your question is yes.  The desired rates were selected via the lower 4 bits of register 0x28 (datasheet page 38).  The two sensors didn't offer the same selectable sample rates, so in order to line them up in time, say 120 vs 125 Hz, we needed to add ~1.2% to the ADXL: instead of 125 Hz, the effective rate would be 126.5 Hz.  After making that slight change all the peaks lined up in the time domain.  We expected some delta from nominal; we're just curious what the expected tolerance is.
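
For reference, a minimal sketch of what that register selection might look like over SPI, assuming a Linux spidev connection (bus and chip-select numbers are placeholders; 0x05 is the ODR_LPF code for 125 Hz):

```python
# Minimal sketch: set the ADXL355 ODR via the ODR_LPF bits [3:0] of the
# FILTER register (0x28). Assumes a Linux spidev connection; bus/device
# numbers and wiring are placeholders for an actual setup.
import spidev

FILTER_REG = 0x28
ODR_LPF_125_HZ = 0x05        # 0b0101 -> 125 Hz ODR; HPF_CORNER bits [6:4] left at 0 (off)

spi = spidev.SpiDev()
spi.open(0, 0)               # bus 0, chip select 0 (adjust for your wiring)
spi.max_speed_hz = 1_000_000
spi.mode = 0                 # ADXL355 uses SPI mode 0 (CPOL=0, CPHA=0)

# ADXL355 SPI framing: 7-bit register address in bits [7:1], R/W in bit 0
# (0 = write), followed by the data byte.
spi.xfer2([(FILTER_REG << 1) | 0x0, ODR_LPF_125_HZ])
spi.close()
```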

  • Thanks for your reply. Those internal MEMS oscillators are generally not very accurate; I have seen this before. So you have to measure the actual ODR against a more accurate clock on your MCU platform and resample the record (e.g. with a Lanczos kernel) during postprocessing in order to get something more standard like 100 Hz. Otherwise you need to drive the ADXL from an external clock, which seems a bit complicated.
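
As a rough sketch of that postprocessing step, here is a polyphase FIR resampler (scipy's resample_poly, standing in for the Lanczos kernel mentioned above) applied to a record whose measured ODR came out at 126.5 Hz instead of the nominal 125 Hz; the data array is a placeholder:

```python
# Resample a record captured at the measured ODR back onto the nominal rate.
# resample_poly is a windowed-sinc polyphase FIR resampler, used here in
# place of the Lanczos kernel mentioned above; rates are the example
# numbers from this thread.
from fractions import Fraction

import numpy as np
from scipy.signal import resample_poly

measured_odr_hz = 126.5      # actual ODR measured against the MCU clock
target_odr_hz = 125.0        # nominal rate the record should end up on

# Express the rate ratio as a small integer up/down pair for resample_poly.
ratio = Fraction(target_odr_hz / measured_odr_hz).limit_denominator(1000)
up, down = ratio.numerator, ratio.denominator   # 250 / 253 here

x = np.random.randn(10_000)  # placeholder for the recorded acceleration data
y = resample_poly(x, up, down)

print(f"resampled {len(x)} samples at {measured_odr_hz} Hz "
      f"to {len(y)} samples at {measured_odr_hz * up / down:.3f} Hz")
```

Measuring the actual ODR first (e.g. by timestamping the DRDY pin against the MCU clock) matters more than the choice of kernel; once the true rate is known, any good resampler will put the record back onto a standard grid.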