related question: http://ez.analog.com/message/138883
So I am doing shock testing using an ADXL001-500 and a DAQ (links to data sheets below), and I get the feeling that something is not going quite right along the way. The idea is, for each drop of an object, to get the maximum G force it experiences. I am operating at a 5V supply, and the accelerometer zero-g offset voltage is about 2.32V +/- 0.04V (measured while it sits stationary on a table). On any given drop I take the minimum and maximum voltage recorded, take the absolute value of (min/max minus the 2.32V offset), and then convert the larger of the two from a voltage into Gs by dividing by the accelerometer sensitivity (in mV/g). Once finished I have a max G number that doesn't seem to be quite right.
To give an example, say on one drop I see a max V of 4.4719V and a minimum of 1.6184V.
For this I would use the max V, giving a difference from the offset of 2.1479V.
The data sheet for the ADXL001-500 lists the sensitivity at 5V as 3.3mV/g = 0.0033V/g.
To convert the max reading into Gs I would do 2.1479/0.0033 = 650.8788 g.
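For reference, here is the calculation I'm doing, as a small Python sketch (the offset and sensitivity values are the ones quoted above; the function name is just mine):

```python
# Sketch of the per-drop conversion described above.
V_OFFSET = 2.32       # zero-g offset voltage measured at rest, in volts
SENSITIVITY = 0.0033  # ADXL001-500 sensitivity at a 5 V supply, in V/g

def peak_g(v_min, v_max, offset=V_OFFSET, sensitivity=SENSITIVITY):
    """Convert the min/max voltages from one drop into a peak-g figure.

    Takes whichever extreme deviates further from the zero-g offset
    and divides that deviation by the sensitivity.
    """
    deviation = max(abs(v_max - offset), abs(v_min - offset))
    return deviation / sensitivity

print(peak_g(1.6184, 4.4719))  # over 650 g from the example drop
```

(With the offset taken as exactly 2.32 V this gives about 652 g; the small difference from my 650.88 figure is just where the offset lands inside its +/-0.04 V spread.)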
This makes me scratch my head, since this is only supposed to be a +/-500g accelerometer.
Any ideas as to why this might be happening or what I should do to fix this? I'm still relatively new to all this. Thanks for any and all help.
links to datasheets: