I have an ADXL001-70Z eval board and added a 15k/10n (≈1 kHz) RC low-pass filter. I supplied the board from a noise-free 3.3 V power supply. I glued the eval board onto another vibration sensor (407860 - Heavy Duty Vibration Meter), and together they were glued onto a vibration source, an unbalanced bench grinder. This way I could compare the ADXL001 eval board against the other vibration meter. The eval board output (XOUT) was connected to an oscilloscope. I could adjust the rotation speed between 65 and 200 Hz (6500-12000 rpm) and thereby the vibration level. The sensitivity of the ADXL001-70 at 3.3 V is 16 mV/g, so on the scope I measured mVrms and divided by 16 to get g_rms. At low speeds (up to 3 g_rms) my g_rms readings were 6 to 15% higher than those of the other vibration meter, which also gives its result in g_rms. At higher speeds (above 3 g_rms), the error grew from +20% to +80%. I don't know exactly how the other vibration meter calculates the g values, but according to its spec it measures between 10-1000 Hz, up to 20 g, up to 200 mm/s, up to 2 mm displacement, and it calculates the g_rms from samples collected over 1 second.
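To make the arithmetic I'm using explicit, here is a minimal sketch of the two calculations involved: the cutoff of the 15k/10n RC filter and the mVrms-to-g_rms conversion at the 16 mV/g sensitivity. The 48 mVrms reading is just an example value, not one of my actual measurements.

```python
import math

# Cutoff of the RC low-pass filter on XOUT (15 kOhm / 10 nF)
R = 15e3   # ohms
C = 10e-9  # farads
f_c = 1 / (2 * math.pi * R * C)  # about 1061 Hz, i.e. roughly the "1 kHz" filter

# ADXL001-70 sensitivity at a 3.3 V supply (from the datasheet)
SENSITIVITY_MV_PER_G = 16.0

def mvrms_to_grms(mv_rms: float) -> float:
    """Scale an RMS voltage reading from the scope (mV) to acceleration (g rms)."""
    return mv_rms / SENSITIVITY_MV_PER_G

print(round(f_c))           # 1061
print(mvrms_to_grms(48.0))  # 3.0 (example: 48 mVrms -> 3 g rms)
```

This assumes the scope reading is AC-coupled (or has the DC offset at half-supply already removed), since the RMS of the raw output would otherwise be dominated by the 0 g bias voltage.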
- Is it correct to take the mVrms reading from the oscilloscope and divide it by the 16 mV/g sensitivity to get the g_rms value?
- What do you think of my test setup?
- Which sensor is more likely to give wrong values: the ADXL001 eval board (simply an ADXL001 plus RC network) or the other vibration meter (which performs some unknown internal calculations and shows the g_rms on its display)?
- Can you advise on other affordable sensors to compare the ADXL001-70 against?