I loaded the AD-supplied 2G 100Hz hi-res hex file and ran the program with the device flat on the desk. This gave Z readings around 220.
Then I loaded the AD-supplied 4G 100Hz 10-bit hex file and ran the program with the device flat on the desk. I ran it in one position, then turned the eval board over and ran it again. I got readings in the ±270 range.
If the 2G file was really using 10 bits, then the 220 readings seem reasonable at 3.9 mg/LSB, though not very accurate (I guess - I don't really know what a few tenths off a full G feels like, even with bare feet).
But 270 seems very odd for the 4G 100Hz 10-bit file. It seems like the count should be half of what the 2G range gives. So clearly I don't understand the meaning of the numbers coming out of the data logger. For -4G to +4G over 10 bits: 8/1024 = 7.8 mg/LSB. Don't you just multiply the data-log output by this scale factor to get the measured G?
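For reference, here's the conversion I'm assuming, as a quick Python sketch (the scale factors are just full span divided by 2^bits, as above; I'm assuming the logger output is a plain signed count):

```python
def counts_to_g(raw_counts, range_g, bits):
    """Convert a signed raw reading to g, assuming the output spans
    -range_g..+range_g over 2**bits counts."""
    scale = (2 * range_g) / (2 ** bits)  # e.g. 8 g / 1024 = 7.8 mg/LSB
    return raw_counts * scale

# 2G, 10-bit: 3.9 mg/LSB, so my ~220 reading would be ~0.86 g
print(counts_to_g(220, range_g=2, bits=10))
# 4G, 10-bit: 7.8 mg/LSB, so my ~270 reading would be ~2.1 g (??)
print(counts_to_g(270, range_g=4, bits=10))
```

If that math is right, the 4G file is telling me the board feels over 2 g sitting still on the desk, which is why I think I'm misreading the output format.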