Hello dear friends.
I am working with an Analog Devices ADPD4101. I needed to improve the accuracy of the sample rate: with a 10 ms sample period configured, sampling actually occurs every 11 ms (and the error varies from device to device). At this stage the board is already manufactured, so there is no way to connect a GPIO to an external clock source.

I implemented the LOW FREQUENCY CALIBRATION mechanism described on page 22 of the datasheet, using a delay of 1,000,000 ticks. At the end of the calibration process I read the value 910,000 from the STAMP_L register (0x011). To improve accuracy I use the following equation: calib_value = 0x2B2 * (1000000.0 / 910000.0), which equals 0x2F6, where 0x2B2 is the default value of the OSC1M register (0x000B). After writing the updated value, the period improved from 11 ms to 10.22 ms, which is still not good enough.

I also found that writing a predefined value (0x309) to the OSC1M register brings the period down to 10.02 ms on one particular device, but that value is not suitable for another device. This is because the register's influence on the period is not linear (found out by testing).
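For reference, the single-step correction described above can be sketched as follows (the numeric values are the ones from my measurements; the actual register reads and writes over the bus are omitted):

```python
# Single-step OSC1M trim correction for the ADPD4101 low-frequency
# oscillator, using the values measured in this post.

OSC1M_DEFAULT = 0x2B2       # default OSC1M (0x000B) register value = 690
EXPECTED_TICKS = 1_000_000  # programmed calibration delay, in 1 MHz ticks
measured_ticks = 910_000    # timestamp read back from STAMP_L (0x011)

# Scale the trim so the measured interval matches the expected one:
#   new_trim = old_trim * expected / measured
calib_value = round(OSC1M_DEFAULT * EXPECTED_TICKS / measured_ticks)

print(hex(calib_value))  # -> 0x2f6, the value quoted above
```

This is the linear proportionality assumption; as noted, it only got the period from 11 ms down to 10.22 ms, which suggests the trim-to-frequency relationship is not a simple proportion.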
The question is: which equation should be used to achieve the best accuracy across devices?
Best regards, Genadi.
[edited by: emassa at 7:44 PM (GMT -5) on 17 Nov 2022]