I'm attempting to use one ADC of the ADuCM360 to sample three input channels in a continuous sequence, and the conversion time I'm measuring does not match my calculations; I'd be very grateful for some guidance.
The ADC setup is as follows:
Chop enabled, RAVG2 disabled, SINC4 disabled, AF = 0, NOTCH2 enabled, SF = 16
By my calculation, that equates to fADC = 152.63Hz, tSETTLING = 13.36ms, and a time of 6.55ms (1/fADC) for one conversion.
ADC usage is as follows, where N = 3:
1) Set up the channel, then set up a DMA read of N samples with the ADC in continuous conversion mode. (My assumption is that the settling time spans two sample periods, so I gather three samples, discard the first two as falling within the settling period, and keep the third.)
2) On the DMA read-complete interrupt, select the next channel and return to step 1, and so on.
To measure the conversion time, I set a port pin high when the ADC conversion is started and low in the DMA read-complete interrupt. Using this method, I measured the elapsed time with N set to 1, 2 and 3:
N = 1, 13.2ms (Expected time for one conversion, 6.55ms)
N = 2, 19.6ms (Expected time for two conversions, 13.10ms)
N = 3, 26.4ms (Expected time for three conversions, 19.65ms)
What is happening here? In the absence of any further information, my plan is to try single conversion mode and see whether the results are the same, but that seems like more effort, as I'd have to gather the three samples for each channel manually, one at a time, unless I've misunderstood something fundamental about how the settling time is accounted for.