We are having a technical problem with the AD7173-8. For some reason, when using the sync option, readings seem to be taking 34 µs longer than the data sheet suggests. Do you have a technical resource who may be able to answer the question below? A logic analyzer output screenshot is attached.
Using an output rate of 15625 SPS would be ideal, since its conversion time (assuming conversion time is equivalent to settling time) is the slowest that works at a 4 kHz logging rate after taking ADC clock accuracy and clock jitter into account (±12.36% at 4 kHz). The datasheet states a conversion time of 193 µs, but we measure closer to 227 µs, about 34 µs longer, which makes that rate too slow for us. The 31250 SPS rate is fast enough for our process despite an actual conversion time of about 195 µs instead of 161 µs, but of course we'd like to avoid the additional noise if at all possible.
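For clarity, here is a minimal sketch of the timing-budget arithmetic behind our rate choice. The numbers are the ones quoted above; the ±12.36% figure is our own worst-case estimate of clock accuracy plus jitter at 4 kHz, not a datasheet value.

```python
# Timing budget: does a conversion fit in one 4 kHz logging period
# after derating for worst-case clock accuracy and jitter?

LOG_PERIOD_US = 250.0     # 4 kHz logging rate -> 250 us per sample
JITTER_FRACTION = 0.1236  # our +/-12.36% worst-case estimate at 4 kHz

def fits_budget(conversion_time_us: float) -> bool:
    """True if a conversion completes inside the derated window."""
    usable_window_us = LOG_PERIOD_US * (1.0 - JITTER_FRACTION)  # ~219.1 us
    return conversion_time_us <= usable_window_us

print(fits_budget(193.0))  # datasheet time at 15625 SPS -> True (would fit)
print(fits_budget(227.0))  # observed time at 15625 SPS  -> False (too slow)
print(fits_budget(195.0))  # observed time at 31250 SPS  -> True (fits)
```

This is why the datasheet 15625 SPS figure would have worked, but the observed time does not.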
Attached is a screenshot of a 4 kHz logic analyzer capture using the fastest ADC output rate (31250 SPS). The conversion time discrepancy (~34 µs) is the same for the few fastest rates.
We’re wondering what’s causing the additional conversion time beyond what the datasheet specifies.
For example, when using an output rate of 31250 SPS with the sinc5 + sinc1 filter, one channel enabled, and single-cycle settling enabled, the datasheet specifies the settling time (i.e., what we believe should be the conversion time) as 161 µs, but with synchronization we observe about 195 µs. This is shown in the attached screenshot by measuring the time between the low-to-high transition of the SYNC pin (telling the ADC to start a conversion) and the high-to-low transition of the DOUT/RDY pin (conversion complete). Is this the correct way to measure conversion time?
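In case it helps to see exactly how we reduce the capture, here is a sketch of the edge-pairing we do on the exported timestamps: each SYNC rising edge is matched with the next DOUT/RDY falling edge. The edge lists below are illustrative values, not data from the actual capture.

```python
# Pair each SYNC rising edge with the next DOUT/RDY falling edge
# and report the elapsed time as the conversion time.

def conversion_times(sync_rising_us, rdy_falling_us):
    """For each SYNC rising edge, time to the next RDY falling edge."""
    times = []
    rdy_iter = iter(sorted(rdy_falling_us))
    next_rdy = next(rdy_iter, None)
    for t_sync in sorted(sync_rising_us):
        # Skip RDY edges that happened at or before this SYNC edge.
        while next_rdy is not None and next_rdy <= t_sync:
            next_rdy = next(rdy_iter, None)
        if next_rdy is not None:
            times.append(next_rdy - t_sync)
    return times

# Illustrative edges at a 250 us cadence, each conversion ~195 us:
sync = [0.0, 250.0, 500.0]
rdy = [195.0, 445.0, 695.0]
print(conversion_times(sync, rdy))  # -> [195.0, 195.0, 195.0]
```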
Disabling synchronization seems to bring the conversion time down to the datasheet figures, but then we lose control over when measurements are taken. We could use the 31250(/6211) SPS rate to compensate, but we first want to understand what’s causing the delay before assuming we need that output rate.
To sum up: what would be causing the conversion time overhead when synchronization is used? Is there a way to programmatically determine the additional time based on the output rate?
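Until the source of the delay is understood, the only model we have is empirical: observed time = datasheet settling time + a roughly constant overhead. The sketch below captures that assumption using only the two rates we have actually measured; it is a placeholder for whatever formula your team can provide.

```python
# Empirical model of conversion time under synchronization:
# datasheet settling time plus the ~34 us overhead we observe.
# Only the two output rates we have measured are tabulated.

DATASHEET_SETTLING_US = {31250: 161.0, 15625: 193.0}  # sinc5+sinc1, 1 ch, single-cycle
MEASURED_OVERHEAD_US = {31250: 34.0, 15625: 34.0}     # our observations

def expected_conversion_us(odr_sps: int) -> float:
    """Datasheet settling time plus empirically observed sync overhead."""
    return DATASHEET_SETTLING_US[odr_sps] + MEASURED_OVERHEAD_US[odr_sps]

print(expected_conversion_us(31250))  # -> 195.0
print(expected_conversion_us(15625))  # -> 227.0
```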
Thanks and Best Regards,
-Tim Starr on behalf of KH@MT