I am developing a module based on the ADAU1701 - the circuitry is essentially a simplified version of the evaluation board. The intended sample rate is 96kHz using the internal ADCs and DACs.
However, I have found it impossible to get correct results using the formulae given in the datasheet for calculating the ADC input and current-setting resistors at sample rates higher than 48K. Another user reported the same issue around a year ago in the thread "1701 input noise", but no definitive explanation resulted and I am a little surprised that the subject has not come up again.
The datasheet states that the total value (including 2K of internal resistance) of both the input and current-setting (ADC_RES) resistors should scale inversely with sample rate - i.e. the total resistance halves each time the sample rate doubles. Using the formulae given results in external values of:
@ 48K, ADC_RES resistor = 18K, input resistors = 7K (for unity overall gain)
@ 96K, ADC_RES resistor = 8K, input resistors = 2K5 (for unity overall gain)
@ 192K, ADC_RES resistor = 3K, input resistors = 250R (for unity overall gain)
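For anyone wanting to check my arithmetic, the scaling rule as I read it can be sketched as below. The 48K reference totals (20K for ADC_RES, 9K per input for unity gain) and the 2K internal resistance are my interpretation of the datasheet, so treat the constants as assumptions rather than gospel:

```python
# Sketch of the datasheet scaling rule as I understand it.
# Assumed constants (ohms): 2K internal series resistance, and
# 48K-rate totals of 20K (ADC_RES) and 9K (input, unity gain).
R_INTERNAL = 2_000
ADC_RES_TOTAL_48K = 20_000
R_IN_TOTAL_48K = 9_000

def external_values(fs_hz):
    """Return (external ADC_RES, external input resistor) in ohms,
    scaling the 48 kHz totals by 48000/fs and subtracting the
    internal 2K."""
    scale = 48_000 / fs_hz
    adc_res = ADC_RES_TOTAL_48K * scale - R_INTERNAL
    r_in = R_IN_TOTAL_48K * scale - R_INTERNAL
    return adc_res, r_in

for fs in (48_000, 96_000, 192_000):
    print(fs, external_values(fs))
# reproduces the 18K/7K, 8K/2K5 and 3K/250R values listed above
```

Those computed values match the list above, so I don't believe I have simply misapplied the formulae.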
Adopting those resistor values at 96K (and with the inputs linked directly to the outputs in SigmaStudio) results in a noisy, jittery output with large amounts of both waveform and zero-crossing distortion. Routing a tone generator directly to the DACs in SigmaStudio produced a clean output, thereby absolving the DACs and casting doubt on the ADC setup.
Replacing the fixed resistors with 20K potentiometers, I evaluated the performance with various combinations of resistance and sample rate. The 18K value for the ADC_RES resistor (as implied in the old thread) actually works fine for all sample rates - the variation in overall gain is within 0.2dB for sample rates between 48K and 192K and there seems to be no noticeable change in distortion performance.
As this resistance is reduced, the overall gain falls somewhat, indicating that the full-scale input current is increasing, but if the value is decreased below around 13K (at which point the gain has dropped by around 2dB), the distortion increases massively as previously observed. Again, this behaviour is largely independent of sample rate.
Leaving the ADC_RES resistor as 18K, I then adjusted the input resistor to give unity overall gain from input to output and obtained the (nicely convenient) value of 5K6. This indicates that the datasheet formula for the input resistors may also not be entirely accurate, and double-checking my '1701 evaluation board did indeed show that the gain there is also somewhat lower than predicted by the formula.
So I have ended up with a working system, but have had to ignore the datasheet in order to do so - not a situation which inspires confidence. My questions are:
- can others confirm the behaviour I have observed?
- have I misunderstood the datasheet, or is there genuinely a discrepancy between it and real-life behaviour?
- if there is, can anyone from AD provide any insight into what is actually happening and how to properly optimise ADC performance at all sample rates?
Thanks very much - I hope this at least helps anyone who is struggling with the same issue.