What I'd like to accomplish is to clock the ADAU1701's on-chip ADCs and DACs at a 96 kHz rate and decimate/interpolate by a factor of 2 so that all the signal processing runs at 48 kHz. The goal is to use the higher ADC and DAC sample rates to reduce latency. However, our algorithm uses 694 DSP instructions and is therefore too complex to run at 96 kHz, and for now we are married to the ADAU1701, which has no decimate/interpolate blocks. Our system bandwidth is under 8 kHz, so simply ignoring every other sample should work fine.
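To convince myself that dropping alternate samples is harmless for our bandwidth, here is a quick stdlib-only sanity check (the constants and variable names are my own, chosen for illustration): a 7 kHz tone sampled at 96 kHz, with every other sample discarded, is sample-for-sample the same signal you would get by sampling it at 48 kHz directly, since 7 kHz is well below the new 24 kHz Nyquist limit.

```python
import math

FS_HI, FS_LO, F_SIG = 96_000, 48_000, 7_000  # rates and test tone in Hz
N = 64  # number of samples to compare at the low rate

# 7 kHz tone sampled at 96 kHz, then decimated by simply taking every other sample
hi = [math.sin(2 * math.pi * F_SIG * n / FS_HI) for n in range(2 * N)]
decimated = hi[::2]

# The same tone sampled natively at 48 kHz
lo = [math.sin(2 * math.pi * F_SIG * n / FS_LO) for n in range(N)]

max_err = max(abs(a - b) for a, b in zip(decimated, lo))
print(max_err < 1e-9)  # True: the two sequences match
```

Of course this only holds because nothing in our signal sits above 24 kHz; with wideband content, skipping samples without an anti-alias filter would fold that energy back into the audio band.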
An ADI document hints at how that can be accomplished:
"For example, if an ADAU1761 was set up to run at 96 kHz and the program shown above was downloaded to it (with 667 instructions), then the effective rate of the system would be cut in half, since only every other sample would be processed by the DSP (with alternating samples being ignored entirely)."
So my question is how this can be done on the ADAU1701 when, for example, the signal processing chain is an ADC followed by a filter and then a DAC. Could I simply set the ADAU1701's ADC and DAC sample rate to 96 kHz by changing the "Program Instructions" setting to 512, and have our DSP program execute its 694 instructions on every other sample?
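The back-of-the-envelope arithmetic behind my question, assuming the usual 49.152 MHz core clock (1024 instructions per sample at 48 kHz; the constant names are my own): at 96 kHz the budget drops to 512 instructions per sample, so our 694-instruction program would span two sample periods, which is what should give the halved effective processing rate the ADI document describes.

```python
CORE_CLOCK_HZ = 49_152_000   # ADAU1701 core clock from a 12.288 MHz MCLK (1024 x 48 kHz)
FS_HZ = 96_000               # target converter sample rate
PROGRAM_INSTRUCTIONS = 694   # our algorithm's length

budget = CORE_CLOCK_HZ // FS_HZ                    # instructions available per sample period
periods = -(-PROGRAM_INSTRUCTIONS // budget)       # ceiling division: periods one pass occupies
effective_fs = FS_HZ / periods                     # rate at which samples actually get processed

print(budget)        # 512
print(periods)       # 2
print(effective_fs)  # 48000.0
```

If this reasoning is right, any program between 513 and 1024 instructions would land on the same every-other-sample behavior at a 96 kHz converter rate.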
If anyone could weigh in on this, it would be appreciated.