I'm currently designing an audio system driven by an S/PDIF receiver. The receiver also generates MCLK, which is always 256×fs (e.g. 12.288 MHz at 48 kHz, 24.576 MHz at 96 kHz, ...). The ADAU is driven by this clock, and I've set the PLL mode pins accordingly (Mode0: 0, Mode1: 1).
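For reference, here is a trivial sketch of the 256×fs relationship between the sample rate and the MCLK the receiver generates:

```python
# MCLK is always 256 * fs in this setup (S/PDIF receiver as clock master)
for fs in (48_000, 96_000, 192_000):
    print(f"fs = {fs} Hz -> MCLK = {256 * fs / 1e6} MHz")
    # -> 12.288 MHz, 24.576 MHz, 49.152 MHz respectively
```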
Now to my question: when the sample rate of the S/PDIF signal changes, I always assumed that I have to change not only the filter coefficients but also the SR bits in the DSP core control register (0x081C) - e.g. set them to 00 for a 48 kHz signal and to 01 for a 96 kHz signal. But after some testing it seems I can leave the SR bits at 00 and the DSP still works fine (and my filter curves are also correct after updating the coefficients). Changing the bits to 01 doubled the corner frequency of my filters.
So why don't I have to change the SR bits?
I've read all the posts related to sample rates, but it seems I still haven't grasped the concept of the whole clock generation. Any help here would be greatly appreciated.