I am using two WM8804 S/PDIF transceivers interfaced to the 1452's Serin2 and Serin3 ports and routed through the ASRCs. The input ASRCs are configured to take their signal from their respective serial-data input pins and output to the DSP core at Fs = 48 kHz. Once inside the core, the signal is routed to one of the other I2S outputs.
Using a standalone consumer S/PDIF converter with analog in and 48 kHz S/PDIF out, I can go into the 8804 and through my DSP with no problem.
However, using the S/PDIF out from a Sony BD player playing a standard audio CD, I get glitchy noise riding on top of my signal.
Examining the I2S signal out of the 8804, triggered from the 48 kHz LRCLK, I notice that the standalone converter is operating at 24-bit, but the BD player's S/PDIF out appears to be operating at 48 kHz/16-bit. I have made sure the settings on the player are set to 2-channel PCM stereo, not Dolby or DTS. (It seems odd that the S/PDIF out for a CD source is 48 kHz; I presume they use an ASRC to standardize their S/PDIF output, but I can't find anything published on this.)
Cirrus has given me some assurance that I have the 8804's receiver configured correctly and that its I2S output is as it should be.
Nonetheless, we would like our product to be able to handle this situation. So, in trying to sort out whether I have a clocking issue or a word-length issue, here are my questions.
How does the 1452 handle a 16-bit I2S input when its serial port is configured for 24-bit? Do I need to detect the word length from the transceiver and modify the serial-data register settings to accommodate changing word lengths, or should the DSP just handle it?
Appreciate any help you can offer.