Hi guys,

I need your help. I designed a digital filter in MATLAB and loaded the transfer function parameters into the ADAU1701 board. The curve simulated in SigmaStudio is right, but when I test the board with spectrum-analysis software, there is a phase shift.

The first figure is the simulation from SigmaStudio, and the second one is the measurement from the spectrum-analysis software.

Is the phase shift due to the latency of the board between the ADC and the DAC? How can I solve this problem and get a good curve? Thanks for your help.

Hello,

The phase lag you're experiencing occurs within the ADAU1701's converters themselves. The ADC includes an anti-aliasing filter, and the DAC has a reconstruction filter. Both are digital filters, which together impose a total delay of just over 1 ms at the standard 48 kHz sample rate. In turn, this adds a frequency-dependent phase lag of:

phase (in degrees) = 360 x T_delay x f

Example: with T_delay = 1.08 ms and f = 1 kHz, the phase lag = 360 x 0.00108 x 1000, or roughly 390 degrees. This directly adds to your filter's phase shift.

To accurately measure your filter's phase lag, you could add a second signal path from the input directly to another DAC, without filtering, to use as a reference. Have your analyzer measure the difference in phase between the two paths.

On the other hand, if your application requires near-zero analog-in to analog-out latency in real time, the ADAU1701 won't work for you. See if the ADAU1772 is suitable -- it features a converter in-out lag of 38 us, but offers only limited functions. This chip finds use in noise-cancelling headphones.

Best regards,

Bob
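
For anyone who wants to check the arithmetic, here is a minimal Python sketch of the phase-lag formula from Bob's reply (the function name is mine, just for illustration):

```python
def phase_lag_degrees(t_delay_s: float, freq_hz: float) -> float:
    """Frequency-dependent phase lag of a pure time delay.

    Implements: phase (degrees) = 360 x T_delay x f
    """
    return 360.0 * t_delay_s * freq_hz

# Worked example from the reply: 1.08 ms converter delay, 1 kHz tone
lag = phase_lag_degrees(1.08e-3, 1000.0)
print(f"{lag:.1f} degrees")  # 388.8, i.e. roughly 390 degrees
```

The delay itself is fixed, so the lag grows linearly with frequency; at 10 kHz the same 1.08 ms delay already amounts to almost 3900 degrees of accumulated phase.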