
About ADAU1452 IC to Detect Input Signal Distortion

I want to know whether the signal is distorted after it is input into the ADAU1452 DSP. How can I detect this?

  • Hello,

    Could you please tell us a bit more, such as the signal type, where the input signal is coming from, and what external processing is applied?

    Regards,

    Harish

  • Hi:

    The type of signal is an analog signal, which is first converted into a digital signal by the ADC, and then input to the ADAU1452.

    Regards,

  • Hi,

    Your question seems a bit general. Could you tell us more, such as what the actual signal is? Is it a sine wave or a broadband music signal? What type of distortion are you looking to detect? Something like clipping from the ADC?

    Regards,

    Harish

  • Hi:

    The input signal is a sine wave, and we can consider the signal distorted when the sine wave is clipped.

    Regards

  • Hello chenxuezong509

    This problem is actually difficult to solve. It is very difficult for arbitrary audio or music; a sine wave helps, but it is still difficult.

    I put together a project to show a couple of ways to do it and the limitations. 

    You can probably get around this by using a microcontroller to filter out false positives and look for clipping that goes on for a longer period of time. 

    Clipping is when the signal stays at the same level for a number of samples. The problem is that the higher the frequency, the fewer samples there are during one period of the sine wave, so the fewer samples could actually be at the same level. At a 48 kHz sample rate, a 3 kHz signal has only 16 samples to describe the wave, so a short clip may last only one or two samples.
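    To see why high frequencies are hard, here is a trivial sketch of the samples-per-period arithmetic above (the function name is just for illustration):

    ```python
    def samples_per_period(fs_hz: float, f_hz: float) -> float:
        # Number of samples available to describe one period of a sine wave
        # at sample rate fs_hz. Fewer samples per period means fewer samples
        # that can sit at the clipped level.
        return fs_hz / f_hz

    print(samples_per_period(48000, 3000))  # 16.0 samples per period
    print(samples_per_period(48000, 850))   # roughly 56 samples per period
    ```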

    So I made one project that takes the differences between a number of consecutive samples. When the sum of all these differences equals zero, the signal is considered clipped. I also put in a test requiring a signal level of at least -50 dB, so that the noise floor, or mute, would not trigger it.

    The block will output a "1" when there is clipping so I bring that output to a pulse counter that will count how long it is high. 

    This method works only up to about an 850 Hz sine wave when all 8 samples are summed. I put in mutes to allow using fewer samples. If you engage all of the mutes in the project, it uses only three samples, and the usable frequency goes up to around 2900 Hz. Anything above this is difficult with this small amount of clipping; the clipped region is just too few samples.
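    The actual project is a SigmaStudio schematic, but the logic of this first method can be sketched in Python (a minimal sketch, assuming an 8-sample window and a -50 dB level gate as described; function names are illustrative):

    ```python
    def detect_clipping(samples, window=8, level_db=-50.0):
        """Per-window flags: True where the signal is flat (clipped) across
        `window` consecutive samples AND above the level gate, mimicking the
        sum-of-differences approach. The gate keeps silence or the noise
        floor from triggering it."""
        gate = 10 ** (level_db / 20)  # -50 dB -> linear amplitude
        flags = []
        for i in range(len(samples) - window + 1):
            win = samples[i:i + window]
            diff_sum = sum(abs(win[j + 1] - win[j]) for j in range(window - 1))
            loud_enough = abs(win[0]) >= gate
            flags.append(diff_sum <= 1e-9 and loud_enough)
        return flags

    def longest_run(flags):
        # The "pulse counter" stage: how many consecutive samples the
        # detector output stays high.
        best = run = 0
        for f in flags:
            run = run + 1 if f else 0
            best = max(best, run)
        return best
    ```

    A silent input produces no flags (the level gate blocks it), while a signal held flat at, say, 0.95 for longer than the window raises a run of flags whose length the counter reports.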

    The second method is simply a level threshold, useful if you know exactly where the maximum or clipping level of the ADC will be. You can set the threshold to that level or just below it. This block also outputs a "1" when the signal goes above the level, and I then count the number of samples it stays high.
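    The second method is even simpler to sketch (again an illustrative Python equivalent of the SigmaStudio block, assuming a clip level of 0.9 as used in the project):

    ```python
    def threshold_detector(samples, clip_level=0.9):
        """Return per-sample flags (1 = at or above the assumed ADC clip
        level) and the longest run of consecutive flagged samples, i.e. the
        pulse-counter output."""
        flags = [1 if abs(x) >= clip_level else 0 for x in samples]
        best = run = 0
        for f in flags:
            run = run + 1 if f else 0
            best = max(best, run)
        return flags, best
    ```

    Unlike the difference-sum method, this one is frequency-independent, but it only works when the exact clipping level is known in advance.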

    In this project I have a sine wave going to a clipper set to clip the signal at a level of 0.9, so a full-scale sine wave that swings to +/- 1 is clipped at 0.9.

    I hope this helps. It is an interesting problem. 

    Dave T

    ADAU1452 Clipping Detector.dspproj