
Long delay after a low pass filter ... what are the delay limits?

Question asked by dzingoni on Mar 20, 2013
Latest reply on Mar 21, 2013 by BrettG

Hi everybody,

I'm seeing some strange behavior in a simple workflow where, on an ADAU1701 running at 44.1 kHz, a delay is placed after a low pass filter. Nothing strange in itself; it's part of a more complex application, and I tried to isolate the problem.

Now, what's happening is that I MEASURE the output (I have a tool that displays the impulse response and frequency response in real time), and I see that when the delay is higher than a certain threshold the output actually disappears. I'm attaching a couple of images to explain.

workflow.png

The workflow is simple. I use digital input 1 to feed the data, and monitor it on OUT 0. The tool is an MLS analyzer measuring in real time, and since everything stays in the digital domain the responses are clean and make it easy to understand what's happening. The response of the filter is OK, it's just a low pass; the snapshot doesn't show that the samples are delayed, but we can trust the system on that.
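For anyone not familiar with MLS measurement, here is roughly how the stimulus side works; this is only a sketch in C, and the sequence order and tap positions are illustrative, not necessarily what my analyzer uses:

/* Sketch of an MLS stimulus generator: a 15th-order maximal-length sequence
 * from a Fibonacci LFSR. The order and the x^15 + x^14 + 1 tap positions are
 * illustrative choices, not necessarily what my analyzer uses. */
#include <stdio.h>
#include <stdint.h>

#define MLS_ORDER 15
#define MLS_LEN   ((1u << MLS_ORDER) - 1u)   /* 32767 samples per period */

int main(void)
{
    uint32_t lfsr = 1u;                       /* any non-zero seed works */

    for (uint32_t n = 0; n < MLS_LEN; n++) {
        /* Feedback from taps 15 and 14 (bits 14 and 13 of the register). */
        uint32_t bit = ((lfsr >> 14) ^ (lfsr >> 13)) & 1u;
        lfsr = ((lfsr << 1) | bit) & MLS_LEN; /* MLS_LEN doubles as the 15-bit mask */

        int sample = (lfsr & 1u) ? 1 : -1;    /* bipolar excitation: +1 / -1 */
        if (n < 8)
            printf("%d ", sample);            /* print a few samples as a check */
    }
    printf("...\n");
    return 0;
}

The point is just that the excitation is deterministic and fully digital, so the measured impulse and frequency responses are clean.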

LowDelay.png

Now what I do is change the delay.

As far as I understand, the maximum delay is limited by the amount of memory available to store the delay lines.

In my case, with an ADAU1701, I should have 2048 locations of data RAM. I actually use a maximum of 980 (the maximum delay set in the Delay block). Some more RAM is used by the other blocks, but I should be well within the limits.

This is confirmed by the "Compiler Output.txt" file that is generated by SigmaStudio:

 

################## Summary ########################

  Number of instructions used (out of a possible 1024 ) = 45

  Data RAM used (out of a possible 2048 ) = 1000

  Parameter RAM used (out of a possible 1024 ) = 10

 

So I think I use only 1000 locations out of 2048.
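Just to spell out the arithmetic, here is a trivial C sketch; the 2048-word data RAM size and the 1000 words reported as used come straight from the compiler summary above, the rest is plain conversion:

/* Sanity check of the RAM budget and the delay expressed in time. The
 * 2048-word data RAM size and the 1000 words reported as used come from the
 * compiler summary above; the rest is plain arithmetic. */
#include <stdio.h>

int main(void)
{
    const int data_ram_words    = 2048;     /* ADAU1701 data RAM */
    const int words_reported    = 1000;     /* "Data RAM used" from the compiler */
    const int delay_max_samples = 980;      /* max delay set in the Delay block */
    const double fs             = 44100.0;

    printf("Data RAM headroom: %d words\n", data_ram_words - words_reported);
    printf("Configured max delay: %d samples = %.2f ms at %.0f Hz\n",
           delay_max_samples, 1000.0 * delay_max_samples / fs, fs);
    return 0;
}

So on paper there should be more than 1000 words of headroom, and the 980-sample maximum delay is only about 22 ms at 44.1 kHz.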

 

Now, if I increase the actual delay (in the workflow picture above it is 497), the measured signal disappears.

In the case above I noticed that 497 is the threshold: below 497 everything is OK, from 498 on the signal disappears.

In a more complex workflow that I was using, this threshold was far lower, around 100 samples of delay (despite still having data RAM available; I checked).

 

What I don't understand is what delay limit I can actually get from a 1701. I assumed the limit was the amount of RAM, so a theoretical 2048 minus the locations used by the other blocks, but in practice I see that this is not true.
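Here is what I would have expected, written out as a small C sketch; it assumes one data RAM word per sample of delay and that the other blocks really only need the ~20 words implied by the compiler summary, both of which are assumptions on my part:

/* What I would have expected the limit to be. Assumptions on my part: one
 * data RAM word per sample of delay, and the other blocks only needing the
 * ~20 words implied by the compiler summary (1000 total minus the 980-word
 * delay buffer). */
#include <stdio.h>

int main(void)
{
    const int data_ram_words = 2048;
    const int other_blocks   = 1000 - 980;   /* words not used by the delay line */
    const int expected_max   = data_ram_words - other_blocks;
    const int observed_max   = 497;          /* measured threshold */
    const double fs          = 44100.0;

    printf("Expected max delay: %d samples (%.1f ms)\n",
           expected_max, 1000.0 * expected_max / fs);
    printf("Observed max delay: %d samples (%.1f ms)\n",
           observed_max, 1000.0 * observed_max / fs);
    return 0;
}

So I would expect to be able to go up to roughly 2000 samples (about 46 ms), yet the measured limit is 497 samples (about 11 ms).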

Is there a bug in the compiler when dealing with long delays? How can I estimate the limit?

Can anybody check this behavior (a filter before or after a delay) and confirm it, or find where the fault is?

Thank you.

 

I'm attaching the project in case anybody wants to try it. Modify your I/O as appropriate. One final note: don't trust the "Probe" frequency response, it always looks OK; it's the actual measured behavior that changes. Maybe you can "listen" .... Message was edited by: Daniele Zingoni
