I am using the signal detection block in the ADAU1761, but I cannot find any information in the SigmaStudio help, the engineering forum, or the Wiki pages about the actual formula used to generate the time constant value sent to the DSP.
On the ADAU1452 it is a simple matter of multiplying the time_const value by the sample rate and sending that number of samples to the appropriate detector register.
However, on the ADAU1761 it is not clear what algorithm is being used; the 1452 method does not map to the numbers that show up in the capture window. I have made several attempts to back-calculate the formula, but I am missing some key assumption about how the value is computed. The Wiki shows no actual formula and does not list the different methods used by the different Sigma devices. I have had to resort to a lookup table built from capture-window values, but that is not very practical for providing user control.
For example, on the 1452, if I enter 2 sec in the detection block, the capture window shows 0x17700 (2 × 48,000 = 96,000, which converts to 0x17700... easy enough).
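For reference, the 1452 arithmetic above can be sketched in a couple of lines of Python (fs = 48,000 Hz assumed, as in the example):

```python
FS = 48000  # sample rate in Hz, assumed from the example above

def adau1452_tc(seconds):
    """ADAU1452: the detector time constant is stored as a raw sample count."""
    return int(round(seconds * FS))

# 2 s * 48,000 = 96,000 samples
print(hex(adau1452_tc(2)))  # -> 0x17700
```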
On the 1761, entering a value of 2 sec produces 0x7FFE1B (8,388,123 decimal, or 0.9999422 when interpreted as 5.23 format), which is clearly different from the 1452.
One observation is that the 5.23-formatted number gets much closer to 1.00000 as the time value approaches its maximum of 200 sec, so maybe that is a clue and it is some kind of percentage of the max, but the math doesn't quite work out.
There is a forum entry that lists this issue as answered:
but alas, it appears only to have satisfied the questioner and provides no actual formula.
Can someone please provide the actual equation that's used in the SigmaStudio app to generate these TimeConstant numbers? Or provide me a link to where I might find it?
This empirical formula is exact at both 2 and 200 seconds, though it runs slightly high in between:

Param = 1 - 1 / ((9397 × T) - 1498)
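As a sanity check, my empirical fit can be coded up and compared against the capture-window value. To be clear, this is my own curve fit, not ADI's actual formula; the constants 9397 and 1498 are just the fitted values, and the 2^23 scaling assumes the 5.23 fixed-point interpretation described above:

```python
def adau1761_tc_param(seconds):
    """Approximate the 5.23 coefficient SigmaStudio writes for a T-second
    detector time constant on the ADAU1761 (empirical fit, not official)."""
    param = 1.0 - 1.0 / (9397.0 * seconds - 1498.0)
    # Convert the fractional value to 5.23 fixed point (23 fractional bits).
    return int(round(param * 2**23))

# Should reproduce the captured value for T = 2 s
print(hex(adau1761_tc_param(2)))  # -> 0x7fe1b... expect 0x7FFE1B
```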