My project (a fully custom bass/guitar effects processor) consists of an ADAU1452 DSP, an ADAU1361 codec, and an STM32 microcontroller. Given the wide range of possible input signal levels (active vs. passive pickups, direct input from the instrument vs. sitting in the middle of a signal chain, etc.), one of the goals is a calibration mode: the user presses a button and plays their instrument as hard as is reasonable, at which point the microcontroller loads a program that tracks the maximum input level over a period of time and adjusts the gain on the ADAU1361 so that the peak input signal comes out just below 1.0. This will let any DSP effect I write for my board work regardless of the input level (since all inputs will be normalized), and should allow the effects to behave as expected.
The DSP program I'm using is shown below. The microcontroller loads the program, then begins the process by toggling GPIO 2 ("Active") from low to high. This starts the timer and allows the AbsMax node to begin recording the absolute maximum input signal level, which is stored in the Abs Max Value readback node. I have the readback node set up in the ADAU1452's Indirect Address Parameter Table so it has a fixed address I can read from the microcontroller. When the timer elapses, it triggers a GPIO interrupt on the microcontroller, which reads back the Abs Max Value. After reading the value, the microcontroller flips the Active input low, waits for at least two sample periods (2/48000 seconds), and then sets it high again, restarting the process. Several Abs Max readbacks are recorded and averaged together, and the result is then analyzed to determine how the ADAU1361's input gain needs to change to bring that peak value to roughly 1.0. The program itself can be simplified a bit (there's no need for that ABCD node, for example - it is just for testing), but at the moment this is exactly what I'm using:
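For completeness, here's how I'm interpreting the raw bytes that come back from the readback node's indirect parameter address. This assumes the ADAU1452's native 8.24 fixed-point parameter format and big-endian byte order on the bus; the function name is just for illustration.

```c
#include <stdint.h>

/* Sketch: interpret the 4 bytes read back from the Abs Max Value node's
 * indirect parameter address. Assumes the ADAU1452's 8.24 fixed-point
 * format, sent MSB first -- verify against your readback settings. */
static double abs_max_from_bytes(const uint8_t b[4])
{
    int32_t raw = (int32_t)(((uint32_t)b[0] << 24) |
                            ((uint32_t)b[1] << 16) |
                            ((uint32_t)b[2] << 8)  |
                             (uint32_t)b[3]);
    return (double)raw / (double)(1 << 24);  /* 8.24 -> linear value */
}
```

For example, the bytes 0x00 0x80 0x00 0x00 decode to 0.5 (half of full scale).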
Now, this mostly works, but it does have some issues whose cause I'm not sure about, hence this post. If I do the sampling loop process entirely in Sigma Studio (using the Switch node to kick things off and pressing the Read button on the readback node), I get exactly what I expect: for a given input signal level, the abs max value hits a certain maximum and stays there. If I increase the gain, the abs max value increases accordingly. However, when I have the microcontroller play the part of me manually toggling the switch, things behave a bit differently.
Testing with a fixed-amplitude 1kHz sine wave, and despite the timer waiting for a full second before triggering the interrupt, I am reading back values from the DSP that are significantly different each time I sample, resulting in values that look like instantaneous (but still absolute) values rather than abs max hold values. The sampled values rise and fall in what seems to be a sine-like fashion, rather than giving me nearly the same value each time like I'd expect. At 48kHz, there should be more than enough samples to correctly capture a 1kHz sine wave input, no?
Sometimes, the first time I sample the DSP I end up reading back a zero, which makes me think there's likely something wrong with the way I'm doing this activate-wait-interrupt-readback-reset loop. After the microcontroller's interrupt is signaled by the DSP, the max hold value is read, and the Active GPIO is reset, I figured that waiting for at least two DSP sample periods would give the DSP enough time to read the GPIO input and reset everything. This seems to work, but it also feels a bit hacky. My microcontroller is running at 48MHz, and it's easy to set up a timer that triggers after 2000 cycles, but there's still the issue that the DSP and the microcontroller are not synchronized in any way. This technically shouldn't matter if the DSP is reading the GPIO at 48kHz, but I'm not entirely confident this is working as expected.
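The cycle math I'm using, plus one extra sample period of padding to cover the unsynchronized clocks, looks like this (the +1 margin is my own addition, since the Active edge can land anywhere inside a DSP frame):

```c
/* Sketch: settle time in MCU cycles for an N-sample wait, using the
 * clock rates from the post (48 MHz MCU, 48 kHz DSP sample rate).
 * Because the two clocks are unsynchronized, padding with one extra
 * sample period guarantees at least N full DSP frames elapse. */
#define MCU_HZ  48000000u
#define FS_HZ   48000u
#define SETTLE_CYCLES(nsamples)  (((nsamples) + 1u) * (MCU_HZ / FS_HZ))
```

With these numbers, a two-sample wait becomes 3000 cycles instead of the bare-minimum 2000.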
Am I going about this in the correct way? Is there a better way to read back the value I need from the DSP? Why do I get different results when I use Sigma Studio than when I have my microcontroller initiating things and performing the value readback?
TL;DR: The values I get when pressing the Read button in Sigma Studio are different from the ones I get when I read the DSP's memory with my microcontroller. Why?