I want to measure the RMS value of a microphone's output voltage (to get the SPL value in dB).
I see that a lot of RMS-to-DC converter ICs seem to do this, but I am wondering:
At a given moment, if I read the output of the RMS chip (on an ADC), over what interval is the RMS (i.e. the averaging/mean part) computed? The last 10 ms? 100 ms? 1 s?
In other words, I want to read the RMS value every second, so I want each reading to be representative of the whole second that just passed, not just the last 10 ms.
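To make concrete what I mean by "representative of the last second", here is the digital equivalent I have in mind: square the samples, average over exactly one second of them, take the square root. (The sample rate and the 1 kHz test tone are just assumptions for illustration.)

```python
import math

def rms(samples):
    """RMS of a block of samples: sqrt(mean(x^2))."""
    return math.sqrt(sum(x * x for x in samples) / len(samples))

fs = 48000  # assumed sample rate (Hz)
f = 1000    # assumed test-tone frequency (Hz)
amplitude = 1.0

# Exactly one second of a sine wave
one_second = [amplitude * math.sin(2 * math.pi * f * n / fs) for n in range(fs)]

print(round(rms(one_second), 3))  # sine RMS = amplitude / sqrt(2) ≈ 0.707
```

The question is whether an analog RMS-to-DC chip gives me something equivalent to this fixed one-second window, or something shorter.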
How is this related to the "settling time" parameter (if it is)?
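From what I've read, these chips don't use a fixed window at all: an external averaging capacitor sets an RC time constant, so the output is an exponentially weighted average of the squared input, and the settling time tells you how long after a level change the reading is close to final. I tried to mimic that digitally with an exponential moving average (the 100 ms time constant is just an assumed value, standing in for whatever C_AV would set):

```python
import math

def ema_rms(samples, fs, tau):
    """Exponential moving average of x^2, then sqrt.

    Mimics an analog RMS converter whose averaging low-pass filter
    has time constant tau (seconds). Assumes tau >> 1/fs.
    """
    alpha = 1.0 / (fs * tau)  # per-sample smoothing factor
    mean_square = 0.0
    trace = []
    for x in samples:
        mean_square += alpha * (x * x - mean_square)
        trace.append(math.sqrt(mean_square))
    return trace

fs = 48000
tau = 0.1  # assumed 100 ms averaging time constant

# Step input: 1 s of silence, then 1 s of a 1 kHz tone.
sig = [0.0] * fs + [math.sin(2 * math.pi * 1000 * n / fs) for n in range(fs)]
trace = ema_rms(sig, fs, tau)

# During the silence the reading stays at zero; one second (10 tau)
# after the tone starts, it has settled close to 1/sqrt(2).
print(round(trace[fs - 1], 3), round(trace[-1], 3))
```

If that mental model is right, then "the last second" is never a hard window; I'd instead pick a time constant (via the capacitor) large enough that one second covers many time constants, and accept that the reading weights recent samples more heavily. Is that correct?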
I haven't chosen a specific part yet; I want to grasp what's going on before picking one.
But if it makes it easier, let's say I'll go with the AD737: http://www.analog.com/en/products/linear-products/rms-to-dc-converters/ad737.html