For a peak compressor block, I'm using a scope to measure the actual release times for different values entered into the Decay (dB/s) parameter.
Compressor attenuation is set to -20 dB. Up to decay values of about 400 dB/s, the total release times measured on the scope match the expected value of 20 dB divided by the decay rate (e.g. about 50 ms at 400 dB/s). However, for Decay settings greater than about 400 dB/s, I see no further decrease in total release time.
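For reference, here is a minimal sketch of the relationship I expect to hold (the helper name is mine, not from any vendor library): total release time is simply the attenuation being released divided by the decay rate.

```python
def expected_release_time_ms(attenuation_db: float, decay_db_per_s: float) -> float:
    """Time (in ms) to release attenuation_db at a linear dB-domain decay rate."""
    return attenuation_db / decay_db_per_s * 1000.0

# Expected release times for 20 dB of attenuation at various decay settings:
for decay in (100, 200, 400, 800, 1600):
    print(f"{decay:5d} dB/s -> {expected_release_time_ms(20.0, decay):6.1f} ms")
# At 400 dB/s this gives 50 ms, matching what I measure; above 400 dB/s
# the measured time stops tracking this formula.
```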
The capture window does show that different values are being written for each decay setting.
Any thoughts on why this is occurring?