During my work on this post, I ran into the need for an oversampling block in SigmaStudio. The problem arises when you model a non-linear static transfer function inside the DSP, for example with a LUT, something like this:
When you pass your audio signal through this LUT, new harmonics are added to the signal, so you risk running into aliasing problems. My workaround is to limit the bandwidth of the signal before the LUT, but I think this is not always possible.
A better solution would be to run this LUT at an oversampled rate. This is commonly implemented in code with a for loop that evaluates the function N times per input sample on an interpolated signal, then filters and decimates back to the original rate. Obviously this requires more DSP instructions.
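To make the idea concrete, here is a minimal sketch of that for-loop approach, written in Python for readability rather than as DSP firmware. Everything here is illustrative: the 4x factor, the 256-entry tanh-shaped LUT, the linear-interpolation upsampler, and the plain averaging used as a (crude) decimation filter are all assumptions, not anything from SigmaStudio itself. A real implementation would use a proper anti-aliasing lowpass before decimating.

```python
import math

N = 4           # assumed oversampling factor
LUT_SIZE = 256  # assumed LUT length

# Hypothetical static nonlinearity: a tanh-shaped LUT over the input range [-1, 1]
lut = [math.tanh(3.0 * (2.0 * i / (LUT_SIZE - 1) - 1.0)) for i in range(LUT_SIZE)]

def lut_lookup(x):
    """Map x in [-1, 1] to a LUT entry, with linear interpolation between entries."""
    x = max(-1.0, min(1.0, x))
    pos = (x + 1.0) * 0.5 * (LUT_SIZE - 1)
    i = int(pos)
    if i >= LUT_SIZE - 1:
        return lut[-1]
    frac = pos - i
    return lut[i] + frac * (lut[i + 1] - lut[i])

def process_oversampled(samples):
    """For each input sample: upsample by N via linear interpolation from the
    previous sample, run the nonlinearity on each of the N sub-samples, then
    average the N results as a crude decimation filter back to the input rate."""
    out = []
    prev = 0.0
    for x in samples:
        acc = 0.0
        for k in range(1, N + 1):
            xi = prev + (x - prev) * k / N  # interpolated sub-sample
            acc += lut_lookup(xi)
        out.append(acc / N)
        prev = x
    return out
```

The inner `for` loop is exactly the "N function evaluations per sample" cost mentioned above; with N = 4 the LUT work quadruples, which is why the instruction budget matters on a fixed-cycle DSP.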
What do you think?