I've tried a function with an ADAU1446 and an AD1938 as ADC/DAC in 48 kHz / 24-bit mode. The idea is to test the integrity of a signal from output to input.
Output 10 is in blue: it's the main output.
Output 11 is in yellow: it's an image of the tested output-to-input signal. Input 1 is wired directly from output 10.
Output 20 is in green: it's an image of the time-adjusted signal used for comparison.
Finally, the GPIO is in purple; it toggles from 1 to 0 when the source is around 1800 Hz (but it moves sometimes...).
The source signal sweeps from 20 Hz to 20 kHz.
At first I thought the two signals would be identical, apart from an obvious time shift from output conversion to input conversion, measured at around 1.2 ms (at 48 kHz: 57.6 samples, i.e. 99.31% of 58 samples). But I don't understand the dimming of the green signal. Is the sampling rate not high enough? I thought the delay function was only buffering samples?
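For reference, a quick sanity check of that arithmetic (a minimal Python sketch using the numbers measured above):

```python
fs = 48000          # sample rate in Hz
t_delay = 1.2e-3    # measured output-to-input delay in seconds

delay_samples = t_delay * fs   # about 57.6 samples
frac = delay_samples % 1       # fractional part, about 0.6 of a sample

print(f"{delay_samples:.1f} samples ({delay_samples / 58:.2%} of 58 samples)")
```

So the measured delay is 57 whole samples plus roughly 0.6 of a sample, which is why a whole-sample delay block alone can't match it exactly.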
Is there another function besides "signal detection" to test the difference? Because the 2 s (minimum) delay before reinit is loooooooong.
thank you for your answers !
After a few more tests, I saw that the plain delay function, unlike the fractional delay function, works as I expected. According to the help file, the fractional delay function delays by "fractions of a sample period via linear interpolation", and linear interpolation between two adjacent samples acts as a mild low-pass filter: the shorter the period of the source signal relative to the sample period, the more the interpolated output is attenuated, which would explain the dimming as the sweep goes up (?)
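To illustrate that attenuation: a one-tap linear-interpolation fractional delay computes y[n] = (1-mu)*x[n] + mu*x[n-1], whose magnitude response rolls off towards Nyquist (worst around mu = 0.5). A minimal Python sketch, assuming the ~0.6-sample fractional part from the 57.6-sample delay measured above (the function name is mine, not a SigmaStudio cell):

```python
import numpy as np

fs = 48000.0
mu = 0.6  # assumed fractional part of the 57.6-sample delay

def lin_interp_gain(f, mu, fs):
    """Magnitude response of y[n] = (1-mu)*x[n] + mu*x[n-1] at frequency f."""
    w = 2 * np.pi * f / fs
    return abs((1 - mu) + mu * np.exp(-1j * w))

for f in (100, 1000, 10000, 20000):
    print(f"{f:>6.0f} Hz : {20 * np.log10(lin_interp_gain(f, mu, fs)):6.2f} dB")
```

At 100 Hz the gain is essentially 0 dB, but near 20 kHz it drops to roughly -10 dB, consistent with the green trace fading as the sweep rises while the whole-sample delay stays flat.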
I will probably use the 'simple' delay function, even though it's not as accurate as I'd like.
Answered and Closed