After reading UG-364, I was inspired to use the AD5933 only at test frequencies for which the chip samples an integer number of signal periods.
I.e., with the internal 16.776 MHz system clock, the lowest such test frequency is 1024 Hz, at which the 1024 samples from the 1 MSPS ADC span exactly one signal period.
In the attached plot, I measure over roughly 1 kHz to 10 kHz on a 1000 Ω and a 2000 Ω resistor and show the raw DFT magnitude normalized by the resistor value. As the plot shows, integer-period sampling makes good sense (the green and red curves coalesce); however, for a single period the result is erroneous. Can anyone explain this? The conclusion from the experiment seems to be that sampling an integer number (>2) of periods gives a good calibration, and that if non-integer-period sampling is used, it is important to have many signal periods (say >10) in the time window sampled by the chip.
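The leakage behaviour for non-integer periods can be reproduced with a small NumPy sketch. This is only my own model of a single-bin DFT correlated at the test frequency (not the actual AD5933 firmware), and the window length of 1024 samples and the choice of a quarter-period offset are illustrative assumptions:

```python
import numpy as np

def dft_mag(k, n=1024, phase=0.7):
    """Magnitude of a single-bin DFT at the test frequency, for a
    window holding k signal periods (simple model of the chip's DFT)."""
    w = 2 * np.pi * k / n          # normalized angular test frequency
    t = np.arange(n)
    x = np.sin(w * t + phase)      # measured sinusoid, arbitrary phase
    re = np.sum(x * np.cos(w * t))
    im = np.sum(x * np.sin(w * t))
    return np.hypot(re, im)

ideal = 1024 / 2                   # exact magnitude for integer k
for k in (2.25, 4.25, 10.25, 40.25):
    err = abs(dft_mag(k) - ideal) / ideal
    print(f"{k:6.2f} periods: leakage error {100 * err:.2f} %")
```

In this model an integer number of periods gives exactly n/2 independent of phase, while a fractional period count produces an error that shrinks roughly in proportion to the number of periods in the window, which is consistent with the "many periods if non-integer" conclusion above.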
Another question: my mathematical analysis suggests it should suffice to sample an integer number of half-periods, but that is evidently not the case in the experiment. Any comment from anyone who has studied this would be very welcome.
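For what it's worth, one candidate explanation I can offer from the same single-bin DFT model: over a half-integer number of periods the sine/cosine correlations of a pure tone do work out exactly (agreeing with the half-period analysis), but a DC bias on the sampled signal no longer averages out and leaks into the sine correlation. The AD5933 receive stage is biased around mid-supply, so some residual DC in the ADC record is an assumption worth checking; the numbers below are from the model, not the chip:

```python
import numpy as np

def single_bin_dft(k, dc, n=1024, amp=1.0, phase=0.0):
    """k periods of a sinusoid plus a DC bias in an n-sample window,
    correlated against cos/sin at the test frequency -- a simple
    model of the chip's single-bin DFT, not the actual firmware."""
    w = 2 * np.pi * k / n
    t = np.arange(n)
    x = dc + amp * np.sin(w * t + phase)
    re = np.sum(x * np.cos(w * t))
    im = np.sum(x * np.sin(w * t))
    return np.hypot(re, im)

print(single_bin_dft(4.5, dc=0.0))  # half-integer periods, no DC: exactly 512
print(single_bin_dft(4.5, dc=0.5))  # half-integer periods with DC: ~548, ~7 % off
print(single_bin_dft(4.0, dc=0.5))  # integer periods: DC drops out, 512 again
```

So in this model the half-period scheme is exact only for a perfectly DC-free record, whereas full periods reject the DC term as well, which would match the experimental observation.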