I'm using the AD5933 in a circuit very similar to the EVAL-AD5933, where the excitation signal is re-biased and applied to the unknown impedance.
I've been collecting data for some time now, and I've noticed an oscillation (around 0.45 Hz) in the impedance codes (Real and Imaginary) when the absolute value of either is very low. This is significant because it noticeably affects the phase readings, and it also affects the impedance magnitude measurement, though to a lesser extent.
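For context, magnitude and phase are derived from the two DFT register codes in the usual way (sqrt of the sum of squares, and atan2). A minimal sketch of that conversion, with hypothetical code values, shows why a small oscillation on a near-zero code moves the phase so much:

```python
import math

def codes_to_polar(real_code: int, imag_code: int) -> tuple[float, float]:
    """Convert signed Real/Imaginary register codes from the AD5933 DFT
    into raw magnitude and phase (degrees)."""
    magnitude = math.hypot(real_code, imag_code)
    phase_deg = math.degrees(math.atan2(imag_code, real_code))
    return magnitude, phase_deg

# Hypothetical codes: with a system phase near 87 deg almost all of the
# signal sits in the imaginary code, so the real code is close to zero
# and any oscillation on it shifts atan2() noticeably while barely
# changing the magnitude.
mag, phase = codes_to_polar(150, 2860)
```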
The oscillation is noisiest when the number of settling time cycles is zero. For the measurements shown here, a resistor is measured at a constant frequency of 1000 Hz: the frequency increment feature is used to take 200 readings, with the delta frequency set to 0 Hz so the excitation frequency stays constant. The effect is most prominent when incrementing through the sweep to collect many data points (e.g. 200).
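The constant-frequency sweep above can be sketched with the datasheet's register-code formula, code = f / (MCLK/4) * 2^27 (assuming the internal oscillator; the exact register writes are omitted here):

```python
MCLK_HZ = 16_776_000  # assumption: AD5933 internal oscillator

def freq_to_code(f_hz: float) -> int:
    """AD5933 datasheet formula: code = f / (MCLK/4) * 2**27."""
    return round(f_hz / (MCLK_HZ / 4) * 2**27)

start_code = freq_to_code(1000.0)  # start frequency: 1000 Hz
delta_code = freq_to_code(0.0)     # increment: 0 Hz -> frequency never moves
num_increments = 200               # 200 repeated points at the same 1000 Hz
```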
The graphs above show the code stored in the Real register when a purely resistive impedance of 130 kOhm is measured (the Real code is close to zero because the system phase is about 87 deg). The x-axis represents time in milliseconds. As the number of settling delay cycles increases, the measurement points become progressively further apart (as expected), so not all 200 points are displayed in this graph.
The graph above shows all 200 data points measured with the settling time delay cycles set to 32 (130 kOhm resistor).
A similar oscillation is seen on the EVAL-AD5933 board as well, shown below. I'm unable to determine the frequency of this oscillation because the data logged by the EVAL software has no timestamps.
My aim is to build a very accurate capacitance measuring circuit, and so far I've achieved a repeatability of ±1.5 pF. If this oscillation could somehow be accounted for, the repeatability could be improved further.
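To show the scale involved, here is a rough sketch (illustrative numbers only, assuming an ideal-capacitor model C = 1/(2*pi*f*|Z|)) of how a fractional oscillation on the magnitude code maps directly to a capacitance error of the order of the repeatability figure above:

```python
import math

def cap_from_impedance(mag_ohms: float, f_hz: float) -> float:
    """Ideal capacitor model: |Z| = 1/(2*pi*f*C)  ->  C = 1/(2*pi*f*|Z|)."""
    return 1.0 / (2 * math.pi * f_hz * mag_ohms)

# Hypothetical example: at 1 kHz a 1 nF capacitor presents ~159 kOhm.
f = 1000.0
c_nominal = cap_from_impedance(159_155.0, f)              # ~1 nF
# A 0.15 % ripple on the measured magnitude becomes a ~0.15 % (~1.5 pF)
# ripple on C, i.e. the same order as the repeatability limit above.
c_perturbed = cap_from_impedance(159_155.0 * 1.0015, f)
delta_pf = abs(c_perturbed - c_nominal) * 1e12
```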
Is anyone from ADI aware of this behavior?