I am using an AD5664R as a voltage reference to measure the resistance change of an array of resistive sensors.
The basic idea is a modified Wheatstone bridge: instead of using a potentiometer to manually balance the bridge, I use the AD5664R to generate a very accurate reference voltage to balance it. The DAC output connects to the inverting input of an AD623 instrumentation amplifier.
On the sensor array side, I am using an ADG731 MUX to switch between sensors: the common terminal (pin 43, D) connects to a current source (LT3092) and to the non-inverting input of the AD623. S1–S32 connect to the sensors (resistors), and the other end of each sensor connects to ground.
The sensors change their resistance when pressure is applied to them. The MCU opens one ADG731 channel at a time (so only one sensor is powered), then sweeps the DAC output from 0 to 3.3 V; when the MCU's internal ADC "sees" 1.5 V at the AD623's output, it stops sweeping, stores the current DAC code as the "calibrated value", and then opens the next MUX channel and repeats. (I apply 1.5 V to pin 5, the reference pin, of the AD623.)
After all sensors are calibrated, the calibrated values are loaded back into the DAC and I expect a 1.5 V reading from the ADC. My MCU code does the following:
1. open one of ADG731's channel;
2. load calibrated value to AD5664R;
3. delay x microseconds;
4. read ADC.
5. go back to step 1, open the next channel, and repeat until all 32 sensors have been read.
The problem: when x is below 100, all 32 sensor readings are either 1 or 1010 (my ADC is 10-bit), but when x is around 500 I get decent readings, around 500 counts. This 500 microsecond delay really limits my sampling rate. What could be the problem? Why do I have to wait so long? The sensors may have a capacitive component.