I've designed a galvanostat (it measures the voltage drop vs. time across a sample while applying a constant current), and the final ADC conversion shows some strange behaviour. Instead of a continuous variation in the measured voltage, it shows a ramp-like (staircase) conversion: the reading stays constant for a few samples, returning the exact same 16-bit value, and then jumps to the next voltage value. This behaviour becomes more visible as the rate of change of the voltage decreases.
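To illustrate what I mean by the staircase, here is a minimal sketch (not my actual firmware; the full-scale voltage and step sizes are made up) of how a slowly changing input produces runs of identical codes whenever the per-sample change is smaller than the ADC's effective step:

```python
# Hypothetical illustration of the symptom: a slow ramp sampled by a
# 16-bit ADC. When the voltage changes by less than one effective LSB
# per sample, consecutive conversions return the same code, then jump.

FS = 2.5              # assumed full-scale voltage (illustrative)
LSB = FS / 2**16      # ideal 16-bit step, ~38 uV

def adc_code(v, step=LSB):
    """Quantize voltage v with the given effective step size."""
    return int(v / step)

# The input changes by only ~5 uV per sample, i.e. slower than one LSB,
# so several consecutive samples map to the same 16-bit code.
codes = [adc_code(0.5 + n * 5e-6) for n in range(20)]

# Collapse into (code, run_length) pairs to show the stairs.
runs = []
for c in codes:
    if runs and runs[-1][0] == c:
        runs[-1][1] += 1
    else:
        runs.append([c, 1])
print(runs)  # each code repeats for several samples -> staircase
```

This is exactly the shape I see in my plots: flat runs of one value, then a discrete jump to the next.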
I tried swapping the ICs in the circuit, changing the experimental conditions, and many other things, but the problem seems to lie in the way the ADC reads the data.
I was using a multimeter to read the input voltage and, strangely enough, while the multimeter was connected this behaviour disappeared and the conversion went as expected, although a bit of noise was added. I suppose the multimeter's presence somehow stabilized the ADC reading, but I wasn't able to recreate that effect on my own.
Here are some images of my circuit and the curves I got from several experiments run under identical conditions. The one in black was measured with the multimeter connected. Note that the other three show the exact same "stairs" at the same voltage values.
Can anyone please give me a hint as to what may be happening?