I am using the 16-bit differential ADC AD7687 in my project and I have trouble reading out correct values. The ADC is powered by a 5 V reference voltage, which is not optimal, but I have no other choice. The I/O voltage for the connected microcontroller is 3.3 V.

So far I have managed to configure the SPI on the microcontroller to initiate a conversion according to the datasheet, in CS mode, 3-wire with busy indicator. The conversion is triggered by a pulse of >20 ns on the CNV line, and then I wait for the falling-edge interrupt on the data line to start reading.

I analyzed the timing and the received data with a logic analyzer (80 MHz). Sometimes the result seems to be okay, but I frequently get wrong values which are about 0.3-0.4 V too high. There are also two pictures showing the measurement results. The first snapshot shows a correct value, but I am not sure about the varying voltage level on the SDO line during the conversion; it is completely random and not related to whether the result is correct or not. I am also confused about the 17th (or 1st) bit on SCK mentioned in the datasheet.
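To make the question concrete, here is a minimal sketch of how I understand the sequence and the code-to-voltage conversion. The hardware calls in the comment are placeholders, not a real HAL, and the remark about the 17th clock is my reading of the busy-indicator timing, which may be exactly where my error is:

```c
#include <stdint.h>

/* Hypothetical read sequence (placeholder names, not a real HAL):
 *
 *   pulse_cnv_high();             // >20 ns pulse on CNV starts conversion
 *   wait_sdo_falling_edge_irq();  // busy indicator: SDO falls at end of conversion
 *   raw = spi_read16();           // clock out 16 data bits, MSB first
 *
 * If the SPI transfer issues 17 SCK cycles (as the busy-indicator timing
 * diagram seems to require), the first clocked bit belongs to the busy
 * indicator and the 16 data bits land one position later; in that case
 * the raw word may need to be realigned before conversion.
 */

/* Convert a raw two's-complement AD7687 code to the differential input
 * voltage (VIN+ - VIN-).  The transfer function spans -VREF..+VREF,
 * so 1 LSB = 2*VREF / 65536 = VREF / 32768. */
double ad7687_code_to_volts(uint16_t raw, double vref)
{
    int16_t code = (int16_t)raw;   /* reinterpret as signed */
    return (double)code * vref / 32768.0;
}
```

For example, with VREF = 5 V, a raw code of 0x4000 should correspond to +2.5 V and 0xC000 to -2.5 V. A word shifted by one bit position would roughly double or halve the magnitude, so a constant ~0.3-0.4 V offset may point elsewhere, but I would like to rule out the alignment first.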
I have no clue how to solve the problem, so I hope you can help me.
Some other issues:
How disadvantageous is it to use the same source for the reference and the supply voltage? And is it a big problem to apply VIO and VREF before VDD?