Hello, I am trying to understand a board that contains an AD7476ARTZ. My question is certainly very low-level; I am a physicist, not an engineer, so I hope you don't mind my ignorance.
The problem is that the digital output seems to be about twice as high as it should be.
VDD is 5.0 VDC. Since it is a 12-bit ADC, the LSB should be around 1.22 mV (5.0 V / 4096), right?
The problem is that when I apply these voltages to VIN, I receive the following results (each conversion repeated 1000 times):
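For reference, this is the trivial calculation I am using for the expected step size (the variable names are just my own):

```python
# Expected LSB size for an n-bit ADC spanning 0..VDD.
VDD = 5.0       # supply / reference voltage in volts
N_BITS = 12     # AD7476A resolution
lsb = VDD / (2 ** N_BITS)
print(f"LSB = {lsb * 1e3:.3f} mV")  # -> LSB = 1.221 mV
```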
[table: voltage at V_IN [mV] vs. mean ADC output]
This looks like the LSB is closer to ~0.6 mV than to 1.22 mV.
The board is already operational (I am just trying to understand it), and the ADC is connected to a PLD, which in turn is connected to a PC via an FTDI USB controller. So I first assumed there might be some misinterpretation of the serial output somewhere between the ADC and the receiving software, and hooked a scope up to SCLK and SDATA.
yellow - SCLK
blue - SDATA
The software on the PC reported ADC output = 427 (12-bit binary = 0001 1010 1011),
and this is also pretty much what I would read off the trace by eye. One can nicely see the first four clock cycles, during which SDATA should always be zero, followed by the 12 data bits, which give 427 in decimal, the same value my software told me.
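In case it helps, this is how I decode a frame read off the scope (a minimal sketch; the bit list is just the example frame above, typed in by hand):

```python
# One AD7476A frame: 4 leading zeros, then 12 data bits, MSB first.
bits = [0, 0, 0, 0,   # leading zeros
        0, 0, 0, 1,   # data bits read from the scope trace
        1, 0, 1, 0,
        1, 0, 1, 1]

assert bits[:4] == [0, 0, 0, 0], "expected four leading zeros"

# Shift the 12 data bits into an integer, MSB first.
code = 0
for b in bits[4:]:
    code = (code << 1) | b

print(code)  # -> 427
```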
So my conclusion at the moment is: there is nothing wrong with the communication between the PC and the ADC (unless I am misreading SDATA on the scope). For some reason the resolution of the ADC seems to have roughly doubled (which would be a good thing ;-)); I just don't understand why.
Can anyone give me a hint what to look for?
Thanks a lot