We are using the LTC2312-12, with CONV and SCK driven by an FPGA.
Our environment setup:
- VDD: 3.3 V
- V_REF: 2.054 V (measured)
- OVDD: 3.3 V (from our FPGA)
According to Figure 11 on page 14 of the datasheet, our output code follows the expected trend while A_IN is below 1 V,
but once A_IN exceeds 1 V, the MSB of SDO does not turn to 1 as A_IN increases, while the other 11 bits still follow the expected trend.
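For reference, here is a small sketch of the ideal straight-binary transfer function we expect (our assumption from the datasheet: full scale equals V_REF, so code = A_IN / V_REF × 4096, and the MSB should flip at V_REF/2 ≈ 1.027 V with our measured reference):

```python
# Ideal 12-bit straight-binary transfer function (assumed: full scale = V_REF).
V_REF = 2.054  # measured reference voltage, volts

def expected_code(a_in):
    """Ideal 12-bit output code for input voltage a_in (volts)."""
    code = int(a_in / V_REF * 4096)
    return min(max(code, 0), 4095)  # clamp to the 12-bit range

for a_in in (0.5, 1.1, 2.019):
    code = expected_code(a_in)
    print(f"A_IN = {a_in:.3f} V -> code {code:4d} (0b{code:012b}), MSB = {code >> 11}")
```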
Our setup follows the figure on page 1 of the datasheet and the other specifications.
Is there anything we need to be concerned about?
Do you have any ideas about the cause of this issue?
Can you provide a schematic of your ADC input driver and an oscilloscope photo showing CONV, SCK and SDO? Also please provide a photo of your setup.
Thanks for the reply.
Our setup follows page 1 of the datasheet, but with an RC filter added on A_IN (like Figure 10 in the datasheet); the schematic is shown in figure (a).
The setup of our experiment is photographed below; the resistor and capacitor are soldered on the back of the circuit board:
In picture (b),
- Red -- VDD: 3.3 V, with a 2.2 µF capacitor to GND
- Purple -- REF: measured 2.054 V, also with a 2.2 µF capacitor to GND
- Black -- GND
- Blue -- A_IN: connected through an RC filter (50 Ω, 47 pF); the voltage comes from VDD and is set using two variable resistors (both 10 kΩ)
- Green -- CONV: input from FPGA; t_CONV is about 356 ns in our experiment (can be adjusted)
- Orange -- SCK: input from FPGA, approximately 8 MHz in our experiment (can be adjusted)
- Blue -- SDO: output to FPGA
- Red -- OVDD: 3.3 V from the FPGA
All of our grounds are connected at one point (shown in picture (c)), including the power supply, oscilloscope, FPGA, and the ground side of each capacitor.
We have also added a 33 Ω resistor on SDO, but it didn't improve the MSB issue either.
The oscilloscope photos showing CONV, SCK, and SDO are listed below:
In picture (d) and (e),
A_IN is supplied at 0.989 V and 2.019 V, respectively. Both conditions should make the MSB of SDO 1, but our experiment shows that the MSB is always 0 while the other 11 bits follow the increasing trend.
We also show t_CONV (356 ns) and t_2 (56 ns) in pictures (f) and (g).
Thanks for your help!
To get good performance when using this ADC you really should have a PCB with a ground plane.
Using long unshielded wires for the digital signals between the FPGA board and the ADC board will result in ringing as you see in your oscilloscope traces. If this ringing is more than a diode drop above VDD or below GND you risk damaging the ADC at worst and at a minimum this will compromise the ADC performance. You should also have a low inductance ground connection between the FPGA board and ADC board.
The data sheet clearly states that the output impedance of the signal driving the analog input should be less than 50 Ω. The 10 kΩ resistor divider you are using to drive the ADC is well above this limit. I suggest buffering the resistor divider with an op amp; the LT6230 and LT1818 are two op amps to consider.
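To illustrate the impedance problem, a quick sketch of the Thevenin source impedance the ADC sees from the divider (worst case near mid-scale with two 10 kΩ pots):

```python
# Thevenin output impedance of a resistor divider driving A_IN,
# compared against the <50 ohm source-impedance guideline.
def thevenin_r(r_top, r_bottom):
    """Source impedance seen by the ADC: the parallel combination of the two legs."""
    return r_top * r_bottom / (r_top + r_bottom)

# Two 10 kohm variable resistors, set near mid-scale (worst case):
r = thevenin_r(10e3, 10e3)
print(f"divider impedance = {r:.0f} ohm (guideline: < 50 ohm)")  # 5000 ohm
```

That is two orders of magnitude above the guideline, which is why a low-output-impedance buffer is needed.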
Looking at your timing, the CONV signal should stay high for a minimum of 1.4 µs before going low and starting to clock out the data. This gives the ADC time to complete the conversion.
You might consider looking at the DC1563A-C which is the demo board for the LTC2312-12 for layout guidance when you make your own PCB or you may even want to purchase a DC1563A-C to evaluate the LTC2312-12 performance.
Let me know if you have any more questions.
Thanks again for your reply.
We will try to improve our experimental environment.
But regarding the CONV high time: we followed the "2.5Msps Throughput Rate" datasheet, which is different from the "500ksps Throughput Rate" one.
The former says the minimum t_CONV is 247 ns, while the latter says 1.4 µs.
How can we tell whether our part is the 2.5 Msps or the 500 ksps version?
The LTC2312-12 maximum conversion rate is 500ksps. You would need to use the LTC2313-12 if you wanted to run at 2.5Msps.