I've got a system with an AD9250 which is being driven by an HMC1023. The HMC1023 is driving a 330 ohm load across the ADC input through a pair of 40R resistors (i.e. one in each arm of diff pair giving a total load of around 400R). At the ADC input we have some shunt capacitance and some series 10Rs into each input pin to reduce peak transient current. The input is biased at 0.9V.
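For reference, the load and divider figures above work out as follows (a quick sanity check, assuming the 330 ohm sits directly across the ADC inputs and the ADC pins themselves draw negligible current, so the 10R series resistors can be ignored for this DC estimate):

```python
# Differential load seen by the HMC1023 and the resistive attenuation
# from its output to the ADC pins, using the values described above.
R_SERIES_PER_ARM = 40.0   # ohms, one in each leg of the diff pair
R_SHUNT = 330.0           # ohms, across the ADC input

r_total = 2 * R_SERIES_PER_ARM + R_SHUNT   # differential load on the driver
divider = R_SHUNT / r_total                # fraction of the drive reaching the ADC

print(f"total load: {r_total:.0f} ohm")    # 410 ohm, i.e. 'around 400R'
print(f"divider ratio: {divider:.3f}")
```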
The datasheet states that the maximum voltage level on any of the AD9250 analog input pins is 2.0V (AVDD+0.2) referenced to AGND.
Although at first I believed that the HMC1023 was only capable of driving a 2 Vppd signal into a 400R load (per its datasheet, which would limit our maximum ADC voltage to well below 2.0 V), it appears that if the input power keeps rising, the voltage at the output of the HMC1023 can reach a heavily distorted, square-ish wave of 3-3.5 Vpp in EACH arm of the differential pair. This is clearly in excess of the maximum voltage rating of the ADC input. I wanted the system to be inherently safe, such that the driver stage is not capable of blowing up the ADC.
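Putting numbers on the worst case: taking the 0.9 V bias, a 3.5 Vpp swing per arm at the driver output, and assuming the swing divides down through the 40R series and 330R shunt resistors described above (the ADC pin treated as high impedance), the peak pin voltage still exceeds the rating:

```python
# Worst-case peak voltage at one ADC pin. The divider ratio is my
# assumption based on the resistor values given above.
V_BIAS = 0.9                            # V, ADC input common-mode bias
V_PP_PER_ARM = 3.5                      # V, worst observed swing at the HMC1023 output
DIVIDER = 330.0 / (330.0 + 2 * 40.0)    # attenuation from driver to ADC pin
V_LIMIT = 2.0                           # V, AVDD + 0.2 absolute maximum

v_peak = V_BIAS + 0.5 * V_PP_PER_ARM * DIVIDER
print(f"peak pin voltage: {v_peak:.2f} V")   # ~2.31 V, above the 2.0 V limit
```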
I was wondering if it would be possible to have any further information on the reason for the 2.0 V limit on the Ain pins and, if it is due to the conduction of protection diodes for example, what the maximum allowable current into this input would be, as we might be able to protect it by increasing the value of the series 10Rs. If it is not a protection-diode limit, any information on whether the described situation presents a genuine threat, and any possible means of mitigating that threat, would be most appreciated.
The AD9250 is an unbuffered ADC. The link below describes the failure mechanism for an overvoltage condition as well as some suggested input protection circuits.
Thank you for the application note.
Based on the solutions provided in the note, it would be challenging to find a Schottky diode that enters reverse breakdown at a low enough voltage to clamp the input at less than 2.0 V.
Since the damage mechanism is either Vgs or Vds breakdown in the ADC input stage, if the input can be current limited (i.e. with series resistance between the driver and the ADC input), is there a tolerable amount of breakdown current that will not result in damage?
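To illustrate the idea with rough numbers (these are assumptions, not a statement about what current the AD9250 can actually tolerate): if the input structure clamps near the 2.0 V absolute maximum and the driver arm peaks around 2.65 V (0.9 V bias plus half of a 3.5 Vpp single-ended swing), the fault current scales inversely with the total series resistance:

```python
# Hypothetical fault-current estimate. Both the clamp level and the
# peak drive voltage here are assumptions for illustration only.
V_PEAK = 2.65    # V, assumed peak voltage the driver arm tries to reach
V_CLAMP = 2.0    # V, assumed clamp level at the absolute-maximum rating

for r_series in (50.0, 100.0, 200.0):    # 40R + 10R today, then larger options
    i_fault = (V_PEAK - V_CLAMP) / r_series
    print(f"{r_series:5.0f} ohm series -> {i_fault * 1e3:.2f} mA fault current")
```

So doubling or quadrupling the series resistance would bring a ~13 mA fault down to a few mA, if a safe current figure can be established.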
I agree that the application note does not provide a solution for a 2 V clamp.
Perhaps the attached method will work: a low-cost CMOS regulator is used to set a bias level of around 1.2 V, so that Schottky diodes can clamp the voltage once it starts exceeding about 1.4 V on either leg. An adjustable regulator was selected so that one could optimize the setting (distortion vs. clamping level), but perhaps 1.2 V will work well, allowing a fixed 1.2 V regulator type. The regulator has a resistive load that sinks about 50 mA to maintain regulation during an overvoltage event (which sources current into the regulator output).
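The clamp-level arithmetic, as I read it (the ~0.2 V Schottky forward drop is my assumption; the 1.2 V rail and 50 mA sink figure are from the description above):

```python
V_REG = 1.2          # V, regulator output used as the clamp rail
V_F_SCHOTTKY = 0.2   # V, assumed forward drop of the clamp Schottky
I_SINK = 0.050       # A, current the resistive load must sink to hold regulation

v_clamp = V_REG + V_F_SCHOTTKY   # pin voltage at which the diode conducts
r_load = V_REG / I_SINK          # resistive load on the regulator output

print(f"clamp level:   {v_clamp:.1f} V")     # ~1.4 V, matching the level above
print(f"load resistor: {r_load:.0f} ohm")
```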
Hope this helps.

P.S. Perhaps you can sample the Skyworks diodes and dead-bug them onto your existing board while using an adjustable supply attached to the common cathode to find out which voltage is optimum.