I'm using an ADL5304 and trying to measure a 100 GOhm resistor at 5 volts. It is set up with a single supply, using the default pinout referenced on page one (Figure 1) of the datasheet. I have to increase the voltage (increasing the current) before it starts measuring any current. The other issue is that it oscillates at 200-800 kHz before it conducts current and starts measuring. Once the current gets to around 100 nA, it measures 0.2 volts per decade.
Also, I'm trying to calculate the output current; is there a simple formula to use?
I'm new at this, any help would be appreciated.
If single-supply, VSMx should be 1.5 V, so the current through the resistor is (5 - 1.5) / 100 GOhm = 35 pA, agreed?
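As a quick sanity check on that arithmetic (plain Python, using only the values quoted above):

```python
# Expected input current into INUM for the 100 GOhm test resistor,
# assuming a 5 V source and INUM self-biased at VSMx = 1.5 V.
v_source = 5.0   # applied voltage, V
v_inum = 1.5     # INUM self-bias voltage (VSMx), V
r_test = 100e9   # test resistor, ohms

i_in = (v_source - v_inum) / r_test
print(f"I_in = {i_in * 1e12:.0f} pA")  # prints: I_in = 35 pA
```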
Solving Eq. (2) for Inum:
Inum= Iz * 10^(Vlog / Vy)
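That inversion can be sketched in a few lines of Python, using the scaling constants quoted later in this thread (Vy = 200 mV/dec, Iz = 3.162 fA; verify these against your datasheet grade and configuration):

```python
# Invert datasheet Eq. (2), Vlog = Vy * log10(Inum / Iz),
# to recover the input current from a measured VLOG voltage.
VY = 0.2        # log slope, V per decade
IZ = 3.162e-15  # intercept current, A

def inum_from_vlog(vlog):
    """Input current (A) implied by a measured VLOG voltage (V)."""
    return IZ * 10 ** (vlog / VY)

# Example: VLOG = 1.4 V -> Inum = 3.162e-15 * 10^7, roughly 32 nA
print(inum_from_vlog(1.4))
```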
Based on your description, it sounds like there might be some large leakage path to ground at input pin INUM. Remember, the INUM pin tries to self-bias at VSMx voltage, usually 1.5V. Any leakage path from INUM to ground will subtract from the actual input current. Does the PCB layout incorporate guard traces? See datasheet section entitled "Leakage". Also refer to the factory eval PCB figures in the User Guide UG-339 for a recommended PCB layout.
If IMON pin function is not used, it should be grounded (ref. datasheet page 21).
VLOG output comes from an op-amp output, so make sure there is not much capacitive loading on VLOG. Many op-amps are known to oscillate with heavily capacitive loads. If the capacitive load cannot be avoided, the cure is simply to add series resistance before the capacitance.
Also make sure input capacitance on INUM pin is not excessive, as this could cause too much phase shift around the feedback path.
Regarding the last question, 'calculate the output current': do you mean input current? If so, that would be datasheet Eq. (2), solved for Inum, where Vy = 200 mV/dec and Iz = 3.162 fA.
Thank you very much for your reply and help. Unfortunately, we have guarded INUM with analog ground. The self-bias voltage is 1.02 volts. The input is connected to a series of analog switches measuring currents from a few sources, anywhere from 100 GOhms down to 2 mA. The load is capacitive and limited by series resistors to regulate current at different voltages.
The output is connected to an AD7779 ADC. I have pins 31 and 32 shorted together and filtered with 0.1 uF to ground and 220 pF to INUM; INUM was oscillating in the mid-to-high kHz. Also, pins 9 and 10 are shorted, with 0.1 uF going to ground.
Sorry, yes, input current. It seems it will not return correct measurements in the GOhm range. Is there a way to save these boards? I have produced 550 of them.
Thank you again for your help.
I wonder, is 1.02 V what you measure, or is it the voltage you apply to the VSMx pins? If the former, be very careful when measuring voltage on the INUM pin. Most voltmeters have too low an input resistance. In other words, the voltmeter itself will draw enough current that a substantial error could be introduced just by touching the voltmeter probe to the INUM pin. To accurately measure the voltage at INUM, you need a special electrometer-type instrument, or an HP 34401 with the super-high input resistance option enabled from the front panel (it's buried deep in the front-panel menus).
If INUM is guarded by analog zero-volt ground, there may be no way to save the boards, unless you change over to the split-supply configuration, with VSMx = ground.
From the circuit description, it's not clear what is connected to the INUM pin. The only thing connected there should be the input current source. Nothing else. No capacitors to other pins of the ADL5304.
Inum formula: answered in the next thread.
Thank you again for your reply and help. I was wrong about the 1.02 V; it is actually 1.5 volts, apologies. I have attached a copy of the schematic. I had to put a 220 pF cap on INUM to reduce much of the oscillation, but it introduced a large delay in settling time.
Also, you suggested a split supply; will that allow me to measure in the 1 pA range?
Looking over the PDF schematic, I notice R19= 4.02k. This should be 4.02 Ohms, not kOhms. This might well be the root cause of the oscillation.
Also I notice a problem with R20. If negative supply is used, this should be a ferrite bead with 75 ohms reactance at RF frequencies. The DC resistance is nominally <1 ohm. If VNEG is not used, a zero ohm resistor could be used here. Under no circumstances should a 75 ohm resistor be used, because there will be an excessive IR drop across it. Sorry the documentation is not more clear in this regard. As shown in the EVB User Guide, this part is a MuRata BLM18BA750SN1D.
Hopefully this should fix all the oscillation issues, and the 220pF feedback capacitor can be reduced or eliminated. You might need to add some R11 resistance if the input wiring has a lot of capacitance to ground.
Also, be aware that long wiring on the input can cause line frequency noise to creep in. If allowed to become excessive, error increases, because the log function is asymmetrical, and the noise does not average out without introducing DC error.
Regarding the split-supply question, does it allow the 1 pA range? Answer: the device measures down to 1 pA with either single supply or split supply. Accuracy does degrade over temperature at this low current, as indicated by datasheet Figure 6. What the split supply does is 1) let you bias VSMx = 0 V, which is convenient for a lot of applications, and 2) guarantee accurate operation at -40 deg C under the maximum 10 mA input current condition.
Thank you again for your reply and suggestions, R19 is actually 4.02 ohms, error on the schematic, sorry.
The LogAmp is configured for a single supply, and there is a zero-ohm resistor at C12. I put 100 ohms in R11; not much changed.
The LogAmp's output is directly connected to the AD7779 ADC, actually 8 of them, each connected to an input of the AD7779. I noticed that every time the ADC is triggered via the SPI bus, the output of the LogAmp dips (please see attached picture). There is no resistor between the output of the LogAmp and the input to the ADC. Any idea?
So now there's a new problem: the output voltage changes depending on load. ADL5304 datasheet page 3 shows buffer output impedance <2 ohms, but that's a small-signal measurement, and what you're documenting is a very large-signal phenomenon, a 1 V drop. So you might want to also probe both power rails right near the IC, VPOS and VNEG. It's fairly normal to observe a narrow glitch upon ADC sample, but a 1 V drop means something is really very wrong. AD7779 input current is in the nA range, so the steady-state drop when sampling should be unmeasurably low. You might consider disconnecting the ADC while testing, getting the ADL5304 completely debugged, then reconnecting the ADC. Hope that helps!
Well, the analog ground and digital ground had issues and are now fixed; the drop is not happening anymore. Thanks for the heads-up and pointers.
I have several resistors for calibration and verification of accuracy connected, from 100 GOhms down to 10 kOhms. Is there a way to use an ax+b slope-and-intercept fit to calibrate the output, or is that not possible since it is logarithmic?
Any suggestions on mathematical calibration?
Thank you, Shawn
The log function plots as a straight line on semi-log graph paper, like datasheet Figure 3. Therefore straight-line techniques can be applied if x-axis is treated as log(x), not x. In the datasheet, the straight line approximation is characterized by specifying slope and x-intercept. Let us call these two variables m and xo. Your calculations of m and xo should align fairly closely with datasheet m and xo.
For example, pick 2 measurement points A and B along the straight line transfer function like shown in datasheet Figure 3. Each point has output voltage Va and Vb. These correspond to known input currents Ia and Ib.
The 2-point slope m = deltaY / deltaX, where deltaY = Vb - Va and deltaX = log10(Ib / Ia).
The 2-point x-intercept xo = Ia / 10^(Va / m).
If these slope and intercept calculations align with the datasheet, check your other intermediate cal current values to make sure they follow the equation from earlier in this discussion.
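The two-point calculation above can be sketched in Python. The point values (Ia, Va) and (Ib, Vb) here are hypothetical placeholders following an ideal 0.2 V/dec device, not measurements from this thread:

```python
import math

# Two known calibration points: (input current A, measured VLOG V).
Ia, Va = 1e-9, 1.10   # hypothetical point A
Ib, Vb = 1e-6, 1.70   # hypothetical point B

# Slope m (V per decade) and x-intercept xo (A), per the 2-point method above.
m = (Vb - Va) / math.log10(Ib / Ia)
xo = Ia / 10 ** (Va / m)

print(f"m  = {m:.3f} V/decade")  # prints: m  = 0.200 V/decade
print(f"xo = {xo:.3e} A")        # roughly 3.16e-15 A, near the datasheet Iz
```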
If you really want to get precise, you could do multipoint cal using all your known datapoints. That would be linear regression in log domain. It's probably a little too complicated for this forum.
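For what it's worth, that multipoint regression is just ordinary least squares on (log10(I), V) pairs. A stdlib-only sketch, with made-up ideal calibration data standing in for real measurements:

```python
import math

def log_domain_fit(currents, vlogs):
    """Least-squares fit of V = m * log10(I) + b.
    Returns (slope in V/decade, x-intercept expressed as a current in A)."""
    xs = [math.log10(i) for i in currents]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(vlogs) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, vlogs))
         / sum((x - mx) ** 2 for x in xs))
    b = my - m * mx
    # V = 0 when log10(I) = -b/m, so the intercept current is:
    iz = 10 ** (-b / m)
    return m, iz

# Hypothetical cal points from an ideal 0.2 V/dec, Iz = 3.162 fA device:
cal_i = [1e-12, 1e-9, 1e-6, 1e-3]
cal_v = [0.2 * math.log10(i / 3.162e-15) for i in cal_i]
m, iz = log_domain_fit(cal_i, cal_v)
print(m, iz)  # recovers ~0.2 V/dec and ~3.162e-15 A
```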
Just FYI, I'll be offline for 1 week. Returning 7/6.
This is very helpful, appreciate your help and support, I'll follow your advice and let you know what happens.
Have a safe and healthy week off.
I hope all is well with you. I had a question regarding the voltage at input pin 4: is there a way to eliminate that voltage and still read the log current, like a normal amp input?
I think I stated my question incorrectly. We would like to use an external current bias instead of the internally supplied bias; is that possible with this device? Our external current range is 10 pA-3 mA. Appreciate your help and reply. Thank you.