The LT3045 can be configured for a customized current limit by connecting a resistor from ILIM to GND. The output current is then limited to 500 × 0.3V/RILIM, since the ILIM current is 1/500 of the output current and the LT3045 limits the output current so that 0.3V at ILIM is not exceeded.
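As a quick sanity check, that formula can be evaluated for a couple of resistor values (the 0.3V trip voltage and the 500:1 ratio are the typical values; `current_limit` is just an illustrative helper, not anything from the datasheet):

```python
# Programmable current limit from the ILIM resistor:
# I_LIMIT = 500 * 0.3 V / R_ILIM
# (ILIM current is I_OUT/500, clamped so ILIM does not exceed 0.3 V).
def current_limit(r_ilim_ohms: float) -> float:
    V_ILIM_TRIP = 0.3      # internal trip voltage, volts (typical)
    CURRENT_RATIO = 500    # I_OUT : I_ILIM ratio (typical)
    return CURRENT_RATIO * V_ILIM_TRIP / r_ilim_ohms

print(current_limit(10_000))  # 10 kohm -> 0.015 A (15 mA)
print(current_limit(300))     # 300 ohm -> 0.5 A (500 mA)
```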
The graphs in the datasheet show the corresponding short-circuit current that flows out of the output terminal when the output is shorted to GND.
How tightly is the ILIM voltage clamped to this 0.3V in a current-limiting situation?
In other words, how far below 0.3V does the LT3045 start to decrease the output voltage?
Ideally, at an ILIM voltage of 0.2999999999999999999999 volts, the output voltage would still be within ±2mV of the SET terminal, but since the gain of the amplifier comparing the ILIM voltage against the internal 0.3V reference must be finite, the output will have to deviate from VSET ± 2mV at some point before 0.3V is reached.
Is there some kind of graph that shows this behavior?
Regards, Gerd

P.S.: I am aware that the current ratio of 500 has some tolerance, as does the 0.3V internal reference voltage (I assume it is similar to the VGFB Trip Point voltage range of 291 to 309mV, which would make the current ratio range from 468 to 539 in order to arrive at the ±10% tolerance of the programmable current limit as specified in the datasheet). This question is not about these tolerances, but about the behavior in the vicinity of the actual ILIM "tripping" voltage.
When the ILIM voltage exceeds 0.3V, the internal comparator needs some time to trigger the current limit circuit. However, this delay, which can be regarded as an error, is so small that it can be neglected when determining the current limit.
The question was not about time-domain issues, but about static behavior.
If one slowly increases the load current by decreasing the load resistor (without triggering thermal issues, so assume the resistor from ILIM to GND is 10k, setting the current limit to 15mA) and monitors the output voltage, then the output voltage should stay at the voltage defined by RSET (say we have an RSET of 33k setting the output to 3.3V) as long as the load current is below 15mA (RLOAD greater than 220 ohms). As soon as RLOAD becomes smaller than 220 ohms, the LT3045 will limit the load current by reducing the output voltage to RLOAD times 15mA.
Ideally the voltage would be 3.3V if RLOAD is 220 ohms and 3.285V if RLOAD is 219 ohms and so on. With an RLOAD of 1 ohm, the output voltage should be 15mV driving exactly 15mA into the load, resulting in an ILIM current of 30µA (15mA/500), which generates 0.3V at the ILIM pin, just the same as the 30µA ILIM current when RLOAD is 220 ohms.
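This ideal (infinite-gain) picture can be sketched as follows, with the output clamped to whatever voltage the limited current develops across the load (VSET = 3.3V and a 15mA limit taken from the example above):

```python
# Ideal current limiting: below the limit the output sits at VSET;
# above it, the part clamps I_OUT and VOUT = I_LIM * R_LOAD.
V_SET = 3.3    # volts, set by RSET = 33 kohm in the example
I_LIM = 0.015  # amps, set by R_ILIM = 10 kohm in the example

def v_out(r_load_ohms: float) -> float:
    return min(V_SET, I_LIM * r_load_ohms)

print(v_out(220))  # 3.3   (exactly at the limit boundary)
print(v_out(219))  # 3.285 (limiting has begun)
print(v_out(1))    # 0.015 (near short circuit, still 15 mA)
```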
BUT this would mean that the gain of the internal amplifier, which reduces the output voltage once the ILIM voltage reaches (exceeds) 0.3V, would have to be infinite, which is impossible.
If this "gain" is, for instance, 10000, then in order to reduce the output voltage from 3.3V down to 15mV (with the 1 ohm RLOAD), the voltage at the ILIM pin would have to rise by (3.3 − 0.015)/10000 V, which is 328.5µV. So either the current at 1 ohm will be 15.016mA when it is 15mA with an RLOAD of 220 ohms, or the current with a 220 ohm RLOAD will already be limited to 14.983mA when it is limited to exactly 15mA at roughly a short circuit to GND.
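To restate that arithmetic numerically (the gain of 10000 is of course only an assumed value for illustration, not a datasheet figure):

```python
# Finite-gain argument: pulling VOUT from 3.3 V down to 15 mV requires
# the ILIM pin to rise by dVOUT / A above the 0.3 V trip point, which
# shifts the limited output current away from the nominal 15 mA.
A = 10_000        # assumed amplifier gain (not a datasheet value)
R_ILIM = 10_000   # ohms, sets the 15 mA limit
RATIO = 500       # I_OUT : I_ILIM ratio (typical)

dv_ilim = (3.3 - 0.015) / A          # extra voltage needed at ILIM
di_out = RATIO * dv_ilim / R_ILIM    # resulting output-current shift

print(dv_ilim * 1e6)  # ~328.5 uV at the ILIM pin
print(di_out * 1e6)   # ~16.4 uA shift on a 15 mA limit (~0.11 %)
```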
An error of 0.1% does not sound very high, but firstly this is about a regulator that is used in precision circuits, and secondly we do not know if that gain is really 10000 or maybe only 100, making the error more than 10%.
This is exactly what the question is about.
With regard to static error, it makes more sense to think about the part-to-part variation than to figure out the error for one particular part. You cannot even measure the current-limit resistance accurately, due to the accuracy of the measurement instrument. That being said, you don't need to worry about the bandwidth of the internal error amplifier in your design.
Look, I can easily measure the current limit by measuring the short-circuit current. That allows me to take care of the part-to-part variation easily by just connecting a DigiPot in parallel with the fixed resistor from ILIM to GND.
Also, the part-to-part variation is discernible from the datasheet, as mentioned in the post scriptum of my original posting, where I also clearly stated that this is not the issue here.
What I want to know is: if I measure a short-circuit current of 500mA with an RILIM of 300 ohms (the typical value specified in the datasheet), at what point will my output voltage (of, for instance, 5.0V) begin to drop? The datasheet specifies an output voltage drop of less than 0.5mV from 1mA to 500mA under "Load Regulation" (I assume with only the internal current limit of about 700mA active).
If I set RILIM so that the short circuit output current is limited to exactly 500.001mA (yes, I would very probably have to trim RILIM), then I assume the output voltage would drop by 0.5mV a bit before the current reaches 500.000mA.
Do you actually understand my question and just pretend to not understand it because you have no answer for it, or do you really not get the point?
The definition of load regulation is the output voltage change with different loads, before the part hits the current limit. For example, when the load current is 1mA, VSET is 1V and VOUT = VSET + VOS. When the load increases to 500mA, the typical ISET change is 3nA, and the typical VOS change is 0.1mV. The overall output voltage change when the load current changes from 1mA to 500mA is (delta ISET × RSET + delta VOS).
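Plugging in the typical numbers quoted above (RSET = 10 kohm is an assumption here, corresponding to VSET = 1V from the nominal 100µA SET current):

```python
# Load-regulation estimate: delta_VOUT = delta_ISET * R_SET + delta_VOS,
# using the typical shifts quoted for a 1 mA -> 500 mA load step.
d_iset = 3e-9     # A, typical SET-current change
d_vos = 0.1e-3    # V, typical offset-voltage change
r_set = 10_000    # ohm (assumed, for VSET = 1 V at 100 uA SET current)

d_vout = d_iset * r_set + d_vos
print(d_vout * 1e6)  # ~130 uV total output-voltage shift
```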
The internal op amp gain is a different story. The current-limit comparator sees very little voltage difference between the ILIM pin voltage (its positive input) and the 300mV internal reference (its negative input). It is so little that you don't need to worry about its effect in the application circuit.
BTW all data we have are published in the datasheet.