its output? I can see in the
datasheet that for 0.1 uF the turn-on time is about 80 us, but I don't have any
clue about how much this time increases with a larger output
capacitance. In my project the power supply is switched on/off
by a hardware power switch. So, I would like to know how much time should be
allowed to pass from the moment the supply voltage is applied until I can be sure
that the output of the ADR127 is stable enough.
The bigger the capacitor, the longer it takes to reach its final value. When the
capacitor is fully discharged, it appears as a short, so you will get about
20-25 mA into the capacitor. As the capacitor voltage comes up, you will get
less and less current. With a micropower reference, the internal loop bandwidth
is not very high, so that is another factor.
What is "stable enough” will depend on what the reference is used for, whether
for an ADC, a reference trip level, etc, as to whether or not you need to get
within 1% of final value, or 0.1%, etc.
As a rough cut, with dV/dt = I/C, so dt = C*dV/I, or
10.1 uF * 1.25 V / 1 mA = 12.6 ms (1 mA is a reasonable average current).
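That back-of-the-envelope estimate is easy to check with a few lines of Python. This is only the constant-average-current approximation, not a model of the ADR127's control loop, and the 10.1 uF and 1 mA figures are the example values above, not datasheet specs:

```python
def settle_time(c_farads, v_final, i_avg_amps):
    """Rough settling-time estimate: dt = C * dV / I,
    assuming the charge current averages i_avg_amps
    over the whole ramp from 0 V to v_final."""
    return c_farads * v_final / i_avg_amps

# 10.1 uF total output capacitance, 1.25 V reference, ~1 mA average
t = settle_time(10.1e-6, 1.25, 1e-3)
print(f"{t * 1e3:.1f} ms")  # prints 12.6 ms
```

Plugging in your own capacitance and a pessimistic average current gives a first-order number; the bench measurement below is still the real test.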
You can look at one on the bench, measure the time it takes to settle within your
desired error band of the final voltage, and then double it as a safety margin over
process and temperature variations. Also put a scope on your switch and look at
the switch bounce time: every time the supply goes back to zero volts, the part resets.