I am working on a design that uses the LTC4364 to clamp transients of up to 60V down to 40V. A 10uF (±10%) timer cap is used, for a timeout period of 1.06s:

I_TMR(UP,OV) = 2uA + 0.644uA/V * (60V - 40V - 0.5V) = 14.56uA
tOV = 10uF*1.25V/14.56uA + 10uF*100mV/5uA = 1.06s
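For reference, the two equations above can be checked numerically. This is just a sketch of the arithmetic in the post, using the same values (10uF timer cap, 60V transient clamped to 40V):

```python
# Nominal LTC4364 overvoltage timeout, per the calculation in the post.
C_TIMER = 10e-6      # timer capacitor, F
V_IN = 60.0          # transient input, V
V_CLAMP = 40.0       # clamp voltage, V

# Timer pull-up current during overvoltage
i_tmr = 2e-6 + 0.644e-6 * (V_IN - V_CLAMP - 0.5)   # ~14.56 uA

# Ramp to the 1.25V threshold at i_tmr, plus the final 100mV at 5uA
t_ov = C_TIMER * 1.25 / i_tmr + C_TIMER * 0.100 / 5e-6

print(f"I_TMR = {i_tmr * 1e6:.2f} uA, tOV = {t_ov:.2f} s")
# -> I_TMR = 14.56 uA, tOV = 1.06 s
```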
The LTSpice sim results match the calculations, but measurements on actual boards show a timeout of 1.84-1.9s.
The increased time is not a problem in itself, since it is still well within the SOA of M1, but if the timeout instead decreases by a similar amount on future boards, the design will fail tests that require continued operation through transients. Could a difference this large be a part-tolerance issue, or is there something else I should be looking at?
(I can make an edited schematic and post it as well if that would be helpful)
The variation from 1.06s to 1.9s does not seem likely to be due to any variation of the part. It would be better if you could attach the edited schematic so that we can have a look and give better insight into what might be causing the issue. The ±10% capacitor tolerance combined with the timer pull-up current tolerance should give at most 1.2-1.3s worst case.
The schematic is here:
I discovered the answer right after posting the question yesterday. The current consumption of the LTC4364, plus the pullups that are tied to Vcc, creates enough of a drop across R6 that Vcc is only about 48V when the input to the filter is 60V.
To fix this, the pullups should have their own clamping diode and current-limiting resistor rather than sharing them with the LTC4364's Vcc.
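As a sanity check, plugging the reduced Vcc back into the same timer-current formula lands in the same ballpark as the measured timeout (the exact value depends on how far Vcc actually sags; "about 48V" is the measurement from above, and a Vcc slightly higher than 48V gets even closer to 1.84-1.9s):

```python
# Re-run the timer calculation with Vcc pulled down to ~48V by the drop
# across R6 during the 60V transient (measured value from the post).
C_TIMER = 10e-6      # timer capacitor, F
V_CC = 48.0          # actual Vcc seen by the part, V (measured, approximate)
V_CLAMP = 40.0       # clamp voltage, V

i_tmr = 2e-6 + 0.644e-6 * (V_CC - V_CLAMP - 0.5)   # ~6.83 uA
t_ov = C_TIMER * 1.25 / i_tmr + C_TIMER * 0.100 / 5e-6

print(f"I_TMR = {i_tmr * 1e6:.2f} uA, tOV = {t_ov:.2f} s")
# -> I_TMR = 6.83 uA, tOV = 2.03 s
```

So the roughly 12V of sag on Vcc cuts the timer pull-up current by more than half, which is enough to explain the near-doubling of the measured timeout.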