I'd like to get some support regarding Linear's LT8410-1 boost converter.
More specifically, I am measuring a large discrepancy between the simulated current consumption (using LTspice) and what my actual circuit draws.
To be more precise, the boost converter is configured to step up about 3.4V to either 13.25V or 26.5V. On the bench I measure about 150uA drawn from my supply in the first case and about 3mA in the second, while LTspice simulates an average draw of ~130uA and ~250uA respectively.
The situation is a bit more complicated, so if possible I would like to discuss it in private with one of the support engineers.
Thank you for any support.
I have received it now.
Your simulation runs at about 650kHz when the output is low (about 50V), and the switch current peaks at about 3mA with roughly 10% duty cycle.
When the output is high, the frequency shifts to about 1MHz, and the switch current peaks at about 8mA with >50% duty cycle.
Seems normal to me.
These results may not exactly match the circuit on the bench, but they are consistent with what I think should be happening. As you increase the output voltage, even at no load, the converter has to work harder to charge all the capacitors up to the higher level. That becomes more difficult with low input voltages.
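A rough way to see why the higher output setting costs more input current even at no load: the energy held in the output capacitance grows with the square of the output voltage, E = ½CV², so every bit of leakage at the higher rail costs proportionally more to replace. A quick sketch, using your two output settings and an assumed output capacitor value (not taken from your schematic):

```python
# No-load intuition: output-capacitor energy scales with Vout^2,
# so doubling the output voltage quadruples the stored energy the
# converter must maintain against leakage.
C_OUT = 1e-6  # F, assumed example value, not from the actual board

def cap_energy(v_out, c=C_OUT):
    """Energy stored in the output capacitor at v_out, in joules."""
    return 0.5 * c * v_out ** 2

e_low = cap_energy(13.25)
e_high = cap_energy(26.5)
print(f"Energy at 13.25V: {e_low*1e6:.1f} uJ")
print(f"Energy at 26.5V:  {e_high*1e6:.1f} uJ ({e_high/e_low:.0f}x)")
```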
Please set your simulation to 100V out and increase the input to 8V. Then notice how the frequency shifts down to about 10kHz and the peak switch current drops to about 4mA. That yields a duty cycle of <1%, for a very low average current from the battery.
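The "low duty cycle means low average battery current" step can be sketched numerically. In a boost converter the inductor current is the input current, and in discontinuous conduction it is a triangle of height Ipk that occupies only the conducting fraction of each switching period. Using the ~4mA peak from the 8V/100V simulation and lumping the ramp-up and ramp-down times into a single ~1% conduction fraction (a simplifying assumption, not a measured value):

```python
# Back-of-envelope average input current for a boost in discontinuous
# conduction: mean of a triangular pulse of peak i_pk occupying
# fraction d_conduct of the switching period, zero the rest of the time.
def avg_input_current(i_pk, d_conduct):
    """Average of a triangular current pulse: 0.5 * peak * duty."""
    return 0.5 * i_pk * d_conduct

# ~4mA peak switch current, ~1% conduction fraction (assumed)
i_avg = avg_input_current(4e-3, 0.01)
print(f"Average battery current ~ {i_avg*1e6:.0f} uA")
```

So even with a 4mA peak, the battery only sees on the order of tens of microamps on average, which is why the low duty cycle matters more than the peak.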
I think your application is working as expected.