ADP1051 CS2 CURRENT SENSE

Hi Sir:

1\ In the ADP1051 datasheet, page 6, the test condition for the CS2 current measurement sense accuracy is "4.99 kΩ (0.01%) level shift resistors"; is that a typo? It is impractical to use 0.01% precision resistors in a real application, right?

What is the accuracy if 0.1% resistors are used?

2\ The CS2 current measurement sense accuracy for low-side mode is +/-2.28 mV (0.01% resistors); with the 1 mΩ current sense resistor used in the ADP1051 reference design, how can an output current measurement precision of better than +/-1 A be guaranteed? 2.28 mV / 1 mΩ = 2.28 A??

  • You can buy resistors like that off the shelf.

    Just search on DigiKey, Stackpole part number RNCF0805TKY4K99, $1.56 in 100s.

    The point of the test condition circuits is to show how the part is being tested.

    It is meant to highlight the performance of the part, not that of the parts surrounding it.

    You will always have to do a tolerance analysis of your final circuit to see if it meets your requirements... (a rough worst-case stack-up is sketched below).

    Klaus
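
    A rough sketch of the worst-case stack-up mentioned above, using only the figures quoted in this thread (the +/-2.28 mV CS2 accuracy and the 1 mΩ shunt from the reference design); the shunt tolerance and load current below are illustrative assumptions, not values from the datasheet.

    ```python
    # Rough worst-case error stack for the CS2 low-side current measurement.
    # The +/-2.28 mV figure and the 1 mOhm shunt come from this thread; the
    # shunt tolerance and load current are assumptions for illustration only.

    R_SENSE = 1e-3            # 1 mOhm shunt from the reference design
    CS2_ERR_V = 2.28e-3       # +/-2.28 mV datasheet accuracy (0.01% resistors)
    R_SENSE_TOL = 0.01        # assumed 1% shunt tolerance (acts as gain error)
    I_LOAD = 20.0             # assumed load current for the gain-error term

    offset_term_A = CS2_ERR_V / R_SENSE        # 2.28 A referred to current
    gain_term_A = R_SENSE_TOL * I_LOAD         # 0.2 A at a 20 A load
    print(f"offset term : +/-{offset_term_A:.2f} A")
    print(f"gain term   : +/-{gain_term_A:.2f} A at {I_LOAD:.0f} A")
    print(f"worst case  : +/-{offset_term_A + gain_term_A:.2f} A (uncalibrated)")
    ```

    Even before adding any level shift resistor error, the uncalibrated budget is already well above +/-1 A, which is why calibration comes up in the replies below.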

  • Hi Klaus:

    Thanks, I got your point.

    For question 2, how do I calculate the CS2 current measurement accuracy of +/-1 A in the eighth-brick reference design? And how can it be guaranteed?

  • The uncalibrated accuracy due to offset errors is not going to be sufficient to sense 1A with a 1mOhm sense resistor.

    However, the ADC has 12-bit resolution, which is more than sufficient, and the offset can be calibrated out.

    This is described on page 40 of the data sheet. You should also provide for gain calibration, as it is VERY unlikely that your 1 mΩ resistor is really 1 mΩ... (a minimal calibration sketch is included below).

    Klaus
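
    A minimal sketch of the offset-plus-gain correction described above. The actual CS2 calibration procedure and registers are the ones documented on page 40 of the datasheet and are applied inside the ADP1051; the function names and example readings below are hypothetical and only illustrate the arithmetic.

    ```python
    def derive_cs2_correction(reading_zero_load, reading_known_load, known_current_a):
        """Derive offset and gain terms from two reference measurements.

        reading_zero_load  -- reported CS2 current at 0 A load (offset error)
        reading_known_load -- reported CS2 current at a known load
        known_current_a    -- that load current measured with a reference meter
        """
        offset = reading_zero_load
        gain = known_current_a / (reading_known_load - reading_zero_load)
        return offset, gain

    def corrected_current(raw_reading, offset, gain):
        """Apply the two-point correction to a raw CS2 current reading."""
        return (raw_reading - offset) * gain

    # Hypothetical example: 2.2 A reported at no load, 21.5 A reported
    # at a true 20.0 A load.
    offset, gain = derive_cs2_correction(2.2, 21.5, 20.0)
    print(corrected_current(21.5, offset, gain))   # -> 20.0
    ```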

  • Klaus:

    Thanks a lot;

    The CS2 ADC has 12-bit resolution; which averaging rate is that based on?

    The IOUT_OC_fault has two averaging rate settings, 82 µs and 328 µs, so those readings are not 12-bit resolution, right? (A quick sample-count check on the quoted figures follows below.)

    "The ADC samples at a frequency of 1.56 MHz, and the reading is averaged in an asynchronous fashion. This reading is used to determine actions on faults, such as the IOUT OC fault, with an average rate of 82 μs (seven bits) or 328 μs (nine bits), which is set by Register 0xFE1B[4]. The ADP1051 also reports an output current reading in the READ_IOUT command (Register 0x8C), with an average rate of 10 ms, 52 ms, 105 ms, or 210 ms, as set by Register 0xFE65[1:0]."

  • Hi Jun,

    The 0.01% level shifting resistors are used for characterization. This means the biggest error that the ADP1051 itself contributes is 2.28 mV over the temperature range, not including the error of the sense resistor. If you use 0.1% level shifting resistors, they will add a further (+0.1% - (-0.1%)) * 1 V = 2 mV of error in the worst case.

    Therefore, if you want an accuracy of 1 A, you may have to do the calibration. It allows you to achieve much better accuracy. See page 40 in the datasheet for details. (A short worst-case budget with 0.1% resistors is sketched below.)

    James
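
    Putting the figures from this reply together: the part's own +/-2.28 mV plus the +/-2 mV from 0.1% level shift resistors gives about +/-4.28 mV input-referred, or roughly +/-4.3 A through a 1 mΩ shunt, so an uncalibrated design cannot meet +/-1 A. A back-of-envelope only, using the numbers quoted above.

    ```python
    # Combine the two input-referred error terms quoted in this reply and
    # refer the total to current through the 1 mOhm shunt.

    R_SENSE = 1e-3               # 1 mOhm shunt
    ADP1051_ERR_V = 2.28e-3      # +/-2.28 mV, part accuracy over temperature
    LEVEL_SHIFT_TOL = 0.001      # 0.1% level shift resistors
    V_LEVEL = 1.0                # 1 V level used in the estimate above

    resistor_err_v = 2 * LEVEL_SHIFT_TOL * V_LEVEL     # 2 mV worst case
    total_err_v = ADP1051_ERR_V + resistor_err_v       # 4.28 mV
    print(f"uncalibrated error: +/-{total_err_v / R_SENSE:.2f} A")  # ~4.28 A
    ```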