We got our first samples back of our board using the ADI regulator (ADP1607ACPZN-R7), and we are seeing something I don't understand that may make this regulator unusable for us.
We are targeting an output voltage of 1.875V, with a max of 1.95V and a min of 1.83V.
So we set R1=137k and R2=280k, using 1% tolerance resistors.
Also, the spec sheet says that Vfb = 1.259 +/- 2%.
The maximum error due to the FB pin bias current flowing through R1 is 0.25uA*137k = +34mV.
Adding all these error terms together, we should see an output voltage that never exceeds 1.96V and never drops below 1.83V.
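For what it's worth, here is that worst-case arithmetic as a quick sketch (values taken from the post and datasheet; it assumes R1 is the top resistor from VOUT to FB, so the FB bias current error shows up as Ifb*R1):

```python
# Worst-case output window for the ADP1607 feedback divider described above.
VFB_NOM, VFB_TOL = 1.259, 0.02   # V, +/-2% per datasheet
R1, R2 = 137e3, 280e3            # ohms; R1 = top (VOUT to FB), R2 = bottom
R_TOL = 0.01                     # 1% resistors
IFB_MAX = 0.25e-6                # A, worst-case FB pin bias current

def vout(vfb, r1, r2, ifb):
    # Vout = Vfb*(1 + R1/R2) + Ifb*R1 (bias current flows through R1)
    return vfb * (1 + r1 / r2) + ifb * r1

# Stack every tolerance in the direction that pushes Vout up, then down.
v_max = vout(VFB_NOM * (1 + VFB_TOL), R1 * (1 + R_TOL), R2 * (1 - R_TOL), IFB_MAX)
v_min = vout(VFB_NOM * (1 - VFB_TOL), R1 * (1 - R_TOL), R2 * (1 + R_TOL), 0.0)
print(f"nominal: {vout(VFB_NOM, R1, R2, 0.0):.3f} V")   # ~1.875 V
print(f"worst-case window: {v_min:.3f} V .. {v_max:.3f} V")
```

This gives a window of roughly 1.826V to 1.960V, consistent with the limits quoted above.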
We measured 10 parts and here is what we see:
- 1.97V, 1.95V, 1.91V, 2.00V, 1.95V, 1.92V, 1.87V, 1.87V, 1.87V, 1.90V
These are all with very light load currents (on the order of 10mA).
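A quick check of those readings against the calculated window (readings copied from the list above):

```python
# Measured outputs from 10 sample parts, in volts.
readings = [1.97, 1.95, 1.91, 2.00, 1.95, 1.92, 1.87, 1.87, 1.87, 1.90]
V_MIN, V_MAX = 1.83, 1.96   # worst-case window from the tolerance analysis

outliers = [v for v in readings if not (V_MIN <= v <= V_MAX)]
print(f"{len(outliers)} of {len(readings)} parts outside the window: {outliers}")
```

Two parts (1.97V and 2.00V) land above the worst-case maximum.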
What is going on? Why are 20% of the parts out of spec in a sample of only 10? That suggests things will get much worse in high volume.