
High voltage measurement

Hi, I would like to accurately measure voltages in the range of ~200V at frequencies up to 1kHz. I have been trying to use voltage dividers made with resistors but, unfortunately, I haven’t been able to make one that provides reliable phase information at my maximum frequencies. I was wondering if there is an ADC able to read such voltages directly. Otherwise, can someone point me to a better solution?

Thank you

Regards,         

JMGL

  • Hi,

    I moved this discussion to the precision ADC community; someone here should be able to help.

    Regards,

    David

  • Hi, JMGL.

    We are looking into this. I'll get back to you as soon as possible.

    Thanks,

    Karen

  • What is "accurately" for you?

    To measure AC signals, you need to compensate your resistive divider, e.g. by placing capacitors in parallel with the high-voltage and low-voltage branches. The LV capacitor should be adjustable so you can trim out the stray capacitance from the high-voltage side. You can then adjust the compensation using the step response, just as you would with an oscilloscope probe divider.

    You should also use non-inductive resistors (e.g. SMDs).

    The most accurate AC voltage divider is a voltage transformer (PT) or "reference transformer".

    Best Regards,

    Dirk
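
A minimal sketch (Python, with assumed component values, not ones from this thread) of the compensation condition described above: the divider's phase error vanishes when R1·C1 = R2·C2, so the low-side trimmer is set to cancel the stray capacitance.

```python
import cmath
import math

def divider_tf(f, R1, C1, R2, C2):
    """Complex transfer function of an RC-compensated divider.

    Each branch is a resistor in parallel with a capacitor:
    Z = R / (1 + s*R*C), and H = Z2 / (Z1 + Z2).
    """
    s = 2j * math.pi * f
    Z1 = R1 / (1 + s * R1 * C1)
    Z2 = R2 / (1 + s * R2 * C2)
    return Z2 / (Z1 + Z2)

# Assumed example values: 2 MOhm over 100 kOhm, with ~10 pF of
# stray/load capacitance across the low-voltage branch.
R1, R2 = 2e6, 100e3
C_stray = 10e-12

# Uncompensated: no capacitor on the high side (C1 = 0)
H_unc = divider_tf(1000.0, R1, 0.0, R2, C_stray)

# Compensated: pick C1 so that R1*C1 == R2*C2
C1 = R2 * C_stray / R1
H_comp = divider_tf(1000.0, R1, C1, R2, C_stray)

print("uncompensated phase at 1 kHz:", math.degrees(cmath.phase(H_unc)), "deg")
print("compensated   phase at 1 kHz:", math.degrees(cmath.phase(H_comp)), "deg")
```

With R1·C1 = R2·C2 both branch impedances share the same time constant, so the ratio is purely resistive at every frequency and the phase error is zero.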

  • Hi, JMGL.

    I'd just like to clarify whether the 200V you’re trying to measure is differential or common-mode. Can you share more about your intended application so we can help you better? I have checked our precision ADC portfolio, and the highest input voltage range we have is the ADAS3022's ±24.576 V differential input range when using ±15 V supplies.

    Regards,

    Karen

  • Hi JMGL,

    Have you considered the AD7401A isolated sigma-delta modulator? You would still need the resistor string but the flexibility of your implementation of a digital filter after the modulator may be attractive.

    Rgds,

    Nicola.

  • Hello,

    Thank you all very much for your suggestions, and sorry for not having answered sooner. First of all, I will give a slightly longer introduction to my problem.

    We are driving a piezo-ceramic actuator with voltages within ±200V with respect to ground and currents below 0.2A. To do so we use a special amplifier for capacitive loads (A-303 from lab systems LTD) which cannot tolerate inductive loads (which is why I think I cannot use a transformer). The reason we need to measure the voltage and current is that we want to determine the power delivered to the actuator. Unfortunately, when the actuator is most efficient, e.g. at resonance, the phase between voltage and current is close to -90deg. This means that small errors in the measured phase between current and voltage can cause relatively large errors in the computed power.

    So, before I even knew that a voltage divider could produce a phase delay, I made my first voltage divider to reduce the voltage by a factor of about 21, and obtained really strange results. After some research I identified this phase delay as the cause of my problems. I found out which were the best resistors I could use for this purpose, made a second voltage divider and measured its transfer function, which I attach. As you can see, I have errors of around 0.08rad~4.5deg at 1kHz. I tried to calibrate this out, but the transfer function seems to vary with time (even though I got the most stable resistors I could find) and with the voltage applied to the divider (and I can only measure it up to 10V).

    It is difficult to know what the maximum tolerable phase delay is, but the phase between current and voltage at resonance seems to be of the same order as my delay, because when I correct for it I get positive or negative values of the average power depending on the voltage. So I guess I would be happy with about one order of magnitude less, let’s say 0.005rad at 1kHz.

    Sorry, but I am just surprised that measuring high voltages is that difficult. Anyway, if you have any idea, please, let me know.

    Thank you for your help!

    jmgl

    attachments.zip
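
To illustrate why the divider's phase error is so damaging near quadrature, here is a small sketch with assumed figures in the spirit of the post (200 V and 0.2 A amplitudes, an assumed true phase of -88 deg): the ~0.08 rad (~4.5 deg) error described above can even flip the sign of the computed average power.

```python
import math

def avg_power(V_amp, I_amp, theta_deg):
    """Average power of sinusoidal V and I with phase difference theta."""
    return 0.5 * V_amp * I_amp * math.cos(math.radians(theta_deg))

# Assumed operating point: near resonance the true phase sits
# close to the -90 deg quadrature, where cos() is steepest.
V, I, theta = 200.0, 0.2, -88.0
P_true = avg_power(V, I, theta)

# Apply the ~4.5 deg divider phase error in either direction:
P_minus = avg_power(V, I, theta - 4.5)   # crosses -90 deg: power goes negative
P_plus = avg_power(V, I, theta + 4.5)    # power is overestimated severalfold
print(P_true, P_minus, P_plus)
```

Since dP/dθ is maximal at -90 deg, a fixed phase error translates into the largest possible absolute power error exactly where the actuator is most efficient.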
  • Hi, JMGL.

    Apologies for the delayed response. Unfortunately, we do not have an ADC that would directly interface with your system. Besides the ADAS3022 I mentioned before, we do have products like the AD760x for power line monitoring applications and the AD7280A for battery monitoring applications, but again, their input voltage ranges are not compatible with your application.

    As for the discrete solution, Dirk made a good point about compensation that you could look into.

    I am moving this to the Interface and Isolation community. Someone here should be able to help you. Here's a recent thread that's also on high voltage measurement.

    Regards,

    Karen

    JMGL: Looking at your description of your application, I think it is important that the resistor divider you use has a high resistance so that it does not consume too much power and drift in resistance value. Following Nicola's suggestion to use the AD7401A, you could use a 1000:1 divider to take your +/-200V down to +/-200mV. With a 10M and a 10K divider you would reduce power consumption to about 4mW. You would also need to compensate with a capacitive divider, as Dirk described. Nicola could provide support on the AD7401A.

    Regards, Brian
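
A quick sanity check of the numbers above (10M over 10K string, ±200V input):

```python
# 10M / 10K divider as suggested above
R_top, R_bot = 10e6, 10e3
ratio = (R_top + R_bot) / R_bot          # ~1001:1
V_in = 200.0
V_out = V_in / ratio                     # ~0.2 V, within the AD7401A range
P_diss = V_in ** 2 / (R_top + R_bot)     # ~4 mW dissipated in the string
print(ratio, V_out, P_diss)
```

The exact ratio is 1001:1 rather than 1000:1; at these tolerances the difference is absorbed by calibration.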

  • Hello bkennedy

    Yes, that is a good point.

    At the moment I am using a voltage divider composed of 2x1MOhm and 100KOhm, giving a ratio of 21:1 at DC. I selected these values to make sure that my divider's impedance was much smaller than that of my DAQ (which claims >100MOhm). I could perhaps increase it a bit to reduce the power consumption. My main problem for now, though, is the phase delay that this divider produces at the frequencies I am measuring; power consumption is not a priority. I nevertheless appreciate your comments and will take them into account in the future.

    Regards

    JMGL      
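
For what it's worth, the measured 0.08 rad at 1 kHz is consistent with a plain capacitive load on the low side of that 2x1MOhm / 100KOhm divider. A sketch (the single-pole model and the resulting capacitance estimate are my assumption, not a measurement):

```python
import math

R1, R2 = 2e6, 100e3      # the 2x1 MOhm top and 100 kOhm bottom from the post
f = 1000.0               # frequency of interest, Hz
phase_err = 0.08         # measured phase error at 1 kHz, rad

# Model: uncompensated divider loaded by capacitance C at the output,
#   H(jw) = (R2/(R1+R2)) / (1 + jw*Rp*C),  Rp = R1 || R2,
# so |phase| = atan(w*Rp*C). Invert to estimate the load capacitance:
Rp = R1 * R2 / (R1 + R2)
C_load = math.tan(phase_err) / (2 * math.pi * f * Rp)
print(C_load * 1e12, "pF")
```

The result lands in the low hundreds of pF, which is plausible for a DAQ input plus cabling, and it is exactly what the capacitive compensation Dirk described is meant to cancel.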

  • Hello,

    Yes, I think I will have to look into compensating with capacitors. I have never done anything like that before, so I will have to spend some time learning about it. I guess the main problem I will encounter is that, as I said, the shift seems to depend on the voltage and, since I can only calibrate at ±10V, I will not be able to guarantee a maximum shift at the voltages I am operating at (±200V). At the end of the day this is the same problem I have now when I try to compensate via software calibration. Am I right?

    Regards,

    JMGL
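
For anyone following along, the step-response trimming mentioned earlier can be sketched as follows (component values are assumed for illustration): the divider output jumps to the capacitive ratio at t=0+ and settles to the resistive ratio, and the trimmer is adjusted until the two coincide.

```python
import math

def step_response(t, R1, C1, R2, C2, Vstep=1.0):
    """Output of an RC divider to a Vstep input step at t = 0.

    Jumps to the capacitive ratio C1/(C1+C2) at t = 0+,
    then settles to the resistive ratio R2/(R1+R2).
    """
    v_inf = R2 / (R1 + R2)               # final (resistive) value
    v_0 = C1 / (C1 + C2)                 # initial (capacitive) value
    tau = (R1 * R2 / (R1 + R2)) * (C1 + C2)
    return Vstep * (v_inf + (v_0 - v_inf) * math.exp(-t / tau))

R1, R2, C2 = 2e6, 100e3, 100e-12         # assumed values
C_match = R2 * C2 / R1                   # R1*C1 = R2*C2 condition: 5 pF here

for label, C1 in [("under", 0.5 * C_match),
                  ("matched", C_match),
                  ("over", 2.0 * C_match)]:
    print(label, step_response(0.0, R1, C1, R2, C2), R2 / (R1 + R2))
```

Under-compensation shows as a slow rise toward the final value, over-compensation as an overshoot that decays; when matched, the step lands flat. Because the trim is done on the shape of a low-voltage step, it does not by itself address the voltage dependence you describe, which would need voltage-stable (e.g. C0G/NP0) capacitors.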