I am currently working on the calibration of an ADE7880 (using the EVAL-ADE7880EBZ evaluation board), following application note AN-1171. Here is the procedure I carried out:
1. First of all, I used a precision source (Omicron CMC 256plus) to generate accurate currents and voltages.
2. Using the AN-1171 formula, I calculated BIGAIN and CIGAIN and updated these values.
3. I calculated the [I/LSB] conversion value by dividing 5 Arms by the decimal value read from the ADE7880 (on phase A).
4. Using this conversion value, I converted the AIRMS readings to Arms for different currents across the range (0.1/0.5/1/3/5 Arms) to check whether my accuracy approached the expected 0.1%.
5. The resulting accuracy is very poor. On phase C, for example: 5.8% at 0.1 Arms, 1.4% at 0.5 Arms, 0.75% at 1 Arms, 0.16% at 3 Arms and 0.01% at 5 Arms.
6. I performed exactly the same procedure for voltage, generating 230 Vrms and then checking across the range (10/50/110/230/300 Vrms); the accuracy there is also insufficient.
To be noted:
- I use current transformers (CR8410-100, Ir = 20 A, ratio = 1000) to measure the current.
- The current source (CMC 256plus) is accurate to within 0.1% over the entire range, so I understand that I cannot reach the 0.1% accuracy specified for the ADE7880 over the whole current and voltage range, since the source itself is not precise enough. But I am still very far from that at the moment.
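For reference, steps 2-4 can be sketched numerically. This is a minimal Python sketch assuming the common ADE7880 convention that a gain register holds (expected/measured − 1) scaled by 2^23 as a signed 24-bit value; check the exact expected-code derivation against AN-1171 before relying on it.

```python
# Sketch of steps 2-4, assuming the usual ADE7880 convention that a gain
# register holds (expected/measured - 1) * 2^23 as a signed 24-bit value.
# The exact expected-code figure comes from AN-1171.

def gain_register(expected_code, measured_code):
    """xIGAIN/xVGAIN candidate value as a 24-bit two's-complement integer."""
    val = round((expected_code / measured_code - 1) * 2**23)
    return val & 0xFFFFFF  # 24-bit two's-complement encoding

def lsb_per_amp(reference_arms, measured_code):
    """Step 3: Arms-per-LSB conversion constant from the 5 Arms reference."""
    return reference_arms / measured_code

def error_pct(applied_arms, code, k):
    """Step 4: percentage error of a converted reading vs the applied current."""
    return (code * k - applied_arms) / applied_arms * 100

# Example with an AIRMS code reported later in the thread at 5 Arms:
k = lsb_per_amp(5.0, 0x18BDAC)
print(f"conversion: {k:.4e} Arms/LSB")
```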
Can you tell me if my procedure is correct? How can I improve these results?
Thank you in advance for your quick help and best regards,
This is either the CT or the source. Yes, at PF = 1 angle errors are less of an issue than at PF = 0.5; this is why we suggest doing the phase calibration at 0.5.
What are your IRMS and VRMS hex values with 5A and 220V ?
Is the PGA gain set to 1?
What burden are you using? Are you using 2 resistors for a differential burden?
On the ADE7880 eval board, is there a single 1 Meg resistor or a couple of 500 k resistors?
This looks like an offset issue, which means you either have too much noise or crosstalk from the voltage channel.
If you look at IRMS at 0.1 A and vary the voltage, does the error on the IRMS measurement change?
"What are your IRMS and VRMS hex values with 5A and 220V?" ->

With xIGAIN at 0x00, here are my values for IRMS:
- 0x18BDAC (AIRMS)
- 0x18EF8D (BIRMS)
- 0x18955A (CIRMS)

With xVGAIN at 0x00, here are my values for VRMS (220 V):
- 0x25B0AF (AVRMS)
- 0x259E21 (BVRMS)
- 0x259FDE (CVRMS)
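As a quick sanity check on the raw codes above, the phase-to-phase spread can be computed before any gain calibration; a spread well under 1%, as here, is normal and is exactly what xIGAIN/xVGAIN are meant to trim out. A small Python sketch:

```python
# Phase-to-phase spread in the raw (uncalibrated) RMS codes quoted above.
irms = {"A": 0x18BDAC, "B": 0x18EF8D, "C": 0x18955A}  # 5 Arms applied
vrms = {"A": 0x25B0AF, "B": 0x259E21, "C": 0x259FDE}  # 220 Vrms applied

for name, codes in (("IRMS", irms), ("VRMS", vrms)):
    mean = sum(codes.values()) / 3
    for phase, code in codes.items():
        print(f"{name} {phase}: {(code - mean) / mean * 100:+.2f}% vs mean")
```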
Yes it is.
I use a 30 Ohm burden (split into two 15 Ohm resistors for a differential burden), which in theory gives a maximum full-scale current of 11.93 Arms.
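For what it's worth, that full-scale figure can be cross-checked from the CT ratio and burden, assuming the ADE7880's 0.5 V peak full-scale input per the datasheet:

```python
import math

# Cross-check of the theoretical full scale; CT_RATIO and BURDEN_OHMS are
# from the setup described above, FS_PEAK_V is the assumed ADC full scale.
CT_RATIO = 1000      # CR8410-100, 1000:1
BURDEN_OHMS = 30.0   # two 15 Ohm resistors, differential
FS_PEAK_V = 0.5      # assumed ADC full-scale, volts peak

fs_arms = (FS_PEAK_V / math.sqrt(2)) / BURDEN_OHMS * CT_RATIO
print(f"theoretical full scale: {fs_arms:.2f} Arms")
```

This gives about 11.79 Arms, close to the 11.93 quoted; the small difference comes down to the exact full-scale voltage assumed.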
There is a single resistor of 1 MOhm.
The measured value changes slightly if I change the voltage.
For example:
- With 0.1 A and 1 V, I read the value 30132d (average)
- With 0.1 A and 10 V, I read the value 30353d (average)
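To put a number on that shift: the current is the same 0.1 A in both readings, so any change in the IRMS code points to crosstalk from the voltage channel or to noise, not to the current itself. A quick calculation:

```python
# Same 0.1 A in both readings, only the voltage changed, so the shift in the
# IRMS code is attributable to crosstalk or noise rather than the current.
code_at_1v, code_at_10v = 30132, 30353   # averaged decimal readings from above
shift_pct = (code_at_10v - code_at_1v) / code_at_1v * 100
print(f"IRMS shift with voltage: {shift_pct:+.2f}%")  # about +0.73%
```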
Do you have any advice on how to solve the problem? I am not sure I understand what you mean by "cross talk from the voltage channel"?
Thank you a lot !
For the voltage divider: in the field the voltage typically only varies by 10%, so this is not usually an issue, but we have found that a single 1 Meg resistor is not the best choice for performance. The capacitance between the pads of the 1 Meg resistor creates leakage that is proportional to the voltage input and causes non-linearity. Short the 1 Meg resistor and add 2 x 500 k or 3 x 330 k in series as shown, and see if VRMS improves.
As for the current channel: please verify the current (0.1, 0.3, 1, 3, 5 A) using a handheld meter set to current. Does the measured current match the supplied current?
One possibility is that your CT is non-linear. I noticed there is no indication of the CT's linearity on the website. Try a different manufacturer.
The IRMS and VRMS offset registers can be used to correct the error, but you should try to minimize the error first.
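The datasheet gives the offset relation as IRMS = sqrt(IRMS0² + 128 × xIRMSOS), so a two-point calculation (one low-current and one high-current test point) can solve for the offset directly. A Python sketch, assuming that relation:

```python
# Two-point RMS-offset calculation, assuming the datasheet relation
# IRMS = sqrt(IRMS0^2 + 128 * xIRMSOS). Solves for the offset that makes
# the corrected code proportional to current at both test points.

def rms_offset(i1, code1, i2, code2):
    """xIRMSOS from a low point (i1, code1) and a high point (i2, code2)."""
    return round((i1**2 * code2**2 - i2**2 * code1**2) /
                 (128 * (i2**2 - i1**2)))
```

For illustration, `rms_offset(0.1, 30132, 5.0, 0x18BDAC)` would use the codes quoted earlier in the thread, assuming those two readings came from the same phase and gain settings.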
I tested with 3 x 330 k resistors in series, and the result is clearly better for voltage accuracy!
I get the following accuracy for phase A:
10 V: 0.9%
50 V: 0.13%
110 V: 0.14%
200 V: 0.02%
230 V: 0%
250 V: 0.02%
300 V: 0.05%
After modifying the AVRMSOS register, I get the following:
10 V: 0%
50 V: 0.12%
110 V: 0.15%
200 V: 0.02%
230 V: 0.0%
250 V: 0.02%
300 V: 0.05%
I measure voltages with a multimeter with an accuracy of at least 0.1%.
So I have the following questions:
I checked the RMS currents, they match what I inject. I will order other CTs and see if I get better results. I will get back to you after this.
Good to see this fixed the VRMS issue. The issue now is noise. In a perfect work with low noise .1% is possible. Your source might be noisy at 10V and 50V this can be due to auto ranging in the source. Try using the FFT in the eval software to see the noise from the source at your test levels.
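For anyone without the eval software to hand, the same FFT check can be approximated on an exported waveform buffer. This is a generic numpy sketch (the 8000 SPS rate matches the ADE7880 waveform sampling rate; the signal here is fabricated, a 50 Hz tone plus noise, just to show the computation):

```python
import numpy as np

# Window the captured buffer, take the FFT, and compare the fundamental
# against the median noise floor away from the mains frequency.
fs = 8000
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 50 * t) + 0.001 * rng.standard_normal(fs)  # stand-in

spec = np.abs(np.fft.rfft(x * np.hanning(len(x)))) / len(x)
freqs = np.fft.rfftfreq(len(x), 1 / fs)
fund = spec[np.argmin(np.abs(freqs - 50))]
noise_floor = np.median(spec[freqs > 100])
print(f"fundamental-to-noise ratio: {20 * np.log10(fund / noise_floor):.0f} dB")
```

A noisy source at a given test level would show up as an elevated noise floor or extra spectral lines around the fundamental.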
The ferrites carry very low current and should not cause many issues. They are only needed for system-level EMC and are only used if there is an issue. Shorting them is fine.
By calibrating correctly (xIGAIN/xIRMSOS) for the RMS current measurement, I was able to obtain better results, for example:
0.1 A: -0.83%
0.5 A: -0.56%
1 A: -0.07%
1.2 A: 0.0%
2 A: 0.22%
3 A: 0.33%
5 A: 0.48%
I also checked the linearity of my CTs to see whether the current-to-voltage conversion was linear over the range. Here are the results I get if, for example, I take the conversion at 1.2 A as nominal:
0.1 A: -1.76%
0.5 A: -0.21%
1 A: 0.014%
1.2 A: 0.0%
2 A: 0.269%
3 A: 0.373%
By then subtracting the error of my CTs from the measurement error I obtain the following results:
0.1 A: -0.93%
0.5 A: -0.35%
1 A: 0.084%
1.2 A: 0.0%
2 A: 0.049%
3 A: 0.047%
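The subtraction above is straightforward to reproduce; note that this is a first-order separation (it assumes the two error sources simply add), and the sign of each residual depends on the subtraction convention (meter error minus CT error, as here, or the reverse):

```python
# Residual ADE7880 error after removing the CT's own linearity error,
# using the percentage figures quoted above.
meter_err = {0.1: -0.83, 0.5: -0.56, 1.0: -0.07, 1.2: 0.0, 2.0: 0.22, 3.0: 0.33}
ct_err    = {0.1: -1.76, 0.5: -0.21, 1.0: 0.014, 1.2: 0.0, 2.0: 0.269, 3.0: 0.373}

for amps in meter_err:
    print(f"{amps} A: {meter_err[amps] - ct_err[amps]:+.3f}%")
```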
Do you agree with these hypotheses? Do they make sense?
Thanks for your feedback,
Yes I agree.
I have one more question. I am currently using an Omicron CMC 256plus precision source to calibrate my ADE7880, which lets me generate currents on 3 phases (between 0 and 12.5 Arms) and, at the same time, voltages on 3 phases (between 0 and 300 Vrms).
I can control the power factor by adjusting the phase shift between the two outputs of the source (current and voltage). In this way I can easily obtain a PF of 0, 0.5 or 1.
Is the technique correct?
I have also noticed something strange: the phase shift between voltage and current changes according to the current I apply (for example, I have a certain phase shift at 230 V/1 A and a larger one at 230 V/10 A). This completely distorts the accuracy of my measurements and makes the phase calibration wrong depending on the current flowing.
I have noticed that this effect is smaller at PF = 1 and very large at PF = 0.5.
Could this be due to the fact that my current transformers are not good enough? Or could it be because of my accuracy source?
I would rather say it comes from my current transformers, but I would like to have your expert opinion.
I did the phase calibration at PF = 0.5.
What I find strange is that when I then check my calibration, I get really good results for energy measurement at PF = 0 and PF = 1.
On the other hand, when I use PF = 0.5, I get very poor accuracy (0.3% to 5% depending on where I am in the range). I don't understand why the results are so much worse at PF = 0.5 compared to PF = 0 and PF = 1.
"Yes at pf1 angle errrors are less of an issue than 0.5" -> I think it has something to do with that, but I don't understand why?