I am trying to select a current transformer / burden resistor combination that can measure a 50 A, 240 V AC load. My first concern is the voltage level coming into the meter chip: we would like as high an amplitude as possible so the signal is more immune to noise, since we have a long lead to the CT.

I was told by the EE who designed the board that the chip would handle 1 V peak-to-peak (±0.5 V total). But when I read the datasheet it states: "These inputs are fully differential voltage inputs with maximum differential input signal levels of ±0.5 V... The maximum signal level at these pins with respect to AGND is ±0.5 V." I understand this to mean the ADC should not work above half a volt of difference measured between V1P and V1N (both positive and negative), because the ADC would be out of range, and that each pin must also remain within ±0.5 V of AGND.

The problem is that it works just fine when we select a burden resistor that drives the input up toward 1.000 V. The attached image shows a scope capture from AGND to V1P (CH1, yellow) and AGND to V1N (CH2, blue). With the differential voltage anywhere from ±0.440 V to ±0.900 V, the ADE7763 remains very accurate (about 0.2%, measured by admittedly imprecise comparison), but at 1.050 V it suddenly becomes 10% off, which makes me think it is clipping the peaks due to the limits of the ADC.

Can we expect accurate results at 0.999 V, meaning my EE is reading the specification correctly, or should there be no expectation of accuracy above 0.500 V, as I read the spec? Does something else come into play here? Our GAIN register (0x0F) is set to 0.
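For reference, this is roughly how I have been sizing the burden resistor: a quick Python sketch that converts the line current through the CT ratio to a peak secondary current, then solves for the largest burden that keeps the peak at the channel inputs within a given limit. The 2000:1 CT ratio here is just a placeholder (our actual ratio differs), and `ADC_MAX_PEAK` assumes the datasheet's ±0.5 V figure is a hard peak limit, which is exactly the point in question.

```python
import math

# Placeholder values -- the actual CT ratio is not the point of the question.
LINE_CURRENT_RMS = 50.0   # A, the load we need to measure
CT_RATIO = 2000.0         # hypothetical 2000:1 current transformer
ADC_MAX_PEAK = 0.5        # V, reading the ADE7763 spec as a peak limit

def peak_input_voltage(burden_ohms: float) -> float:
    """Peak differential voltage at the channel inputs for a given burden."""
    i_secondary_rms = LINE_CURRENT_RMS / CT_RATIO
    return i_secondary_rms * math.sqrt(2) * burden_ohms

def max_burden() -> float:
    """Largest burden resistor that keeps the peak inside ADC_MAX_PEAK."""
    i_secondary_peak = (LINE_CURRENT_RMS / CT_RATIO) * math.sqrt(2)
    return ADC_MAX_PEAK / i_secondary_peak

if __name__ == "__main__":
    r = max_burden()
    print(f"Max burden: {r:.2f} ohm")
    print(f"Peak at that burden: {peak_input_voltage(r):.3f} V")
```

With these placeholder numbers the secondary peak is about 35.4 mA, so the burden comes out around 14 ohms; if the ±0.5 V limit is really a 1 V peak-to-peak allowance as my EE reads it, the burden could be nearly twice that, which is why the distinction matters for our noise margin.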
Thank you in advance for your comments and wisdom.