I'm designing a meter based on the ADE7753. I'm using a phase-accurate voltage transformer for isolation on the voltage channel and a Rogowski coil for current sensing (which conveniently provides the needed isolation as well). Based on input from this forum in an earlier thread, I expect to set the current channel full-scale value to 0.5 V and the PGA GAIN to 4 or 8.
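To sanity-check those settings, here is a rough sketch of the headroom arithmetic, assuming the ±0.5 V peak analog input maximum from the datasheet and the standard PGA gain steps; the helper name and printed values are illustrative, not part of any calibration procedure:

```python
# Quick headroom check for the ADE7753 current channel (assumed numbers).
# The analog input maximum is +/-0.5 V peak; since the PGA sits ahead of
# the ADC, the sensor signal times the gain must stay within that limit.

FULL_SCALE_V = 0.5  # selected channel 1 full-scale, volts peak

def max_sensor_peak(gain: int) -> float:
    """Largest allowable sensor peak (volts) for a given PGA gain setting."""
    if gain not in (1, 2, 4, 8, 16):
        raise ValueError("ADE7753 PGA gains are 1, 2, 4, 8, 16")
    return FULL_SCALE_V / gain

print(max_sensor_peak(4))  # -> 0.125  (125 mV peak from the Rogowski chain)
print(max_sensor_peak(8))  # -> 0.0625 (62.5 mV peak)
```

In other words, with GAIN = 8 the conditioned Rogowski output has to stay under roughly 62.5 mV peak at the maximum expected current.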
I've read the datasheet extensively, along with several app notes, including some not specifically for this part. AN-559, for example, shows a design based on an older part that appears to offer only a limited range of ratios for adjusting CF to a given LSB/Wh. It works around this with a calibration resistor network that trims the voltage sense signal level to the calculated value. The resulting level just so happens to be 248 mV, near the middle of the input range -- a number that is also consistently recommended for every part, including the ADE7753.
But it seems to me the ADE7753 is designed so that the sensor output voltage levels are more or less independent of the calibration process. As long as the levels are set appropriately for the expected voltage and current ranges (and thus guaranteed never to exceed the ±500 mV peak maximum), and the current channel PGA is set to suit the transducer in use, calibration should not require further adjustment of the current and voltage sense signal levels. In the case of the CF output, for example, the CFNUM / CFDEN / WDIV registers appear to provide reasonably fine-grained control of the CF frequency.
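To illustrate the granularity I have in mind, here is a minimal sketch using the datasheet relation CF_out = CF_in × (CFNUM + 1) / (CFDEN + 1), where both registers are 12-bit; the nominal CF frequency and target in the example are assumed numbers, not figures from my design:

```python
# Sketch of CF scaling granularity on the ADE7753.
# Datasheet relation: CF_out = CF_in * (CFNUM + 1) / (CFDEN + 1),
# with CFNUM and CFDEN as 12-bit registers and the ratio kept below 1.

def cf_divider(cf_nominal_hz: float, cf_target_hz: float, cfnum: int = 0) -> int:
    """Pick CFDEN for a desired CF output frequency, holding CFNUM fixed."""
    if not 0 <= cfnum <= 0xFFF:
        raise ValueError("CFNUM is a 12-bit register")
    cfden = round(cf_nominal_hz * (cfnum + 1) / cf_target_hz) - 1
    if not 0 <= cfden <= 0xFFF:
        raise ValueError("target out of range for a 12-bit CFDEN")
    return cfden

# Example: scale an assumed 960 Hz nominal CF down to 3.2 Hz
# (e.g. toward some imp/kWh meter constant at an assumed load).
cfden = cf_divider(960.0, 3.2)       # -> 299
achieved = 960.0 / (cfden + 1)       # -> 3.2 Hz
```

With 12 bits in each register, the achievable ratios seem plenty fine-grained to hit a meter constant without touching the analog front end.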
Am I correct in assuming that I can "set and forget" the current and voltage sense input signal levels in hardware and then perform all needed corrections in software? Or should I bother to integrate some convenient means of adjusting the voltage sense input level via a resistor calibration network or a quality pot?