I'm attempting to measure the true 'average' value of a time-varying signal over a variable time interval. The input signal varies between 0 V and 2.5 V and may contain components from near DC up to a fixed frequency of a few hertz. The time interval of interest ranges from 0.25 seconds to 10 seconds. I am using an AD7741 V/F converter with a 6.14 MHz crystal to create an output frequency, and a 32-bit counter to count pulses over the time period of interest. The conversion of the counts to the 'average' input is straightforward. The kicker is that the resultant value must be accurate to 0.005%. To achieve this for a DC input of 2.5 V, the output frequency (about 2.5 MHz) has to be constant to within 125 Hz over the count time. If the input is 0 V, the output frequency (about 500 kHz) has to be constant to within 30 Hz. Unfortunately, the frequency variability I see is more than double those limits.
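For reference, here is a minimal sketch of the counts-to-average conversion and the stability requirement described above. The endpoint frequencies are taken from the approximate figures in my description (about 500 kHz at 0 V and about 2.5 MHz at 2.5 V); the exact values would come from the AD7741 datasheet transfer function and the actual crystal frequency, so treat these constants as placeholders.

```python
# Placeholder endpoints from the question, not datasheet-exact values.
F_ZERO = 500e3    # output frequency at 0 V input, Hz (approximate)
F_FULL = 2.5e6    # output frequency at 2.5 V input, Hz (approximate)
VREF   = 2.5      # full-scale input, V

def average_voltage(counts, gate_s):
    """Mean input voltage over the gate, assuming a linear V/F law."""
    f_avg = counts / gate_s  # mean output frequency over the gate, Hz
    return (f_avg - F_ZERO) / (F_FULL - F_ZERO) * VREF

def required_stability_hz(f_out, tolerance=0.005e-2):
    """Frequency stability needed for a given fractional accuracy."""
    return f_out * tolerance
```

With a 1 s gate, `required_stability_hz(2.5e6)` reproduces the 125 Hz figure quoted above.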
For test purposes I use a 5 V supply that is stable to 0.1 mV. The test input signal is derived from the 5 V supply through a multi-turn potentiometer. The count time of 1 second is derived from a separate count-down timer clocked at 100 MHz. Is there a way to increase the stability of the output frequency, or can you suggest a more accurate circuit/component that will meet the specs above?
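To help rule out the timebase and the counter as the source of the drift, a rough error-budget sketch may be useful. The ±1-clock gate jitter and ±1-count quantization terms below are simplifying assumptions, not measured values:

```python
# Rough error-budget sketch for the test setup described above,
# checking whether the gate timebase or counter quantization could
# account for the observed variability.
GATE_S   = 1.0      # gate time, s (from the test setup)
TIMEBASE = 100e6    # gate derived from a 100 MHz clock, Hz
F_OUT    = 2.5e6    # V/F output near full scale, Hz

# One timebase clock of gate-edge error, as a fraction of the gate.
gate_jitter_frac = (1 / TIMEBASE) / GATE_S
# +/-1 count of quantization in the pulse count, as a fraction.
count_quant_frac = 1 / (F_OUT * GATE_S)

print(f"gate jitter: {gate_jitter_frac:.1e}")
print(f"count quant: {count_quant_frac:.1e}")
```

Both terms come out several orders of magnitude below the 5e-5 (0.005%) target, which points back at the V/F converter or its reference/clock rather than the gate.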