# LTC1968 Output Noise

Hello,

I can't make sense of the output noise specs of the LTC1968 RMS-to-DC converter.

The datasheet (graph 1967 G18 and 1968 G19) shows something called "PEAK OUTPUT NOISE".

However, I can't figure out how to use this information: what is the frequency of this noise? What happens at higher frequencies (>1 MHz)? Does the noise just keep rising, or does it peak at 1 MHz (half the sampling frequency), decrease until 2 MHz, and then repeat the same pattern every 2 MHz?

1967 G18 says "PEAK NOISE MEASURED IN 10 SECOND PERIOD", and then it shows different curves for different averaging capacitor values.

How was this measured? Does this mean that there will be a ripple with a period measured in seconds?
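
For what it's worth, my current reading of "peak noise measured in 10 second period" is simply the largest excursion of the (nominally DC) output seen while watching it for 10 s. A toy sketch of that measurement below — the sample rate and noise level are entirely made-up numbers, not datasheet values:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1000            # readout sample rate in Hz (assumed, not from the datasheet)
t_obs = 10.0         # the 10 s observation window the graph note mentions
n = int(fs * t_obs)

# Pretend the DC output is 1 V with 1 mV RMS of Gaussian noise on it (made-up)
v_out = 1.0 + 1e-3 * rng.standard_normal(n)

peak_noise = v_out.max() - v_out.min()   # peak-to-peak excursion over 10 s
print(peak_noise)
```

The longer you watch, the further into the tails of the noise distribution you sample, which would explain why the spec has to state an observation time at all.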

I tried to simulate it in LTspice, but I have not seen any output noise comparable to the datasheet values, even when letting the simulation run for 10 seconds of simulated time and analyzing the data in MATLAB.

I tried to work out how the part operates internally to get a better feel for it, but without much luck.

In the datasheet, for example, it says that "LTC1968 RMS calculation is inherently wideband, operating properly with minimal oversampling, or even undersampling, using several proprietary techniques to exploit the fact that the RMS value of an aliased signal is the same as the RMS value of the original signal."

But this doesn't sound true. Take a signal that is the sum of a 500 kHz and a 3.5 MHz sine wave, each with amplitude 1: the RMS of the original signal is 1. Sampled at 2 MHz, however, the 3.5 MHz component aliases onto 500 kHz, and if the two components add in phase the RMS of the aliased signal is (1 + 1)/sqrt(2) = sqrt(2).
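
To make the objection concrete, here is a quick numerical check (my own sketch, nothing from the datasheet). With cosine phases the 3.5 MHz tone folds exactly onto the 500 kHz tone and they add coherently:

```python
import numpy as np

fs = 2e6                       # sampling rate: 2 MHz
n = 1 << 16                    # a whole number of 500 kHz periods
t = np.arange(n) / fs

# 3.5 MHz sampled at 2 MHz aliases to |3.5 - 2*2| MHz = 500 kHz
x = np.cos(2 * np.pi * 500e3 * t) + np.cos(2 * np.pi * 3.5e6 * t)

rms_sampled = np.sqrt(np.mean(x ** 2))   # -> sqrt(2), not 1
rms_true = 1.0                           # each cosine contributes 1/2 to the
                                         # mean square, so continuous RMS = 1
print(rms_sampled, rms_true)
```

With sine phases the two aliased tones cancel instead, so the sampled RMS can land anywhere between 0 and sqrt(2) depending on relative phase; either way it is not the true RMS of 1.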
I also didn't manage to understand how the multiplication with the ±1 buffer works. As I understand it, the average of the signal at the buffer's output is the product of the input signal and the average duty cycle of the modulator's output only if the frequency of the modulator's output (at most the sampling frequency) is much higher than the input signal's frequency.
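
My mental model of that ±1-buffer point, sketched numerically (a toy model of my own, not the actual LTC1968 internals): if the buffer output is the input times a ±1 stream whose duty cycle encodes a value, then low-pass filtering the product recovers that value times the input — but only while the stream toggles much faster than the input changes:

```python
import numpy as np

fs = 1_000_000                 # simulation rate in Hz (arbitrary)
n = 10_000
t = np.arange(n) / fs

x = np.sin(2 * np.pi * 1e3 * t)          # slow 1 kHz input

# ±1 stream at 100 kHz: 8 samples high, 2 low per period -> average 0.6
period = np.array([1] * 8 + [-1] * 2)
stream = np.tile(period, n // 10)

product = x * stream

# Moving average over exactly one stream period acts as the low-pass filter
filtered = np.convolve(product, np.ones(10) / 10, mode="valid")

# The filtered product tracks 0.6 * x because the stream toggles ~100x
# faster than the input changes
err = np.max(np.abs(filtered - 0.6 * x[4:len(filtered) + 4]))
print(err)
```

If the input frequency approached the stream rate, the averaging window would no longer see x as roughly constant and the relation would break down — which is exactly the condition I'm worried about.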

I want to detect excessive ripple on the output of a switching circuit whose switching frequency is 1.5 MHz. An exact measurement is not necessary, but I would like an idea of the measurement error so I can set the accept and reject bands for the measured ripple RMS. I have already built the circuit and only discovered this issue while summing up the error budget. The datasheet gives a -3 dB bandwidth of 15 MHz, so I thought this would work fine.
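
For the accept/reject bookkeeping itself, what I have in mind is plain guard-banding (my own sketch, with made-up numbers for the ripple limit and the total error):

```python
ripple_limit = 10e-3   # hypothetical pass/fail limit, V RMS
err = 0.05             # hypothetical total fractional measurement error (+/-5 %)

# A reading r can come from a true value anywhere in [r/(1+err), r/(1-err)],
# so tighten both thresholds by the error:
accept_below = ripple_limit * (1 - err)   # reading below this -> certainly good
reject_above = ripple_limit * (1 + err)   # reading above this -> certainly bad
print(accept_below, reject_above)         # readings in between are ambiguous
```

The gap between the two thresholds is the price of the measurement error, which is why I need a number for it.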

Right now I am a bit lost: I don't know which digital filter to implement because I don't know what this noise looks like.
There is already a 2nd-order Sallen-Key filter with a 4.5 kHz cut-off on the output of the LTC1968, after the 1 uF averaging capacitor, so I think the only noise I need to worry about is low-frequency noise, which I can handle in the microcontroller with oversampling and a digital filter.
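
The kind of digital filter I have in mind on the microcontroller side is just a single-pole IIR low-pass (exponential moving average); a sketch below, with an arbitrary alpha that I would retune once I know what the noise actually looks like:

```python
def ema_filter(samples, alpha=0.01):
    """Single-pole IIR low-pass: y[k] = y[k-1] + alpha * (x[k] - y[k-1]).

    For small alpha the -3 dB cutoff is roughly alpha * f_sample / (2 * pi);
    alpha = 0.01 is only a placeholder value.
    """
    y = samples[0]
    out = []
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

# DC passes through unchanged...
dc = ema_filter([2.0] * 100)[-1]
# ...while a full-scale alternating (Nyquist-rate) component is crushed
ac = max(abs(v) for v in ema_filter([1.0, -1.0] * 500)[500:])
print(dc, ac)
```

But whether this is enough depends on how low in frequency the aliased noise reaches, which brings me back to the original question.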

I need to know what I am trying to filter out though.

Reply
• Neither the LTC1967 nor the LTC1968 is the right choice to measure the RMS of a 1.5 MHz input.

These parts use a delta-sigma ADC in the generation of the RMS output, which makes them very accurate and linear for lower frequency inputs.

For higher input frequencies, noise aliases down to near DC and appears in the output.

There is nothing one can do about it except increase the output averaging capacitor, which will impact the settling time.

The AD637 is a traditional log/antilog precision RMS-to-DC converter and a better fit for MHz signals.
