Hi,

I am trying to measure the difference in frequency between two closely spaced 6 MHz signals. This is for a sensitive research measurement instrument. Is it better to mix or multiply to achieve highest freq. diff. resolution? Can you recommend an IC? Assume inputs are differential and can be made to any voltage level.

Thanks

Multiplying, which I will define as the linear multiplication of two signals, tends to be noisier than mixing. I'll define mixing as the multiplication of one signal by a hard-limited version of the other: imagine that you turn Signal B into a square wave and multiply that +1/-1 signal with Signal A. Because the "squared up" version of Signal B has lots of harmonics, those harmonics will also mix with Signal A, but with a bit of filtering you can always eliminate those unwanted products.
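To make the harmonic-products point concrete, here is a minimal numpy sketch (the 100 MS/s simulation rate and the 10 kHz offset between the tones are assumptions for illustration only): multiplying a 6.000 MHz tone by a squared-up 6.010 MHz tone produces the wanted 10 kHz difference product, plus extra products from the square wave's odd harmonics, e.g. at 3·f_B − f_A ≈ 12.03 MHz.

```python
import numpy as np

fs = 100e6                      # simulation sample rate (assumed)
n  = 2**17                      # ~763 Hz FFT bin spacing
t  = np.arange(n) / fs
f_a, f_b = 6.000e6, 6.010e6     # two closely spaced tones, 10 kHz apart (assumed)

a = np.sin(2 * np.pi * f_a * t)
b = np.sin(2 * np.pi * f_b * t)

# "Mixing": multiply Signal A by the squared-up (+1/-1) version of Signal B.
mixed = a * np.sign(b)

spec  = np.abs(np.fft.rfft(mixed * np.hanning(n)))
freqs = np.fft.rfftfreq(n, 1 / fs)

# The wanted low-frequency product sits at |f_a - f_b| = 10 kHz.
low = freqs < 1e6
f_diff = freqs[low][np.argmax(spec[low])]

# The square wave's 3rd harmonic mixes too: a product at 3*f_b - f_a = 12.03 MHz,
# just above the ordinary f_a + f_b sum product at 12.01 MHz.
band = (freqs > 12.02e6) & (freqs < 12.5e6)
f_h3 = freqs[band][np.argmax(spec[band])]

print(f_diff, f_h3)
```

With a linear multiplier in place of `np.sign(b)`, only the difference and sum products appear, which is the spectral-cleanliness argument for multiplication below.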

My sense is that linear multiplication is the best approach for your application, where you are interested in knowing the frequency difference and not so interested in things like signal-to-noise ratio.
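Along the same lines, here is a sketch of the linear-multiplication approach (again with assumed tone frequencies and sample rate, and an idealized noiseless simulation): multiply the two signals, low-pass away the ~12 MHz sum product, and read the frequency difference directly off the surviving beat.

```python
import numpy as np

fs = 100e6                      # simulation sample rate (assumed)
n  = 2**17
t  = np.arange(n) / fs
f_a, f_b = 6.000e6, 6.010e6     # assumed tones, 10 kHz apart

# Linear (four-quadrant) multiplication:
#   sin(a)*sin(b) = 0.5*cos(diff term) - 0.5*cos(sum term)
prod = np.sin(2 * np.pi * f_a * t) * np.sin(2 * np.pi * f_b * t)

# Low-pass the product: two passes of a 1024-tap moving average
# (~10 us window) crush the ~12 MHz sum term while barely
# attenuating the 10 kHz beat.
k = 1024
ma = np.ones(k) / k
lp = np.convolve(np.convolve(prod, ma, mode='valid'), ma, mode='valid')

# The frequency difference is the beat frequency, readable from
# zero crossings of the filtered product.
crossings = np.count_nonzero(np.diff(np.signbit(lp)))
f_diff = crossings * fs / (2 * lp.size)
print(f_diff)   # close to the 10 kHz offset
```

In hardware the same idea applies: an analog multiplier followed by a low-pass filter gives you a clean beat note whose frequency is exactly the difference you want to measure.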

Before I can recommend an IC, I need to know more than the nominal 6 MHz carrier: roughly how large the frequency difference is, and what resolution you need.