I have two sine-wave signals, say A and B. Both have the same frequency (possibly out of phase), anywhere between 5 MHz and 100 MHz, and amplitudes spanning a 60 dB range (1:1000).
However, at any given time, the difference in amplitude between signal A and signal B is less than 14 dB (roughly 5:1), and signal A is always greater than or equal to signal B.
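Just to make the arithmetic explicit, this is how I am converting between dB and amplitude ratios (a minimal sketch, using 20*log10 since these are voltage rather than power ratios):

```python
import math

def db_to_amplitude_ratio(db: float) -> float:
    """Convert a decibel value to a voltage (amplitude) ratio."""
    return 10 ** (db / 20)

print(db_to_amplitude_ratio(60))  # ~1000 -> the 1:1000 dynamic range
print(db_to_amplitude_ratio(14))  # ~5.01 -> the ~5:1 limit on A/B
```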
I need to determine the amplitude ratio with high accuracy, and I am considering the AD8302 gain/phase detector.
If I understand the datasheet correctly, the maximum error at 100 MHz over a magnitude ratio of 0 to 14 dB is around 0.1 dB (from TPC 3, page 6 of the datasheet).
I think this translates to about 0.3 dB of overall gain-measurement accuracy.
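For context, this is how I intend to turn the VMAG output into a gain reading, assuming I have read the datasheet's nominal transfer function correctly (30 mV/dB slope, centred at 900 mV for a 0 dB ratio); the linearity error would then sit on top of any slope/intercept tolerance:

```python
# Nominal AD8302 VMAG transfer function, per the datasheet:
# 30 mV/dB slope, 900 mV output when the two inputs are equal (0 dB).
AD8302_SLOPE_V_PER_DB = 0.030
AD8302_INTERCEPT_V = 0.900

def vmag_to_gain_db(vmag_volts: float) -> float:
    """Map a measured VMAG voltage to the A/B amplitude ratio in dB."""
    return (vmag_volts - AD8302_INTERCEPT_V) / AD8302_SLOPE_V_PER_DB

# Example: VMAG = 1.32 V -> (1.32 - 0.90) / 0.030 = 14 dB
print(vmag_to_gain_db(1.32))
```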
Firstly, is my analysis correct?
Secondly, could you recommend an alternative, more accurate way of achieving this measurement?
If I may, another question.
For the AD8306, the log-linearity curves show a sort of undulating (ripple) pattern across the input range.
Is this ripple repeatable, and can it therefore be calibrated out in post-processing?
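If it is repeatable, the correction I have in mind would look something like this (a hypothetical sketch: sweep a calibrated source, record the error at each level, then interpolate at run time; the arrays below are placeholders, not measured data):

```python
import numpy as np

# Input level (dBm) applied during calibration, and the recorded error
# (measured_dB - true_dB) at each point -- placeholder values only.
cal_levels_dbm = np.array([-80, -70, -60, -50, -40, -30, -20, -10])
cal_error_db = np.array([0.2, -0.1, 0.15, -0.2, 0.1, -0.15, 0.2, -0.1])

def correct_reading(measured_dbm: float) -> float:
    """Subtract the interpolated calibration error from a raw reading."""
    return measured_dbm - np.interp(measured_dbm, cal_levels_dbm, cal_error_db)
```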