I'm looking to calibrate the AD8364 power detector using a method similar to what is described in the conversation below, but because I don't have ADC access to OUTP and OUTN in the Alpha prototype design, I was simply going to use OUTB and OUTA measurements to derive the OUTP and OUTN voltages.

Normally this would probably be good enough, but there is another wrench in the mix: OUTB is adjusted for a slope of 100 mV/dB using external resistors, while OUTA is kept at the part's nominal 50 mV/dB. Is there a way to account for this slope delta in the equations and still use the OUTB and OUTA voltages to calibrate the detector for return loss/VSWR?

See previous post below:

https://ez.analog.com/docs/DOC-11868.

Thank you!

I think that the easiest thing to do is to use the equations to convert both voltages to power. So starting with

Vout = SLOPE x (Pin - Intercept)

we re-arrange the equation to get

Pin = (Vout/SLOPE) + Intercept

With two channels, you then have two equations. Because each channel uses its own SLOPE and Intercept, the 100 mV/dB vs. 50 mV/dB slope difference is accounted for automatically:

PinA = (VoutA/SLOPE_A) + InterceptA

PinB = (VoutB/SLOPE_B) + InterceptB

Return Loss = PinA - PinB
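The procedure above can be sketched in a few lines of Python. This is a hypothetical illustration, not ADI-supplied code: the slope/intercept values and voltages below are made up, and in practice each channel's SLOPE and Intercept would come from a two-point calibration at known input powers.

```python
def calibrate(p_lo_dbm, v_lo, p_hi_dbm, v_hi):
    """Derive (slope, intercept) for one channel from two calibration points.

    Vout = SLOPE * (Pin - Intercept), so:
      SLOPE     = (Vhi - Vlo) / (Phi - Plo)   [V/dB]
      Intercept = Plo - Vlo / SLOPE           [dBm]
    """
    slope = (v_hi - v_lo) / (p_hi_dbm - p_lo_dbm)
    intercept = p_lo_dbm - v_lo / slope
    return slope, intercept

def v_to_power(vout, slope, intercept):
    """Pin = (Vout / SLOPE) + Intercept, in dBm."""
    return vout / slope + intercept

# Hypothetical per-channel calibration results:
# OUTA at the nominal 50 mV/dB, OUTB rescaled to 100 mV/dB.
slope_a, icpt_a = 0.050, -60.0
slope_b, icpt_b = 0.100, -60.0

# Hypothetical measured output voltages (volts).
pin_a = v_to_power(1.20, slope_a, icpt_a)  # forward power, dBm
pin_b = v_to_power(1.60, slope_b, icpt_b)  # reflected power, dBm

return_loss_db = pin_a - pin_b
```

Because the conversion to power divides each voltage by its own channel's slope, OUTA and OUTB can use different mV/dB scalings and the return-loss subtraction still works in dB.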