
ADL5380 input power levels

Question asked by Feit on Jul 15, 2014
Latest reply on Jul 16, 2014 by bsam

See questions at the bottom of post regarding the response below

The data sheet states that the buffered baseband outputs are capable of driving a 2 V p-p differential signal into a 200 ohm load. My question is: what input power level at the RF input does a 2 V p-p output correspond to? If I start out at 0 dBm at the RF input and apply 7 dB of gain, I will see +7 dBm at the I and Q outputs combined, or +4 dBm at each output. +4 dBm converts to 2.0044 volts p-p across 200 ohms. Is that correct? Does 2 V p-p on each output (I and Q) correspond to 0 dBm at the input?


Next question relates to IP1dB, which is spec’ed at 11.6 dBm. What voltage (across 200 ohms) comes out of the part with +11.6 dBm applied to the RF input?
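For reference, the dBm-to-Vpp conversion used in the question above can be sketched in a few lines of Python. This is just a sanity check of the arithmetic, assuming a sinusoidal signal (so Vpp = 2·√2·Vrms):

```python
import math

def dbm_to_vpp(p_dbm, r_ohms):
    """Convert a power level in dBm to peak-to-peak volts across r_ohms,
    assuming a sinusoid (Vpp = 2*sqrt(2)*Vrms)."""
    p_watts = 10 ** (p_dbm / 10) * 1e-3
    v_rms = math.sqrt(p_watts * r_ohms)
    return 2 * math.sqrt(2) * v_rms

# +4 dBm across 200 ohms, as in the question above
print(round(dbm_to_vpp(4.0, 200), 3))  # -> 2.005
```

This agrees with the ~2.0 Vpp figure quoted in the question.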









Hi, Feit,


      First, some clarifications: the gain shown on the ADL5380 data sheet is "Voltage Conversion Gain", i.e., the voltage ratio between EACH I or Q output pair (taken differentially) and the input voltage. I would strongly recommend using voltages (not power) when talking about the baseband outputs, as the impedance environment is ambiguous. While we specify the load to be 450 or 200 ohms differentially, the differential output impedance is actually 50 ohms. A load impedance of 200 ohms or higher is recommended for lower distortion.


    When referring to power, RF engineers often mean the power available in a matched condition, but a 50-ohm matched load at the baseband outputs is not recommended for distortion reasons. The power-to-voltage conversion you quoted between +4 dBm and 2 Vpp is correct across a 200-ohm load, but quoting power can be confusing for the baseband outputs. Often the ADL5380 is used to drive a resistor-terminated ADC input, and the voltage across the load is the parameter of interest. Back to your example: for the 11.6 dBm (2.4 Vpp into 50 ohms) input P1dB, the nominal 7 dB (voltage) conversion gain drops to 6 dB, and the output is 2.4 Vpp * 10^(6/20), or 4.8 Vpp into 450 ohms (the specified load). Of course, at this level the part is in compression and the outputs are heavily distorted.
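The P1dB arithmetic above can be checked numerically. A minimal Python sketch, assuming a sinusoidal signal so that Vpp = 2·√2·Vrms, and modeling 1 dB of gain compression at the input P1dB point:

```python
import math

def dbm_to_vpp(p_dbm, r_ohms):
    # sinusoid: Vpp = 2*sqrt(2) * Vrms
    return 2 * math.sqrt(2) * math.sqrt(10 ** (p_dbm / 10) * 1e-3 * r_ohms)

v_in_pp = dbm_to_vpp(11.6, 50)            # input P1dB into the 50-ohm RF port
gain_db = 7 - 1                           # nominal 7 dB voltage gain, compressed by 1 dB
v_out_pp = v_in_pp * 10 ** (gain_db / 20)
print(f"{v_in_pp:.2f} Vpp in -> {v_out_pp:.2f} Vpp out")  # -> 2.40 Vpp in -> 4.80 Vpp out
```

This reproduces the 2.4 Vpp input and ~4.8 Vpp output figures quoted in the reply.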


    Working backwards, a 2Vpp output at 7 dB of voltage gain corresponds to 2/10^(7/20), or 0.893 Vpp  at the input, i.e., 2 dBm, as the input impedance is 50 ohm.
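Running the same "working backwards" arithmetic in Python (sinusoid assumed) reproduces the 0.893 Vpp figure, but converting that voltage to power across the 50-ohm input comes out near +3 dBm rather than +2 dBm; this is the discrepancy raised in question 1 below:

```python
import math

def vpp_to_dbm(v_pp, r_ohms):
    """Peak-to-peak volts across r_ohms -> power in dBm (sinusoid assumed)."""
    v_rms = v_pp / (2 * math.sqrt(2))
    return 10 * math.log10(v_rms ** 2 / r_ohms / 1e-3)

v_in_pp = 2.0 / 10 ** (7 / 20)  # 2 Vpp at the output, 7 dB voltage gain
print(round(v_in_pp, 3), round(vpp_to_dbm(v_in_pp, 50), 1))  # -> 0.893 3.0
```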


    Hope this helps to clarify the confusion about the gain and output power.




Questions regarding the above response:


  1. I tried to verify the “Working backwards” calculation above, to determine the input power corresponding to 2 Vpp at the output. I arrived at the same 0.893 Vpp value, but when converting to power, I get 2 milliwatts, or +3 dBm. Where does the 2 dBm figure above come from? Just a typo, or am I missing something?


  2. At P1dB, the differential output voltage can reach 4.8 Vpp. If the device is only capable of driving 2 Vpp at the output, how is P1dB measured? How would we go about verifying P1dB if we had to do it?


  3. Can the device be driven single-ended? (The data sheet suggests that it can, but recommends differential drive through a balun.) If the device is driven single-ended at both the LO and the RF input, what specifically is the performance penalty?


  4. The data sheet shows performance at 1.9 GHz, 2.7 GHz, 3.6 GHz, and 5.8 GHz. Is there any specific problem at 5 GHz that we should know about? How do we predict 5 GHz performance from the supplied data? Simple interpolation? Use the 5.8 GHz data?


  5. The device is designed to output 2 Vpp into a 200 ohm load. My application requires 1 Vpp into a 100 ohm load. What is the best way to provide this?