
ADRV9009 maximum input power

I've put various pulsed CW signals of different pulse widths and duty cycles, up to and including 100% duty cycle (CW), into the ADFMCOMMS8 at power levels up to +18 dBm, and I have not been able to observe any obvious damage or degradation of the device.  We did not expect this based on the datasheet and some other forum posts.


I believe I have the device in manual gain control at its maximum sensitivity settings (no attenuation in the input path).  Other than concluding that my results are valid, my only other hypothesis is that something under the hood is intervening when it sees the large signal, setting the input attenuator to protect the device without my knowledge.  I've read back the gain control mode and the gain table setting while applying the large CW signal, and they come back as manual and "30 dB", which I understand to be the no-attenuation setting with the default gain table.
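For reference, here is roughly how I'm doing the readback, sketched with pyadi-iio (the attribute names are taken from the pyadi-iio adrv9009 class; the URI is a placeholder for my setup, and the fallback values are the ones I read back on the bench):

```python
def is_max_manual_gain(mode: str, gain_db: float) -> bool:
    """True if the readback shows manual gain control at full (30 dB)
    gain, i.e. no attenuation inserted in the RX front end."""
    return mode == "manual" and gain_db >= 30.0

if __name__ == "__main__":
    try:
        import adi  # pyadi-iio; requires attached hardware
        sdr = adi.adrv9009(uri="ip:analog.local")  # placeholder URI
        mode = sdr.gain_control_mode_chan0
        gain = float(sdr.rx_hardwaregain_chan0)
    except Exception:
        # No hardware available: fall back to the values I read on the bench
        mode, gain = "manual", 30.0
    print(is_max_manual_gain(mode, gain))
```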


If my results are valid this is really good news for us, but I don't want to take it to the bank yet, since we expected a different result (damage somewhere in the -7 dBm to 0 dBm range).  Can you help me sanity-check this?

  • Refer to the maximum power rating section in the ADRV9009 datasheet. Note that +23 dBm peak is the maximum power the chip can handle without damage, but performance is not guaranteed at such high power levels.

    Performance is guaranteed only up to a maximum input power of -11 dBm at the RX port. Refer to the snippet below from the user guide.

  • Thank you for the response.  We were under the impression that the +23 dBm rating might require the chip to use AGC to protect itself.  This likely came from the fact that the maximum input power is well above the maximum usable input given the gain control attenuator range.

    However, based on the testing, we are concluding that the +23 dBm rating (or at least up to the +18 dBm we tested, and not over temperature) is not dependent on AGC.  We understand that performance at signal levels past the full-scale input is not usable.  We are just looking for "survivability" at that point.

    So my takeaway is that, regardless of settings (manual vs. automatic gain control, etc.) and environment (temperature, etc.), the chip should survive up to +23 dBm without damage or degradation in performance once signal levels return below full scale (the maximum usable input level).  Is that the proper takeaway?

    In our design it should never see more than +18 dBm, so that seems like a healthy margin of safety relative to the +23 dBm rating.
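    In absolute terms that margin looks like this (a quick sanity check of the numbers above, using nothing beyond the standard dBm-to-mW conversion):

    ```python
    def dbm_to_mw(p_dbm: float) -> float:
        """Convert power in dBm to milliwatts: P_mW = 10**(P_dBm / 10)."""
        return 10 ** (p_dbm / 10.0)

    applied = 18.0   # worst-case level our design can present, dBm
    abs_max = 23.0   # datasheet peak survival rating, dBm
    margin_db = abs_max - applied
    print(f"{dbm_to_mw(applied):.0f} mW applied vs {dbm_to_mw(abs_max):.0f} mW rating, "
          f"{margin_db:.0f} dB margin")
    # → 63 mW applied vs 200 mW rating, 5 dB margin
    ```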

  • The +23 dBm maximum input level is dependent on AGC being enabled.

    If you are looking at +18 dBm without AGC, then reliability analysis has not been performed under that condition, and hence we cannot guarantee the product lifetime.