Hello EZ Team.
My customer has a question about the ADL5380 EVM dependence on input signal level. They have calculated some parameters using ADIsimRF as well as MathCAD. The calculations show that the linear EVM degradation is caused by the demodulator noise floor, which is set by the NF and the signal noise bandwidth. The calculated SNR is 45 dB at input powers above -47 dBm, and the SNR improves as the input power rises.
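For context, a minimal sketch of the thermal-noise-limited SNR calculation described above (input power minus the kTB + NF noise floor integrated over the noise bandwidth). The NF and bandwidth values here are illustrative assumptions, not the customer's exact numbers:

```python
import math

def snr_db(pin_dbm, nf_db, bw_hz):
    """Thermal-noise-limited SNR in dB: input power minus the
    demodulator noise floor (-174 dBm/Hz + NF + 10*log10(BW))."""
    noise_floor_dbm = -174.0 + nf_db + 10.0 * math.log10(bw_hz)
    return pin_dbm - noise_floor_dbm

# Assumed example values: NF = 11 dB, 5 MHz noise bandwidth.
print(round(snr_db(-47.0, 11.0, 5e6), 1))   # SNR at -47 dBm input
print(round(snr_db(-35.0, 11.0, 5e6), 1))   # SNR improves with input power
```

With these assumed numbers the SNR rises dB-for-dB with input power, which is the linear behavior the customer's ADIsimRF/MathCAD model predicts.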
But as is clear from Figure 87 of the datasheet, an EVM of -45 dB is achievable only at an input power of -35 dBm (blue curve, RF = 2.6 GHz).
Of course, they understand that EVM is a complex specification determined by several contributors: SNR, LO signal quality, phase and amplitude imbalance, DC offset error, and nonlinear distortion. Still, the gap between the calculated SNR and the measured EVM is rather large.
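To show how those contributors can widen the gap, here is a sketch of the usual root-sum-of-squares combination of independent EVM contributors. The contributor budget (image rejection from amplitude/phase imbalance, LO phase noise) is purely hypothetical, chosen only to illustrate that a -45 dB thermal floor can easily end up as a ~10 dB worse total EVM:

```python
import math

def evm_total_db(*contrib_db):
    """Combine independent EVM contributors (each in dB, negative)
    as a root-sum-of-squares of their linear magnitudes."""
    lin = [10.0 ** (c / 20.0) for c in contrib_db]
    rss = math.sqrt(sum(v * v for v in lin))
    return 20.0 * math.log10(rss)

# Hypothetical budget: thermal SNR -45 dB, image rejection -36 dB,
# LO phase noise -40 dB. The worst contributor dominates the total.
print(round(evm_total_db(-45.0, -36.0, -40.0), 1))
```

Under these assumed numbers the total lands near -34 dB even though the thermal-noise term alone would allow -45 dB, which is the kind of arithmetic behind a 10-12 dB SNR-vs-EVM discrepancy.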
So, the question is: why does the ADL5380 show such a large difference, 10-12 dB, between the calculated SNR and the measured EVM?