I'm trying to better understand the noise spectral density of the AD9162. In our application, we are more sensitive to added white noise than to noise from timing jitter. In Fig. 82, the single-tone noise spectral density (NSD) increases with fOUT, whereas the NSD measured at 70 MHz for WCDMA signals (Fig. 85) doesn't seem to change and stays flat at about -170 dBm/Hz. Is the increase in NSD for the single tone due to phase noise/jitter?
If so, is -170 dBm/Hz the NSD of the AD9162 excluding jitter contributions? And where does the extra 4 dB of noise above the thermal noise limit (-174 dBm/Hz) come from?
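For reference, here is a quick sanity check of the 4 dB figure I'm quoting, computing the thermal floor from kT at the standard 290 K reference temperature and comparing it against the -170 dBm/Hz measured NSD (the temperature and the exact measured value are my assumptions, not from the datasheet):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K
T = 290.0         # assumed standard reference temperature, K

# Thermal noise floor in dBm/Hz: 10*log10(k*T / 1 mW)
thermal_dbm_hz = 10 * math.log10(k * T * 1000)

measured_nsd = -170.0  # dBm/Hz, flat WCDMA NSD read off Fig. 85
excess_db = measured_nsd - thermal_dbm_hz

print(f"thermal floor: {thermal_dbm_hz:.1f} dBm/Hz")   # ~ -174.0
print(f"excess above thermal: {excess_db:.1f} dB")      # ~ 4.0
```

So the measured NSD sits roughly 4 dB above kT, which is the excess I'm asking about.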