I have inherited a design and am not sure what tradeoffs drove the choice of reference frequency, or whether it could be simplified or reduced:
The design uses a 10 MHz reference, scales it to 200 MHz using a jitter cleaner (LMK04821), and then passes that to an ADF4350. The ADF4350 provides an output between 950 MHz and 2150 MHz in 100 kHz steps.
The LMK04821 also provides a 200 MHz clock to an ADC at the end of the receiver.
Unfortunately, harmonics of the 200 MHz clock are picked up in the receive signal chain (though they are low).
My thinking is that scaling the 10 MHz up to 200 MHz, only for the ADF4350 to divide it back down internally, adds no phase-noise benefit and is likely to degrade performance. Would it not be better to feed the ADF4350 with the 10 MHz directly, or a low multiple of it, as long as it is high enough for the phase detector frequency? Or would I get better performance scaling even higher, to say 250 MHz?
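To put rough numbers on the tradeoff I'm asking about, here is a minimal back-of-envelope sketch using the common figure-of-merit model for in-band PLL phase noise, PN ≈ FOM + 20·log10(N) + 10·log10(f_PFD). The FOM value is an illustrative placeholder (check the ADF4350 datasheet for the real normalized floor), and I'm assuming a PFD maximum around 32 MHz for this part:

```python
import math

F_OUT = 2.15e9   # top of the ADF4350 tuning range, Hz
FOM = -220.0     # assumed normalized phase-noise floor, dBc/Hz
                 # (placeholder value; see the ADF4350 datasheet)

def inband_phase_noise(f_pfd):
    """Estimate in-band phase noise at F_OUT for a given PFD frequency,
    using PN = FOM + 20*log10(N) + 10*log10(f_pfd), N = F_OUT / f_pfd."""
    n = F_OUT / f_pfd
    return FOM + 20 * math.log10(n) + 10 * math.log10(f_pfd)

# 10 MHz = reference used directly; 25/32 MHz = PFD rates a 200 MHz
# reference could support after the R divider (32 MHz assumed max).
for f_pfd in (10e6, 25e6, 32e6):
    print(f"f_PFD = {f_pfd / 1e6:5.1f} MHz -> "
          f"~{inband_phase_noise(f_pfd):.1f} dBc/Hz in-band at 2.15 GHz")
```

Under this model the in-band noise goes as 20·log10(N) − but N shrinks as f_PFD rises, so the net effect is roughly 3 dB of in-band improvement per doubling of f_PFD. That suggests the 200 MHz reference only helps to the extent it lets the PFD run faster than 10 MHz; any reference frequency beyond the maximum usable PFD rate buys nothing by itself.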