I have simplified the question for clarity. If I run a 'pnoise' simulation for jitter analysis of an inverter in Cadence Spectre, with a 100 MHz clock (50% duty cycle) at its input, what limits of integration should I use when calculating the integrated RMS jitter at the output?

Elsewhere I read that I should integrate from near DC (where flicker/pink noise dominates) up to half of the fundamental, which would be 50 MHz in this case. That doesn't seem right to me. For calculating jitter, the theory is that the output noise is sampled (in the time domain) at the transition points, since that is where jitter matters. For a 100 MHz (10 ns period) input, the simulator therefore samples the output at 0 s, then 5 ns, and so on, i.e. twice in every period. This corresponds to a 200 MHz sampling rate in the frequency domain, so by that reasoning the limits of integration should be DC to 100 MHz (as opposed to 50 MHz).

Could someone please elaborate and tell me whether my line of reasoning is correct?
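For what it's worth, the mechanics of turning an exported phase-noise PSD into an integrated RMS jitter number can be sketched as below. This is a generic sketch, not Spectre's internal method: the function name, the flat -150 dBc/Hz example PSD, and the frequency grid are all made up for illustration; the conversion used is the standard J_rms = sqrt(2 * integral of 10^(L(f)/10) df) / (2*pi*f0), assuming L(f) is a single-sideband PSD in dBc/Hz.

```python
import math

def rms_jitter(freqs_hz, psd_dbc_per_hz, f0_hz, f_lo, f_hi):
    """Integrate a single-sideband phase-noise PSD L(f) (in dBc/Hz)
    between f_lo and f_hi and convert the result to RMS jitter in
    seconds, via J_rms = sqrt(2 * int 10^(L(f)/10) df) / (2*pi*f0)."""
    # keep only the samples inside the chosen integration limits
    pts = [(f, 10 ** (l / 10.0))
           for f, l in zip(freqs_hz, psd_dbc_per_hz)
           if f_lo <= f <= f_hi]
    # trapezoidal integration over the linear-power PSD
    area = sum((pts[i + 1][0] - pts[i][0]) * (pts[i][1] + pts[i + 1][1]) / 2.0
               for i in range(len(pts) - 1))
    phase_rms_rad = math.sqrt(2.0 * area)  # both sidebands -> factor of 2
    return phase_rms_rad / (2.0 * math.pi * f0_hz)

# hypothetical flat -150 dBc/Hz noise floor, 1 kHz to 100 MHz
freqs = [1e3, 1e4, 1e5, 1e6, 1e7, 5e7, 1e8]
psd = [-150.0] * len(freqs)

j_50 = rms_jitter(freqs, psd, 100e6, 1e3, 50e6)    # integrate to f0/2
j_100 = rms_jitter(freqs, psd, 100e6, 1e3, 100e6)  # integrate to f0
```

For a flat PSD, doubling the upper limit from 50 MHz to 100 MHz grows the integrated jitter by roughly sqrt(2), which is exactly why the choice of upper limit in the question matters.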

In general, these forums are set up to address questions about specific Analog Devices components, so I am not sure we can answer your question. I have moved it to the Clock and Timing community.