I recently acquired an HMC856 evaluation board - for characterization purposes, of course. I'm now interested in measuring the actual delay of each step as accurately as possible, so that I can characterize the chip for my intended application.
Now I'm wondering: what is the most accurate way to measure these delays? I'm using an oscilloscope (20 GS/s), and simply measuring the time increment of each single step isn't possible because of the oscilloscope's jitter. The signals I'm measuring have very low RMS jitter (<< 200 fs, as measured with a phase noise analyzer) and lie in the frequency range of 0-5 GHz.
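One approach I've been considering: if the scope's trigger jitter is random and dominates the measurement, averaging N independent edge-timing readings should shrink the standard error of the mean by sqrt(N). A quick sanity-check sketch (the jitter and step values below are illustrative assumptions, not HMC856 datasheet numbers):

```python
import random
import statistics

# Illustrative assumptions, not datasheet values:
SCOPE_JITTER_PS = 1.5   # assumed scope RMS timing jitter (ps)
TRUE_STEP_PS = 3.0      # assumed true delay step to resolve (ps)

def measure_step(n_avg, rng):
    """Average n_avg noisy readings of the (assumed) true delay step."""
    samples = [TRUE_STEP_PS + rng.gauss(0.0, SCOPE_JITTER_PS)
               for _ in range(n_avg)]
    return statistics.mean(samples)

rng = random.Random(0)
for n in (1, 100, 10_000):
    # Repeat the averaged measurement 200 times to estimate its spread
    estimates = [measure_step(n, rng) for _ in range(200)]
    print(f"N={n:6d}: std of estimate = {statistics.stdev(estimates):.3f} ps "
          f"(theory: {SCOPE_JITTER_PS / n**0.5:.3f} ps)")
```

With 10,000 averages the uncertainty drops by a factor of ~100, so ps-level jitter could in principle resolve tens-of-fs steps - but this only works if the residual error is truly random, not a systematic offset of the scope's timebase. Is that a valid assumption here, or does timebase nonlinearity dominate at this level?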
How did Analog Devices (or Hittite) measure the values given in the datasheet? Any hints or tips on how to measure these time delays would be very much appreciated. Maybe salemdar can offer some tips?