Hello.

To put it simply: I am observing that the ramp-up time differs from the ramp-down time, despite identical parameters being set in the registers. Is this supposed to happen? If so, why? And is there a way to fix it?

Thanks in advance.

~Dan

(I will provide details of my observations in my next post.)

I have a lot of timing data collected for various ramp conditions, but for now I will just provide the numbers for one data set.

The DDS DRG settings were as follows: lower limit at 200MHz, upper limit at 400MHz, pos/neg ramp rate at 1000ns, and pos/neg ramp step at 100kHz. So I expect each ramp to take about 2ms to complete (i.e. the time between the start and stop edges of DROVER should be 2ms). Indeed, the average over my whole data set was 1.999016ms. However, the points do not form a single nice Gaussian distribution as one might expect. Instead, the average ramp-up time was 36ns longer than the average ramp-down time (and both of these data subsets had standard deviations of less than 1ns, so I am certain there is a real difference between the two ramp times).
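For anyone wanting to check my arithmetic, here is a quick sketch of how I get the expected 2ms figure from the settings above (the variable names are my own shorthand, not actual DRG register names):

```python
# Expected DRG ramp duration from the settings quoted above.
f_low = 200e6        # lower ramp limit, Hz
f_high = 400e6       # upper ramp limit, Hz
step_size = 100e3    # ramp step size, Hz (same for pos and neg)
step_time = 1000e-9  # ramp rate interval, s (same for pos and neg)

n_steps = (f_high - f_low) / step_size  # number of frequency steps in one ramp
ramp_time = n_steps * step_time         # expected DROVER start-to-stop interval

print(n_steps)    # 2000.0
print(ramp_time)  # 0.002 (i.e. 2 ms)
```

So the 2ms nominal value checks out; the question is where the extra 36ns on the up-ramp comes from.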

The details above describe a deterministic difference between the ramp-up and ramp-down durations. I have also seen some seemingly 'random' discrete timing discrepancies, but I don't want to cloud the main issue with those just yet.