My current settings:
#if defined(ADI_ADSP_CM40Z)
#define CLKIN      (30 * MHZTOHZ)   /* 30 */
#define CORE_MAX   (240 * MHZTOHZ)
#define SYSCLK_MAX (96 * MHZTOHZ)   /* 96 */
#define VCO_MIN    (72 * MHZTOHZ)
I've tried some variations, but none were successful.
Ideally I would like to run both channels at 1 MHz sample rate. Is this realistic?
In the example, a GP timer is used as a periodic trigger to start the ADCC timers, with a timer period of 1000. For easy discussion, assume SCLK = 100 MHz, i.e. a 10 ns tick, so the timer period is 10 usec: every 10 usec the ADCC timers are started. The minimum conversion time is 380 nsec, which is the time for one phase; there are three phases (control, conversion, data), each taking this long. For better exactness, the time of the next ADCC event should coincide with a multiple of the conversion time. Although 1 usec seems achievable, there will be delays in software, which can be worked around with two timers. See this App Note for better clarity on most of the things I mentioned above: https://www.analog.com/media/en/technical-documentation/application-notes/EE365v01.pdf