How can I find the values for the fine tune and coarse tune of the crystal oscillator used in the AD9364?
Do we need to conduct an experiment to determine those values?
Please refer to https://wiki.analog.com/resources/eval/user-guides/ad-fmcomms2-ebz/hardware/tuning?s=dcxo
That page mentions that once the error in ppm is calculated, the corrected clock frequency value should be written to sysfs. What is this sysfs? Once I have found the offset in the oscillator frequency, how can I estimate the fine and coarse tune values from that offset?
There are coarse tune and fine tune registers that can be modified using the ad9361_set_dcxo_tune function. The resolution of the DCXO varies with the coarse word, with a worst-case resolution (at coarse word = 0) of 0.0125 ppm. Using both coarse and fine words, the DCXO can vary the frequency over a ±60 ppm range.
Tuning is a continuous process in which you increase or decrease the word according to the measured frequency error.
A factory calibration could also sweep the DCXO fine tune codes to produce a LUT of frequency error vs. fine tune code, allowing quick updates of the XO tuning.
How can I do factory calibration?
Can I have access to that LUT?
Please read below for details.
You need to create your own LUT as required.
I know this question was asked a long time ago. Was this problem ever solved?
I am wondering about the same thing. I understand that it is possible to correct the oscillator variation by setting the tuning registers 0x292 - 0x294.
But as kannan asked, where do I get the correct values for my own design? Every single oscillator will have a different tolerance within its specified range, for example ±10 ppm.
I saw that the evaluation software has an auto-calibration mode. How is it implemented? What is the reference clock used to measure the XO input clock?
I doubt that the AD9361 (in my case) has internal logic to automatically calibrate the tuning registers.
Thanks for the support.
On devices such as the AD9361 or AD9364 where the DCXO exists, and an external XTAL is connected, the DCXO can be used. On devices like the AD9363 or on setups where an external fixed frequency TCXO is connected, the absolute reference frequency can be adjusted.
There are many ways - one that was realized (and is supported by the HDL cores) is to use a GPS 1PPS to measure the baseband rate and adjust the reference. Another is to tune to a pilot tone of known frequency and adjust the DCXO or reference. Alternatively, measure the reference frequency during factory calibration and store a tuning value in some non-volatile storage.
Thanks for the fast answer.
Please excuse me for asking, but I'm rather a software engineer than an RF engineer.
Let me explain our setup. For testing and evaluating the AD9361 we have the ARRadio evaluation board.
Reading the article "Tuning the AD9361/AD9364" suggests to me that there is some way to automatically calibrate the DCXO. From the schematics of the board I can see that the XTAL inputs are connected to an external crystal. The CLK_OUT is connected to the HSMC connector. I guess the clock is measured by the AD software running on the main board (a SoCKit from Terasic in our case). Is it correct that the software compares the clock frequency coming from the ARRadio to a clock on the main board?
Can you please explain how the automatic calibration for the DCXO is realized?