We are using the ADRV9361-Z7035 with the FMC and BOB carriers, running Linux. On some of our boards, when we set a TX or RX LO frequency, the actual value programmed is off by a few Hz. For example, on a BOB carrier with a 4 MHz sampling rate and 4 MHz bandwidth, requesting a 900 MHz LO for both TX and RX yields an actual value (as returned when queried) of 899,999,998 Hz. In a radar application using unmodulated pulses, this produces a slow amplitude fade between the TX and RX devices. We set and read the carrier frequency with the libiio C interfaces running on the target devices:
iio_channel_attr_write_longlong(iio_device_find_channel(phy_dev, "altvoltage1", true), "frequency", hz); // TX LO frequency
iio_channel_attr_write_longlong(iio_device_find_channel(phy_dev, "altvoltage0", true), "frequency", hz); // RX LO frequency
iio_channel_attr_write_longlong(iio_device_find_channel(phy_dev, "voltage0", true), "sampling_frequency", hz); // TX/RX sample rate
where "phy_dev" is the pointer to the "ad9361-phy" device and "hz" is of type long long. We are not running any FIR filters.
Can you suggest any hardware settings that would "standardize" our device configurations, or is this a consequence of how the PLLs lock, driven by hardware parameters (such as temperature) that are outside of our control? Querying iio_info, we notice that on boards with correct LO frequencies the tx_path_rates and rx_path_rates come out to be
BBPLL 1024000000 ADC 32000000 R2 16000000 R1 8000000 RF 4000000 RXSAMP 4000000
but these values are slightly off on the boards whose carrier frequencies are off by a few hertz. How can we get more control over these clock-divider values, other than writing the IIO sampling_frequency attribute?