
The accurate transmit time of the first baseband sample on the Adalm-Pluto

Dear Engineers,

I need to get the transmit time of the first baseband sample on the Adalm-Pluto in order to control an external device. I set the Adalm-Pluto to TDD mode and also put it in pin-control (PinCtrl) mode so that I can use a GPIO to monitor the ENSM entering the TX state. My code is attached.

I have successfully received the signal. But I have two questions:

1) I must set the ENSM to TX mode in the program (sdr._ctrl.attrs["ensm_mode"].value = "tx"); otherwise, the Adalm-Pluto does not transmit the signal. This confuses me. Once the program reaches sdr.tx(), doesn't that mean the Adalm-Pluto begins to transmit and the ENSM automatically enters the TX state? Why do I still need to set the ENSM to TX mode myself?

2) I use an oscilloscope to monitor GPO0. When the ENSM enters the ALERT state, the pin is logic low because of the line sdr._ctrl.debug_attrs["adi,gpo0-inactive-state-high-enable"].value = "0" (GPIO logic low in the ALERT state). However, I found that the moment the ALERT state ends (the pin goes logic high) is not the transmit time of the first baseband sample. By then the ENSM has entered the TX state, but it looks like the Adalm-Pluto still needs some additional time before the first baseband sample is transmitted, and that time is not constant. How can I use the GPIO, or some other method, to get the accurate transmit time of the first baseband sample?

import adi
import iio
import time
import numpy as np
import scipy.io as sio
import matplotlib.pyplot as plt

sampleRate  = 4e6    # must be <=30.72 MHz if both channels are enabled
centerFreq  = 2.40e9
txGain      = 0

# Create radio
sdr = adi.ad9361(uri='ip:')

'''Configure State Machine'''
sdr._ctrl.debug_attrs["adi,frequency-division-duplex-mode-enable"].value = "0" # TDD
print("ensm_mode_available:", sdr._ctrl.attrs["ensm_mode_available"].value)

sdr._ctrl.debug_attrs["adi,ensm-enable-txnrx-control-enable"].value = "1" # Pin control
sdr._ctrl.debug_attrs["adi,ensm-enable-pin-pulse-mode-enable"].value = "1" # 1 Pulse

sdr._ctrl.debug_attrs["adi,gpo0-inactive-state-high-enable"].value = "0" #GPIO logic low
sdr._ctrl.debug_attrs["adi,gpo0-slave-tx-enable"].value = "1"
sdr._ctrl.debug_attrs["adi,gpo0-slave-rx-enable"].value = "0"

sdr._ctrl.debug_attrs["initialize"].value = "1"
print("ensm_mode1:", sdr._ctrl.attrs["ensm_mode"].value)

'''Configure Tx properties'''
sdr.tx_rf_bandwidth         = int(sampleRate)
sdr.sample_rate             = int(sampleRate)
sdr.tx_lo                   = int(centerFreq)
sdr.tx_hardwaregain_chan0   = int(txGain)
sdr.tx_enabled_channels     = [0]

'''TX DATA'''
data_dic = sio.loadmat('bleTxWfm.mat')
data_lst = data_dic['txWfm']
data_sig = data_lst[0]
data_sig *= 2**14

while True:
    sdr._ctrl.attrs["ensm_mode"].value = "tx"
    sdr.tx(data_sig)  # push the waveform into the FPGA buffer
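For repeated transmission, it may be cleaner to push the waveform once as a cyclic buffer instead of calling tx() in a loop. A minimal sketch, assuming pyadi-iio's standard tx_cyclic_buffer / tx_destroy_buffer API (the start_cyclic_tx / stop_cyclic_tx helper names are illustrative, not part of the library):

```python
def start_cyclic_tx(sdr, waveform):
    """Queue `waveform` once and let the DMA repeat it, then switch the
    ENSM into TX so the queued data starts draining immediately."""
    sdr.tx_cyclic_buffer = True                # hardware repeats the buffer
    sdr.tx(waveform)                           # data now waits in the FPGA
    sdr._ctrl.attrs["ensm_mode"].value = "tx"  # start transmitting

def stop_cyclic_tx(sdr):
    """Stop transmitting; the buffer must be destroyed before a new tx()."""
    sdr.tx_destroy_buffer()
    sdr._ctrl.attrs["ensm_mode"].value = "alert"
```

Pushing the buffer before switching the ENSM also means the data is already waiting at the FPGA interface when the TX state begins, which removes one source of variable startup delay.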

I also found that if BLE or ZigBee baseband data generated in MATLAB is transmitted by the Adalm-Pluto, it can be received by a BLE or ZigBee packet sniffer. But if the same baseband data is transmitted by a USRP (same sample rate, center frequency, and gain as the Adalm-Pluto), it cannot be received by the packet sniffer. Is there any difference in how the Adalm-Pluto and the USRP process baseband data?
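Regarding that last point, one common difference is sample scaling: pyadi-iio expects complex samples scaled toward the DAC's 16-bit range (hence the *= 2**14 above), while UHD's Python API expects complex floats inside [-1, 1). A hedged sketch of preparing the same MATLAB waveform for each device (the prepare_* helper names are illustrative, and this is one plausible cause, not a confirmed diagnosis):

```python
import numpy as np

def prepare_for_pluto(wfm):
    """pyadi-iio: scale a unit-amplitude complex waveform toward the
    16-bit DAC range (matching the *= 2**14 used in the script above)."""
    return np.asarray(wfm, dtype=np.complex64) * 2**14

def prepare_for_uhd(wfm):
    """UHD: samples must be complex floats inside [-1, 1); normalize
    by the peak if the waveform exceeds full scale."""
    w = np.asarray(wfm, dtype=np.complex64)
    peak = float(np.max(np.abs(w)))
    return w / peak if peak > 1.0 else w
```

If the USRP were fed the 2**14-scaled samples, they would clip at full scale, which could corrupt the GFSK/O-QPSK modulation enough for a sniffer to reject the packets.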
[edited by: XinLiu at 3:03 PM (GMT -4) on 24 May 2022]
  • 1. From the software's point of view, the ENSM and the RX/TX buffers are controlled separately. tx() just pushes data into the FPGA; it does not change the ENSM state.

    2. That pin only tells you the ENSM state, not the timing of the actual data. When the ENSM is not in the TX state, data waits at the interface in the FPGA until the chip is switched into TX. So there can be uncertainty from any data queued ahead of your buffer, the time to travel across the interface, and the time to travel within the chip. This would need to be measured with external equipment, since the latency depends on the configuration.
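For that external measurement, it helps to transmit a waveform whose first baseband sample is easy to identify on a scope: trigger on GPO0's rising edge and measure the delay to the start of the RF burst. A sketch of building such a test waveform with NumPy (the tone frequency and durations are arbitrary illustrative values):

```python
import numpy as np

def make_marker_waveform(fs, burst_s=1e-3, gap_s=1e-3, amp=2**14):
    """Full-scale complex tone starting at sample 0, followed by silence,
    so the first baseband sample appears as a sharp RF burst edge."""
    n_burst = int(round(burst_s * fs))
    t = np.arange(n_burst) / fs
    tone = (amp * np.exp(1j * 2 * np.pi * 100e3 * t)).astype(np.complex64)
    gap = np.zeros(int(round(gap_s * fs)), dtype=np.complex64)
    return np.concatenate([tone, gap])
```

With this waveform pushed to the Pluto, the time from the GPO0 edge to the burst edge on the scope is the startup latency in question; repeating the measurement shows how much it varies.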