ADRV9009-ZU11EG SOM and triggers for time-coordinated Tx/Rx


I have been designing an SDR for a radar application. While getting up to speed on IIO, libiio, the HDL reference design, and the Linux kernel, I have not been able to find as much information as I would like on triggers and their use in IIO. I have found the following sources here, here, and here, and I have trawled through many forum posts without finding what I am after. I should note I am not very experienced (~6 months post uni), so if this is simple and I am oblivious to that, I apologize. Apologies also for the following wall of text.

For radar in general, transmit and receive must be time-coordinated. My overarching goal at the moment is to modify the provided SOM firmware/software to implement such a time-coordinated Tx/Rx. Essentially, I want to transmit a waveform of some number of samples at some sampling rate, then capture Rx data for X seconds after the Tx waveform finishes sending, and repeat, with minimal transition time from Tx mode to Rx mode. For the moment, any kind of Tx-Rx time-coordination would be spectacular.

With my current understanding of the software/firmware architecture, I believe this could be implemented using the trigger mechanism built into IIO plus some modifications to the FPGA reference design (if it can't, or you have a better idea, please let me know). I am unsure how to get there, however.

With my current understanding of IIO triggers, a device either has trigger capability or it doesn't, and this can be probed from the command line for the device in question. Triggers come in a limited number of variations (threshold, rising/falling edge), and a trigger event can initiate other events, in this case data capture.

What I am confused about is how to set up a trigger event to do what I described. My idea for modifying the existing SOM firmware/software is to change the HDL design by adding a custom trigger: when a Tx waveform sent to the DMAC finishes, the trigger fires. The trigger would let the Linux side know that Rx data is incoming, so it can set up buffers/channels accordingly and either store the data or offload it via Ethernet to a connected PC for storage (that part is moot at the moment; I just want a single time-coordinated Tx-Rx pulse and capture). Buffer size and other IIO parameters would be set to match my application requirements. Since the sample rate, sample size, and receive time are known, I would size the buffers for the amount of data I expect. For example, at 100 Msps with 16-bit ADC samples, for 1 second: 100 Msps * (16 bits/sample) * (1 byte / 8 bits) * (1 sec) = 200 MB of data.
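As a sanity check on that buffer sizing, here is the same arithmetic as a short script (assuming 16-bit real samples on a single channel; interleaved I/Q pairs would double the figure):

```python
# Rough Rx buffer sizing for a fixed-length capture.
SAMPLE_RATE_SPS = 100e6    # 100 Msps
BITS_PER_SAMPLE = 16       # ADC sample width
CAPTURE_SECONDS = 1.0      # desired receive window

samples = SAMPLE_RATE_SPS * CAPTURE_SECONDS
buffer_bytes = samples * BITS_PER_SAMPLE / 8   # 8 bits per byte

print(f"{buffer_bytes / 1e6:.0f} MB per channel")  # prints "200 MB per channel"
```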

My questions are:

1) Does this seem like a good approach? Or is this functionality already existing (or similar functionality) and I have just missed it?

2) How much overhead would you expect between the trigger firing and the subsequent setup of the Rx buffers/channels in such a design? On the order of a couple hundred nanoseconds, or are we talking microseconds plus?

3) Where is the best place to implement this functionality? I was thinking of adding it to the JESD204B receive peripheral Linux driver; is that reasonable? I would route the trigger to a GPIO on the processor.

4) What would be the best way to detect the end of the Tx waveform? Have a unique "end of transmit" sequence and watch the DMAC data for it in the HDL, or...?

Thank you,


  • Analog Employees, on Nov 15, 2019 6:11 PM

    So triggers are a feature of IIO drivers themselves and are implemented by the driver architect. Unfortunately, the transceiver drivers do not include triggers.

    Technically you could add triggers to the drivers, but I would say it will probably over-complicate your design and its maintenance in the future, since you would have to re-patch your driver whenever ADI makes changes.

    For your application you really just need a way to control when receive events happen relative to transmit events. A simple way to do this is to tightly control when data comes out of the TX DMA (or your signal source) and when data is allowed to enter the RX DMA. One way to accomplish this is to add an IP core that connects to the TX DMA (or your signal source), so it can backpressure it, and to the RX DMA's valid and sync input pins. Then have a simple driver that configures this IP to control the relationship between these signals.
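    To make the gating idea concrete, here is a toy software model of such a core (all names are made up; the real thing would be an HDL IP sitting on the AXI-Stream paths): it backpressures the TX source by deasserting `ready`, and keeps samples out of the RX DMA by masking `valid`, until the driver arms it.

    ```python
    class TxRxGate:
        """Toy model of a custom IP between the TX source/DMA and the
        RX DMA. Illustrative only; the real core would be HDL."""

        def __init__(self):
            self.armed = False  # set/cleared by the controlling driver

        def tx_ready(self):
            # TX path: deassert 'ready' toward the TX DMA to backpressure it.
            return self.armed

        def rx_valid(self, adc_valid):
            # RX path: mask 'valid' so no samples enter the receive
            # buffer until the gate is armed.
            return adc_valid and self.armed

        def arm(self):
            self.armed = True

        def disarm(self):
            self.armed = False

    gate = TxRxGate()
    assert not gate.tx_ready()      # TX is backpressured
    assert not gate.rx_valid(True)  # RX samples are blocked
    gate.arm()
    assert gate.tx_ready() and gate.rx_valid(True)  # data flows on both paths
    ```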

    From software, you could interface with your IIO buffers as you normally would but would just have to set up things in a certain order. For example, a reasonable flow would be:

    0. Configure controlling driver to backpressure TX and not let data into RX DMA

    1. Set up the receive buffer to be ready for a capture of a specific length

    2. Create, fill, and push a transmit buffer

    3. Trigger the custom IP to stop backpressuring TX and let data into the receive buffer

    4. Pull the captured data from the receive buffer

    5. Repeat from 0 for more data.
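    The software side of those steps can be sketched as follows. The function names are stand-ins for the real libiio buffer calls and the custom gate driver's interface, stubbed out here so the sequencing itself is checkable:

    ```python
    calls = []  # records the order of operations, for illustration

    # Stubs standing in for the real libiio / driver interactions.
    def gate_disarm():         calls.append("disarm")     # step 0
    def rx_buffer_create(n):   calls.append("rx_create")  # step 1
    def tx_buffer_push(wave):  calls.append("tx_push")    # step 2
    def gate_arm():            calls.append("arm")        # step 3
    def rx_buffer_refill():    calls.append("rx_refill")  # step 4

    def capture_once(waveform, n_rx_samples):
        gate_disarm()                   # 0: backpressure TX, block RX
        rx_buffer_create(n_rx_samples)  # 1: ready a fixed-length capture
        tx_buffer_push(waveform)        # 2: create, fill, push TX buffer
        gate_arm()                      # 3: release TX, open the RX path
        rx_buffer_refill()              # 4: pull the captured data

    capture_once(waveform=[0] * 1024, n_rx_samples=4096)
    assert calls == ["disarm", "rx_create", "tx_push", "arm", "rx_refill"]
    ```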

    Hopefully, that makes sense.


  • The concepts are crystal clear, Travis, thank you. I have a good amount more experience with FPGA design than with embedded C/Linux, so hearing that more can be done on the HDL side gives me some relief. This approach makes more sense and is simpler than what I was picturing with triggers.

    Thank you for your insights.