Post Go back to editing

Driving an AD5663R with a BF547 SPI Bus

I am controlling AD5663R D-to-A converters using the SPI bus on a BF547 processor. I find the time required to write one output code to the AD5663R is longer than I expected and longer than I need.

The connections I have now are:

[connection diagram image from the original post, not reproduced here]
The AD5663R requires 24 bits to be sent while its ~SYNC line is held low, and ~SYNC must go high between every 24-bit transfer. The most the BF547 hardware can do is send 16 bits while SPIxSEL1 is low. On my next board design I will connect SPIxSEL1 to a TACLKx input and the TMRx output of that timer to ~SYNC. For now I find it works to open, after opening the SPI driver, the Flag that shares the SPIxSEL1 pin, so that ~SYNC can be controlled with the Flag while the rest of the interface is still controlled by the SPI driver.

I find the Flag driver to be very fast; a call to it consumes an insignificant amount of time compared to the adi_dev_Write() statement, so the Flag, besides driving the ~SYNC line, is also useful for measuring how long the adi_dev_Write() statement takes. I have the SPI bus running at its fastest usable setting, with an ADI_SPI_CMD_SET_BAUD_REG value of 4. ADI_DEV_CMD_SET_SYNCHRONOUS is set FALSE. I find it takes 50 us to execute the adi_dev_Write() statement that sends one 24-bit command writing one output code. With CCLK at 500 MHz and SCLK at 100 MHz, is this a normal amount of time, or could something be wrong? I need to update the AD5663R's output code continuously every 5 us to 10 us.

I am currently using the DMA driver. Would the Interrupt Driver be faster?

I am currently working to combine the two buffers given to adi_dev_Write() into one, but I am not expecting this to save enough time.

  • Hi,

    For the AD5663R operation, ~SYNC has to go high after every 24 bits transmitted. Since you are using DMA mode for these transfers, are you configuring the DMA for one-word transfers? In that case core mode (the interrupt driver) would be better to use. Also, the BF547 can be configured for a maximum SCLK of 133 MHz. Is there any reason why you are limiting it to 100 MHz?

    SPICLK can be at most SCLK/(2*BAUD). You are setting BAUD to 4, which means SPICLK = SCLK/8. You can set ADI_SPI_CMD_SET_BAUD_REG to 2 for the highest SPI speed. Are you relying entirely on adi_dev_Write() to calculate the execution speed? That function queues data to the device, so it may not reflect the actual transmission rate. Is it possible to probe the SPI signals? That would be a more reliable way to find the execution time.

    Thanks and Regards,




  • I am doing single-byte transfers because it is not possible to change between 8-bit and 16-bit transfers within the buffer chain, and 24 bits are required. For speed, I am now using a single buffer for all three bytes.

    I am using 100 MHz because I also have an AD7606 A-to-D converter on EPPI1. Maximum rejection of both 50 Hz and 60 Hz requires a sampling interval in multiples of 100 ms. 100 MHz divides more evenly into such intervals, and so enables a greater variety of possible sampling rates within the 100 ms multiples.

    What limits the SPI speed are transmission-line effects: at these frequencies the SPI traces behave like transmission lines. Without impedance-matching resistors there are excessive oscillations and crosstalk between the ~SYNC line and the clock, which I have seen on an oscilloscope trace. I would like to have ADI_SPI_CMD_SET_BAUD_REG set to 2, and perhaps a better board design will allow that in the future. For now, the impedance-matching resistor together with the line capacitance forms a filter that attenuates too much at settings lower than 4. I might get a little more speed with a lot of tweaking of the matching resistor, but I do not have the development time to do that.

    Since starting this thread I have tried the interrupt version of the driver. I find that adi_dev_Write() returns much sooner than with the DMA version, which indicates to me that less processing time is required, as you expected. And since, unlike the DMA version, it actually returns before the transmission is complete at an ADI_SPI_CMD_SET_BAUD_REG setting of 4, I will be able to get a faster update rate. I have noticed the time between the transmitted bytes is greater with the interrupt version, but I can live with that.

    I have also noticed that, with the interrupt version, SPIx_SCK goes to a high-impedance state between the 24-bit transmissions, so I had to retrofit a 10k pull-up resistor on this line.

    The execution time of adi_dev_Write() is measured as the length of the pulse at SPIx_SEL1 on an oscilloscope. The Flag that controls SPIx_SEL1 (not technically SPIx_SEL1, because the Flag driver switches the pin mux to the Flag after the SPI driver has first connected it to SPIx_SEL1) is set low just before the adi_dev_Write() call and set high again right afterwards.

    In normal operation, to make the Flag set the ~SYNC line high at the right time, I enable a Timer that is set to post a semaphore in its callback at the right time. The Timer is enabled right after setting the Flag low and just before the call to adi_dev_Write(). The next statement after the adi_dev_Write() call is a pend on that semaphore, followed by setting the Flag high.

    It is unfortunate there is no configuration option to automatically keep SPIx_SEL1 low for 24 bits.

  • Hi,

    You can make use of the software control mode for 24-bit SPI transfers. A similar discussion in the link below may be helpful to you; sample code posted by one of my colleagues is also available there.



  • This question has been assumed as answered either offline via email or with a multi-part answer. This question has now been closed out. If you have an inquiry related to this topic please post a new question in the applicable product forum.

    Thank you,
    EZ Admin