
ADRV9001 Data Rate and Bandwidth

Category: Software
Product Number: ADRV9001
Software Version: ADRV9001 no OS Master

Hello,

I'm using the ADRV9002 with a Zynq MPSoC, the HDL reference design, and the "no_os_master" example application. I generated a profile with the TES GUI for 61.44 MSPS and 10 MHz bandwidth. With this configuration, the data rate and bandwidth are correct on the ADRV9001 evaluation board, but on my custom board both are half the expected values. For the custom board I only disabled RX2/TX2; everything else in the HDL reference design and the application code is identical to the eval board setup. What could be the reason for this? My suspicion is that the LVDS lanes are not configured as DDR, yet the same configuration works as expected on the eval board. Do you have any suggestions on what to check in the application code?

  •   How about the clock chip? Which clock chip do you use, and are its outputs identical to those of the eval board's clock chip?

    I'm asking specifically about DEVICE CLK and FPGA CLK.

  • Hi,

    I'm using a 38.4 MHz clock for both DEVICE CLK and FPGA CLK; I kept the same frequency on the custom board.

    In the HDL reference design, the "dac_1_enable_i0/i1/q0/q1" outputs of the "axi_adrv9001" IP core should all be logic '1' once the DMA application starts. On the eval board all four signals are '1', but on my custom board only "i0/q0" go high after the DMA function starts, and I couldn't find the reason. I only disabled the RX2/TX2 channels. As a workaround, I disconnected the i1/q1 signals and drove both of the corresponding "util_upack2" inputs from i0/q0; after that change the bandwidth and data rate became correct. But I still can't explain why the custom board behaves differently from the eval board with the same HDL reference design and example application.
