
rfft on TS101 board

Question asked by davidstites on Aug 6, 2010
Latest reply on Aug 11, 2010 by jeyanthi.jegadeesan

To get the required helpful diagnostic information out of the way:


I am using a custom TS101 board and a Barracuda board from Bittware.

I am using VisualDSP 5.0 (build version

I am using the latest toolkit from Bittware.


To preface my problem: I have very little DSP experience, so please be patient with me if I post information that is overly simple or irrelevant, or if I don't understand something you post.


I have my code mostly working.  What I am trying to do is analyze a particular RF signal, down-sampling RF from anywhere between 2 and 18 GHz.  Through a chain of several different boards, the final IF signal that goes into the Barracuda board is at 16.25 MHz over two different channels.  However, due to the limited bandwidth of the DRS 9136A tuners the signal passes through, we can only see 25 MHz at a time, so to cover the desired 90 MHz bandwidth we have to "sweep" the tuners.  I collect the data from a complete sweep and run it through the standard VisualDSP 5.0 run-time library functions.


When I run my FFT using the rfft call, I am seeing something I don't expect (code down below).  Essentially, I am trying to figure out why the data coming out of the rfft function call isn't flat even when there is no input to the board; I see the same behavior whether or not input is present.  The end result does work, that is, I am seeing the correct spectral image graphed in my Java application.  The problem comes when I try to "stitch" my windows together to get the full spectral image.  There is a bit of roll-off of the signal on either side of the 9136A's bandwidth, so I am trying to eliminate that in my final image so that the graph doesn't jump around a lot between windows.
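To illustrate what I mean by stitching: my plan is to discard a few roll-off bins on each edge of every window's spectrum and concatenate only the flat center portions.  A minimal sketch of that idea in plain C (the sizes and names here are made up for illustration, not my real buffers):

```c
#include <assert.h>
#include <string.h>

#define BINS_PER_WINDOW 8   /* stand-in for the per-window spectrum size */
#define EDGE_DROP       2   /* roll-off bins discarded on each side */

/* Copy only the flat center of each window's spectrum into the stitched
 * output, skipping EDGE_DROP roll-off bins on both edges of each window.
 * Returns the total number of bins in the stitched spectrum. */
static int stitch_windows(const float *windows, int numWindows, float *out)
{
    int kept = BINS_PER_WINDOW - 2 * EDGE_DROP;
    int w;
    for (w = 0; w < numWindows; w++) {
        memcpy(out + w * kept,
               windows + w * BINS_PER_WINDOW + EDGE_DROP,
               (size_t)kept * sizeof(float));
    }
    return numWindows * kept;
}
```

The open question for me is how many edge bins to drop so the seams line up without throwing away real signal.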


Here is a small code sample where I do the FFT processing.  Normally my v_currentSampleSize is 2048 and my MAX_SAMP_SIZE is 4096.  However, the data that comes out of the FFT, even with no input, is rather odd (see attachment for screen shot).  My FFT_temp, FFT_out, and FFT_twiddle are all 4096 points, and I am transferring in 2048 points with each sample.
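For reference, the twiddle-stride arithmetic I'm relying on: as I understand the rfft interface, one twiddle table generated for MAX_SAMP_SIZE points can serve smaller transforms by striding through it, which is why I pass MAX_SAMP_SIZE/v_currentSampleSize below.  A tiny sketch of that arithmetic (helper name is mine, just for illustration):

```c
#include <assert.h>

#define MAX_SAMP_SIZE 4096  /* size the twiddle table was generated for */

/* Stride into the full-size twiddle table for an n-point transform,
 * assuming n divides MAX_SAMP_SIZE evenly. */
static int twiddle_stride(int n)
{
    return MAX_SAMP_SIZE / n;  /* e.g. a 2048-point FFT uses every 2nd entry */
}
```

If my understanding of the stride parameter is wrong, that could well be part of my problem.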


void create_spectral(void) {
  register int i;
  register int numSpectral;
  int startIndex;
  int status;
  int endIndex;
  float fVal;
  float fVal_i;
  float fVal_r;

  endIndex = v_currentSampleSize * 2;
  numSpectral = 0;

  // convert the integer values to float.  Float is required as input
  // to the FFT routines.
  for (i = 0; i < endIndex; i++) {
     g_spectralRawData[i] = (float) v_iFFTinputPtr[i];
  }

  rfft(g_spectralRawData, FFT_temp, FFT_out, FFT_twiddle,
       MAX_SAMP_SIZE/v_currentSampleSize, v_currentSampleSize);
  //rfft(g_spectralRawData, FFT_temp, FFT_out, FFT_twiddle, 1, MAX_SAMP_SIZE);

  // do the peak search slightly wider than the range looked for data points in
  for (i = 1; i < v_currentSampleSize; i++) {
     fVal_r = (float) FFT_out[i].re;
     fVal_r = fVal_r * fVal_r;
     fVal_i = (float) FFT_out[i].im;
     fVal_i = fVal_i * fVal_i;
     fVal = fVal_r + fVal_i;
     if (fVal > spectrum_peaked[numSpectral]) {
          spectrum_peaked[numSpectral] = fVal;
     }
  }
}


Based on the attached screen shot, I would expect to see just noise flat across the graph, rather than starting low and rising.  Note that this screen shot is a graph of the data points in the 'spectrum_peaked' array (1024 points).  Is this correct?  I was playing around with the size of the twiddle table and the twiddle stride, and it does affect what we see (it is flatter), but then it doesn't work as well with the stitching.
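One guess I've been considering is that a DC offset in the sampled integer data could dominate the lowest bins and produce that low-end shape.  A minimal sketch of removing the mean from the input buffer before calling the FFT (plain C, names hypothetical; I haven't confirmed this is the cause):

```c
#include <assert.h>

/* Subtract the mean of the buffer so the DC component (bin 0) and its
 * leakage into nearby bins don't dominate the spectrum. */
static void remove_dc(float *buf, int n)
{
    float sum = 0.0f;
    float mean;
    int i;
    for (i = 0; i < n; i++)
        sum += buf[i];
    mean = sum / (float)n;
    for (i = 0; i < n; i++)
        buf[i] -= mean;
}
```

If that's not it, I'd appreciate any pointers on what else would make the noise floor slope instead of sitting flat.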