AD7765 128x Decimation Not Working

I have a board that daisy-chains four AD7765s and have it happily working with DEC_RATE = 0 (256x decimation), but I cannot get 128x to work correctly.

Q1: What level do I set DEC_RATE to for 128x? Do I drive it high or leave it floating?

Here is a trace of it running in 128x mode with DEC_RATE driven high:

The bottom four signals carry the four channels' parallel data, and the LS status byte reads 0x98.

From the AD7765 data sheet, the status byte bits decode as:


FILTER-SETTLE = '1', DEC_RATE1 = '1', Don't Care = '1' → 0x98

So all four ADCs appear to think they are in 128x decimation.

Q2: Why is the NFSO timing incorrect (i.e. the same as in 256x decimation)? Unless I misunderstand, NFSO should be running at double the rate, with no gap between the CH1-CH4 sample bursts.

Note: if I leave DEC_RATE floating instead, I read 0x90 (i.e. Don't Care = '0').
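
For reference, here is a small sketch of how I decode that status byte. The bit positions are only inferred from the two values quoted above; they are an assumption, not copied from the data sheet, so check the status register table before relying on them:

```c
/* Rough sketch of decoding the AD7765 status byte (LS byte of the output word).
 * Bit positions are ASSUMED from the 0x98 / 0x90 values observed above, not
 * taken from the data sheet -- verify against the status register table. */
#include <stdio.h>
#include <stdint.h>

/* Assumed (hypothetical) status-bit positions */
#define STAT_FILTER_SETTLE  (1u << 7)
#define STAT_DEC_RATE1      (1u << 4)
#define STAT_DONT_CARE      (1u << 3)

static void decode_status(uint32_t word)
{
    uint8_t status = word & 0xFFu;   /* LS status byte of the frame */

    printf("status = 0x%02X\n", status);
    printf("  FILTER-SETTLE : %u\n", (status & STAT_FILTER_SETTLE) ? 1 : 0);
    printf("  DEC_RATE1     : %u\n", (status & STAT_DEC_RATE1)     ? 1 : 0);
    printf("  Don't Care    : %u\n", (status & STAT_DONT_CARE)     ? 1 : 0);
}

int main(void)
{
    decode_status(0x98u);  /* what I read with DEC_RATE driven high   */
    decode_status(0x90u);  /* what I read with DEC_RATE left floating */
    return 0;
}
```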

Thanks for any help!

  • Analog Employees reply, Feb 26, 2018 9:01 AM:

    Hi Alaney,

    To run the AD7765 at 128x, DEC_RATE should be tied high to DVDD. What is the frequency of your master clock? The ODR will be higher at the lower decimation rate (128x). What decimation setting is used in the picture above?

    Please do not confuse the decimation rate status bit in the status register with the DEC_RATE pin setting.

    Regards,

    Jonathan

  • Are there any Analog Devices support staff able to look into this for me please? (I posted this Friday morning)

    Or am I better off raising a case to get support?

    Other things I have tried since my last post:

    Following an old post on this forum, I have ensured the following (a rough sketch of the sequence is shown after the list):

    1) NSYNC and NFSI are both held high while NRESET transitions high.

    2) When asserted, NRESET and NSYNC are held active for hundreds of MCLK cycles.

    3) I have checked that NRESET and NSYNC meet adequate setup/hold times relative to MCLK edges.
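
    Here is the rough ordering I am following for points 1) and 2). The gpio_set()/wait_mclk_cycles() helpers are stand-ins for illustration only (my actual sequencing is done in FPGA logic), and the cycle counts are just "hundreds of MCLK cycles", not values from the data sheet:

    ```c
    /* Sketch of the AD7765 reset/sync bring-up order described above.
     * gpio_set() and wait_mclk_cycles() are hypothetical helpers, stubbed with
     * prints; the real sequencing lives in FPGA logic, not C. */
    #include <stdbool.h>
    #include <stdio.h>

    static void gpio_set(const char *pin, bool level) { printf("%s -> %d\n", pin, level); }
    static void wait_mclk_cycles(unsigned cycles)     { printf("wait %u MCLK cycles\n", cycles); }

    static void ad7765_bringup(void)
    {
        /* 1) NSYNC and NFSI must both be high before NRESET is released. */
        gpio_set("NSYNC", true);
        gpio_set("NFSI",  true);

        /* 2) Hold NRESET active (low) for hundreds of MCLK cycles, then release it. */
        gpio_set("NRESET", false);
        wait_mclk_cycles(500);
        gpio_set("NRESET", true);

        /* Follow with a long NSYNC pulse to synchronise the four daisy-chained devices. */
        wait_mclk_cycles(500);
        gpio_set("NSYNC", false);
        wait_mclk_cycles(500);
        gpio_set("NSYNC", true);

        /* 3) Setup/hold against MCLK edges is a timing-constraint issue, not shown here. */
    }

    int main(void) { ad7765_bringup(); return 0; }
    ```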

  • Hi Jonathan,

    Thanks for the reply!

    My master clock is 31.5 MHz, and the picture above was captured with the part set up for 128x decimation, i.e. with the DEC_RATE pin driven high. I am a bit confused by your last sentence: is the "DEC_RATE1" status bit not reflecting what the DEC_RATE pin is set to?

    With regard to the ODR, when I compare traces with the ADC set up for 256x versus 128x, the timing looks exactly the same. The only difference is the value of the "DEC_RATE1" status bit.

    Am I correct that in 128x mode there should be no gaps between the CH1-CH4 data bursts, with NFSO occurring at double the frequency?
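
    For what it's worth, this is the back-of-the-envelope rate check I am working from. It assumes ICLK = MCLK / 2 and ODR = ICLK / decimation, which is my reading of the AD7760-family behaviour rather than something I have confirmed for the AD7765, so please correct me if that assumption is wrong:

    ```c
    /* Back-of-the-envelope output-data-rate check for MCLK = 31.5 MHz.
     * ASSUMPTION: ICLK = MCLK / 2 and ODR = ICLK / decimation (AD7760-family
     * style); verify against the AD7765 data sheet. */
    #include <stdio.h>

    int main(void)
    {
        const double mclk_hz = 31.5e6;
        const double iclk_hz = mclk_hz / 2.0;      /* assumed internal clock */
        const int decimations[] = { 256, 128 };

        for (int i = 0; i < 2; i++) {
            double odr_hz = iclk_hz / decimations[i];
            printf("%dx decimation: ODR ~= %.2f kHz, frame period ~= %.2f us\n",
                   decimations[i], odr_hz / 1e3, 1e6 / odr_hz);
        }
        return 0;
    }
    ```

    On those assumptions I would expect NFSO roughly every 16 us in 256x mode and every 8 us in 128x mode, which is why I expect the CH1-CH4 bursts to run back to back at 128x.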

    Regards,

    Alaney

  • Right, I have it working now! And just in case someone has a similar problem in the future, I will explain the root cause...

    There was one issue I was already aware of on my board, but I was waiting for it to be modified and was working on the assumption that it would not fix this problem: the FPGA bank connected to the ADCs was set up for 1.8 V I/O instead of 2.5 V.

    Your comment about not confusing the "DEC_RATE1" status bit with the DEC_RATE pin was interesting, because it is exactly what made me think the change to 2.5 V would not fix the problem: I had assumed the "DEC_RATE1" status bit proved the ADC really was in 128x mode. I guess on chip the DEC_RATE pin feeds two separate blocks, and the block that generates the "DEC_RATE1" status accepted my 1.8 V drive as a logic high, while the block that actually sets the decimation rate did not?
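
    For anyone hitting the same thing, the numbers make it clear why 1.8 V drive was marginal. This uses the generic 0.7 x DVDD rule of thumb for VIH(min), which is an assumption on my part and not the AD7765's actual spec, so check the data sheet figure:

    ```c
    /* Quick input-high margin check for a 1.8 V FPGA bank driving a 2.5 V-supplied ADC.
     * ASSUMPTION: VIH(min) ~= 0.7 * DVDD (generic CMOS rule of thumb, not the
     * AD7765 data sheet number). */
    #include <stdio.h>

    int main(void)
    {
        const double dvdd     = 2.5;            /* ADC interface supply (V)  */
        const double vih_min  = 0.7 * dvdd;     /* assumed VIH(min) = 1.75 V */
        const double drive_18 = 1.8;            /* FPGA bank at 1.8 V I/O    */
        const double drive_25 = 2.5;            /* FPGA bank at 2.5 V I/O    */

        printf("assumed VIH(min) = %.2f V\n", vih_min);
        printf("1.8 V drive margin: %+.2f V\n", drive_18 - vih_min);
        printf("2.5 V drive margin: %+.2f V\n", drive_25 - vih_min);
        return 0;
    }
    ```

    A margin of only a few tens of millivolts is exactly the sort of level that some on-chip inputs will register as high and others will not.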

    You could argue that this is not a particularly fault-tolerant ASIC design, but at this point I don't really care.