AD7606C Max Input External Resistor Value

Hi, I want to measure a 50 V differential voltage, so I need to scale it down to the ADC's input voltage range. Are there any considerations for determining the resistor values of an external resistor divider? I think the system's input impedance should be as high as possible, but too high a value will add noise to the system, and if the AD7606C has a minimum current requirement, my system may not work. What would you recommend: precision resistors, or maybe an external op amp?

  •  Analog Employees, on Sep 16, 2021 12:03 PM

    Hi obione,

    Using a series resistor to increase the range should be OK, as long as the voltage and current at the AD7606C pins do not exceed the limits in the Absolute Maximum Ratings table.

    Regards,

    Lluis. 
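
    A minimal sketch of that check, assuming the AD7606C's typical 1.2 MΩ input impedance forms the bottom leg of the divider (the values are illustrative, not a verified design):

    ```python
    # Sketch: voltage and current at an AD7606C pin with an external
    # series resistor. R_IN = 1.2 Mohm is the datasheet's typical input
    # impedance; real parts can sit anywhere down to the 1.0 Mohm minimum.

    R_IN = 1.2e6  # typical AD7606C input impedance, ohms

    def pin_conditions(v_in, r_ext, r_in=R_IN):
        """Return (pin voltage, input current) for v_in through r_ext."""
        v_pin = v_in * r_in / (r_in + r_ext)
        i_in = v_in / (r_in + r_ext)
        return v_pin, i_in

    # Example: 50 V through 1.8 Mohm of total series resistance.
    v_pin, i_in = pin_conditions(50.0, 1.8e6)
    print(f"{v_pin:.1f} V, {i_in * 1e6:.1f} uA")  # -> 20.0 V, 16.7 uA
    # Compare both figures against the Absolute Maximum Ratings table.
    ```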

  • Thank you Lluis. If I understand correctly, I can use two resistors in series with the ADC's input impedance. However, AFAIK the ADC input impedance is not fixed at 1.2 MΩ, so the voltage at the ADC input can vary. Also, I would need to use larger resistor values, but they will add noise to the system. Am I right?

    [Image: divider]
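
    For a rough sense of the noise penalty, the thermal (Johnson) noise of the added resistance can be estimated; the temperature and bandwidth below are illustrative assumptions, not datasheet figures:

    ```python
    import math

    # Thermal (Johnson) noise of a resistance: v_n = sqrt(4 k T R B).
    k = 1.380649e-23  # Boltzmann constant, J/K
    T = 300.0         # assumed temperature, K
    B = 500e3         # assumed noise bandwidth, Hz

    def johnson_noise_vrms(r_ohms, t=T, b=B):
        return math.sqrt(4 * k * t * r_ohms * b)

    # ~100 kohm of effective source resistance:
    print(f"{johnson_noise_vrms(100e3) * 1e6:.1f} uV rms")  # -> ~28.8 uV rms
    # For scale: one 16-bit LSB on a +/-10 V range is 20 / 65536 ~= 305 uV.
    ```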

  • Is there any answer yet? I think too high a resistance generates an error due to the ADC input current, but I'm not sure. How should I reduce the voltage level properly?

  •  Analog Employees, on Sep 28, 2021 2:44 PM, in reply to obione

    Hi obione,

    They may add noise and will contribute to gain drift, but they are still a valid option. Another option is to use a resistive voltage divider. I'm afraid I don't have bench results to show you the performance you could achieve, so I would recommend purchasing an evaluation board and performing the experiments that would best serve your needs.

    In any case, your 2.4 MΩ resistor would scale a 30 V differential signal down to the 10 V differential analog input range.

    To scale a 50 V differential signal down to the 20 V differential range, you would need an external resistance of 1.5 × RIN.

    Regards,

    Lluis.
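
    The two figures above follow from the divider formed by the external resistance and the ADC's internal impedance; a small sketch, assuming the typical 1.2 MΩ value, that solves for the required external resistance:

    ```python
    R_IN = 1.2e6  # typical AD7606C input impedance, ohms

    def r_ext_for(v_in, v_out, r_in=R_IN):
        """Series resistance that scales v_in down to v_out through r_in."""
        return r_in * (v_in / v_out - 1)

    print(r_ext_for(30.0, 10.0) / 1e6)   # -> 2.4 (Mohm, matches the reply)
    print(r_ext_for(50.0, 20.0) / R_IN)  # -> 1.5 (x R_IN, matches the reply)
    ```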

  • Thank you for the suggestion. In the data sheet there is a graph that shows the THD variation with different source resistor values. I want to use a resistive voltage divider as you said. I'm going to use 200 kΩ-100 kΩ-200 kΩ values, so the source resistance will be around 100 kΩ (see the first sketch at the end of this post). This means my THD will increase considerably. I want at least a 500 kΩ input impedance, but I'm also worried about the THD value. What should I do? For instance, if I use oversampling or digital filtering, could my THD become acceptable?

    Also, I have a related question:

    There are minimum 1 MΩ and typical 1.2 MΩ values on page 6 of the datasheet. According to note 4 and the system gain calibration settings, if I understood correctly, this 1-1.2 MΩ range changes with the gain calibration setting. So 1.2 MΩ is the maximum impedance value, and if I don't use the gain calibration feature, the typical input impedance is 1.2 MΩ. This value is factory trimmed and can change by at most 25 ppm/°C.

    I'm going to use a resistor divider, so I need to know how the input impedance behaves. I will calculate the resistor values for a 1.2 MΩ input impedance, and maybe I will use the gain calibration feature to balance the resistor divider (see the sketches below). Do you think this is OK?
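
    A sketch of the 200 kΩ-100 kΩ-200 kΩ divider mentioned above, ignoring the ADC's own ~1.2 MΩ loading for simplicity; the Thevenin source resistance is the figure to read the datasheet's THD-vs-source-resistance plot against:

    ```python
    # Sketch: 200k-100k-200k divider, ADC loading neglected for simplicity
    # (including the ~1.2 Mohm R_IN lowers the gain slightly).
    R_TOP = 200e3    # series resistor on each input leg, ohms
    R_SHUNT = 100e3  # shunt resistor across the ADC inputs, ohms

    total = 2 * R_TOP + R_SHUNT            # differential input impedance
    gain = R_SHUNT / total                 # 0.2 -> 50 V scales to 10 V
    r_src = (2 * R_TOP) * R_SHUNT / total  # Thevenin source resistance

    print(f"gain = {gain}, input impedance = {total / 1e3:.0f} kohm")
    print(f"source resistance ~= {r_src / 1e3:.0f} kohm")  # -> 80 kohm
    ```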
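    And a sketch of why the 1.0-1.2 MΩ spread of RIN matters, under the worst-case assumption that the internal impedance forms the bottom leg of the divider; with an external shunt, as in the divider above, RIN only appears in parallel with the 100 kΩ leg and the same arithmetic gives roughly a 1 % effect:

    ```python
    # Sketch: gain sensitivity when the divider relies on the ADC's
    # internal R_IN (1.0 Mohm min, 1.2 Mohm typ per the datasheet).
    R_EXT = 1.8e6  # external series resistance sized for R_IN = 1.2 Mohm

    def gain(r_in, r_ext=R_EXT):
        return r_in / (r_in + r_ext)

    g_typ = gain(1.2e6)  # 0.400 -> 50 V scales to 20.0 V
    g_min = gain(1.0e6)  # 0.357 -> 50 V scales to ~17.9 V
    print(f"gain error: {(g_min - g_typ) / g_typ * 100:.1f} %")  # -> -10.7 %
    # A per-unit system gain calibration can remove this static error;
    # the 25 ppm/degC drift of R_IN would remain after calibration.
    ```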