
AD9106: CAL_CLK_DIV and BGDR bits


I am working with the AD9106 for a new product line. According to the datasheet, BGDR adjusts the reference voltage value by means of six bits [5:0]. However, there is no clear relation given between the binary code and the final voltage value applied.

The same happens with the CAL_CLK_DIV bits. Here the clock divider value is set by means of three bits [2:0], but the relation between the binary code and the final divider value applied is confusing. It remains confusing in the LabVIEW software provided to work with the EVAL-AD9106 board: there are only four selectable values (CLK/32, CLK/64, CLK/128, and CLK/256), and none of them allows achieving a calibration frequency below 500 kHz if CLK = 180 MHz, which is a constraint given in the datasheet.
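To make the constraint concrete, here is a quick sanity check (a minimal sketch; the divider list is the four options the LabVIEW tool offers, plus the datasheet's default of 512):

```python
# Which calibration clock dividers keep f_cal below 500 kHz
# when the DAC clock is 180 MHz (the datasheet constraint)?
CLK_HZ = 180e6
F_CAL_MAX_HZ = 500e3

dividers = (32, 64, 128, 256, 512)
valid = [d for d in dividers if CLK_HZ / d < F_CAL_MAX_HZ]
print(valid)  # only the /512 option stays below 500 kHz (351.6 kHz)
```

CLK/256 already gives 703 kHz, so all four LabVIEW options exceed the limit; only the datasheet default of /512 satisfies it.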

Best regards,

  • Hi C.J.,

    The BGDR code adjusts the reference voltage by adding/subtracting up to 20% of the nominal voltage output of the REFIO pin. The nominal voltage is approximately 1 V and corresponds to BGDR = 0x00 (zero code). Since the BGDR code is in two's complement format, the first half of the 6-bit range (0 to 31 in decimal) adds up to 20% to the zero-code value (0.625% of the zero-code value per bit increment). The other half (32 to 63) subtracts from the zero-code value in the same steps. Figure 37 of the datasheet makes visualization easier.

    For the CAL_CLK_DIV bits, have you already tried to test the auto-calibration using the four options (CLK/32, CLK/64, CLK/128, and CLK/256)? I think there's a discrepancy in the options offered by the LabVIEW software, since the default divider is 512 (based on the datasheet). Kindly update me on what happens when you choose one of the options while we're looking into it.
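    The BGDR mapping described above can be sketched as code. This is a rough illustration only; the function name and the 1 V nominal are assumptions for the sketch, not taken from the register map:

```python
def bgdr_voltage(code, v_nominal=1.0):
    """Approximate REFIO voltage for a 6-bit two's complement BGDR code.

    Each LSB shifts the output by 0.625% of the nominal (~1 V) value:
    codes 0..31 add up to +20%, codes 32..63 (i.e., -32..-1) subtract.
    """
    code &= 0x3F                                  # keep only bits [5:0]
    signed = code - 64 if code >= 32 else code    # two's complement decode
    return v_nominal * (1 + signed * 0.00625)
```

    For example, code 0x00 gives the nominal ~1 V, code 31 gives roughly +19.4%, and code 0x20 (decimal 32, i.e., -32) gives the -20% extreme, consistent with Figure 37.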



  • Hi there,

    Any feedback about the CAL_CLK_DIV bits issue?

