Post Go back to editing

# AuxiliaryADC decimation clock and resolution

Category: Hardware

Doubt 1 -

The AuxADC clock frequency is set to 1024 MHz and the AuxADC decimation to 1024. We are not able to understand the derivation of the 3.906 kHz clock at the CTRL_OUT0 pin (AuxADC decimation filter clock). What is the calculation behind it? Kindly elaborate.
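For reference, a short sketch of the arithmetic we tried (plain division of the clock by the decimation factor), which does not reproduce the observed 3.906 kHz. The extra factor of 256 shown below is simply the ratio between our expectation and the measurement, not a divider documented anywhere that we know of:

```python
# Sketch of the decimation-clock arithmetic (configured values from above).
aux_adc_clk_hz = 1024e6   # configured AuxADC clock, 1024 MHz
decimation = 1024         # configured AuxADC decimation

naive_out_hz = aux_adc_clk_hz / decimation
print(naive_out_hz)       # 1 MHz -- not the observed value

observed_hz = 3906.25     # ~3.906 kHz measured on CTRL_OUT0
extra_factor = naive_out_hz / observed_hz
print(extra_factor)       # 256 -- an additional fixed divider would explain it,
                          # but that is only a guess on our part
```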

Doubt 2 -

According to the datasheet, the AuxADC is a 12-bit ADC. By my understanding:

the resolution should be (1.3 − 0.05) V (maximum allowed input range) / 2^12 ≈ 0.305 mV

But during lab testing we are seeing an offset as high as 0.2 V between the applied input voltage and the value read back from the AuxADC (registers 0x01E and 0x01F).

Is there a need for calibration or any other offset input to ADC ?

• Moving this thread to  .

• the resolution should be (1.3 − 0.05) V (maximum allowed input range) / 2^12 ≈ 0.305 mV

Yes, the resolution calculation is correct.
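As a quick sanity check, the same LSB size can be computed directly from the numbers in the question:

```python
# LSB size of the 12-bit AuxADC over its allowed input range
v_max, v_min = 1.3, 0.05        # allowed input range in volts (from the question)
bits = 12
lsb_v = (v_max - v_min) / 2**bits
print(f"{lsb_v * 1e3:.3f} mV")  # 0.305 mV
```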

during lab testing we are seeing an offset as high as 0.2 V between the applied input voltage and the value read back from the AuxADC

The AuxADC has nonlinearities and an offset, which also depend on the input voltage. So you need to do a one-time factory calibration to compensate for the offset in your readings.
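A minimal sketch of what such a one-time calibration could look like in host software, assuming a simple linear (gain + offset) error model. The ideal transfer function below uses the range from the question; the two reference voltages and the raw-code source are placeholders, not an actual driver API:

```python
def code_to_volts(code, lsb=1.25 / 4096, v_min=0.05):
    """Convert a raw 12-bit AuxADC code to volts using the ideal transfer."""
    return v_min + code * lsb

def calibrate(code_lo, v_lo, code_hi, v_hi):
    """Derive a linear correction from two known reference voltages.

    code_lo/code_hi: raw codes read back while v_lo/v_hi were applied.
    Returns (gain, offset) such that v = gain * ideal_volts + offset.
    """
    raw_lo, raw_hi = code_to_volts(code_lo), code_to_volts(code_hi)
    gain = (v_hi - v_lo) / (raw_hi - raw_lo)
    offset = v_lo - gain * raw_lo
    return gain, offset

def corrected_volts(code, gain, offset):
    """Apply the stored calibration to a raw AuxADC code."""
    return gain * code_to_volts(code) + offset
```

The (gain, offset) pair would be measured once per unit at the factory, stored in non-volatile memory, and applied to every subsequent reading.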

• Thank you for the reply to Doubt 2.

Kindly clarify Doubt 1 below as well:

The AuxADC clock frequency is set to 1024 MHz and the AuxADC decimation to 1024. We are not able to understand the derivation of the 3.906 kHz clock at the CTRL_OUT0 pin (AuxADC decimation filter clock). What is the calculation behind it? Kindly elaborate.

• What is your requirement? Why do you want to monitor the decimation clock?