I don't understand from the AD9874 datasheet how to calculate absolute signal strength in the different modes (without VGA, with VGA/DVGA, with AGC enabled) when I know the values of the ATTN and SSI fields. Can anybody write a formula to calculate it for each mode?

Hello,

Let's assume that only the VGA is used without AGC, so one can set the maximum input level into the device (the IF clip point), which is -2 dBFS measured on ADC FFT data. Note that driving the input beyond this level can make the sigma-delta ADC unstable. The SSI field is a linear estimate of the received input signal strength, with a maximum value of 60 corresponding to 0 dBFS on an ADC complex FFT.

According to the datasheet, the IF clip point is -31 dBm at minimum VGA attenuation and -19 dBm at maximum VGA attenuation. Thus an unmodulated sinusoidal input of -29 dBm (minimum attenuation) or -17 dBm (maximum attenuation) would produce a 0 dBFS signal on the FFT (assuming no instability in the sigma-delta ADC) and a code of 60 in the SSI field. Based on this understanding, an estimate of the signal level would be:

PIN_dBm = VGA ATTEN_dB + (-31 dBm + 2 dB) + 20*log10(SSI_Field/60)
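The formula above can be sketched as a small helper function. This is an illustrative sketch, not code from the datasheet: the function name and argument names are my own, and it assumes the -31 dBm clip point / -2 dBFS headroom figures quoted above.

```python
import math

def input_power_dbm(ssi_field, vga_atten_db,
                    clip_min_atten_dbm=-31.0, headroom_db=2.0):
    """Estimate absolute input power (dBm) from the 6-bit SSI field.

    ssi_field:     SSI reading, 1..60 (60 = 0 dBFS on the complex FFT).
    vga_atten_db:  current VGA attenuation in dB (0 dB = minimum attenuation).
    """
    if not 1 <= ssi_field <= 60:
        raise ValueError("SSI field must be in 1..60")
    # The -31 dBm clip point at minimum attenuation sits at -2 dBFS,
    # so 0 dBFS corresponds to -31 dBm + 2 dB = -29 dBm.
    full_scale_dbm = clip_min_atten_dbm + headroom_db
    return vga_atten_db + full_scale_dbm + 20.0 * math.log10(ssi_field / 60.0)
```

For example, a full-scale reading (SSI = 60) at minimum VGA attenuation gives -29 dBm, and halving the SSI code drops the estimate by about 6 dB, as expected for a linear field.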

With AGC enabled, one can always read back the AGC field via the SPI port to determine what attenuation setting the AGC has applied to the VGA. Refer to Figure 19 of the AD9874 datasheet.

A linear estimate of the received signal strength is performed at the output of the first decimation stage (DEC1) and at the output of the DVGA (if enabled), as discussed in the AGC section. This data is available as a 6-bit RSSI field within an SSI frame, with 60 corresponding to a full-scale signal for a given AGC attenuation. It is used with the 8-bit attenuation field (or AGCG attenuation setting) to determine the absolute signal strength.