Hello,
I'm using the ADRV9003 on a custom board and I have some questions regarding quantization and analog noise.
Data port sample rate: 24 kSps.
Interface rate: 24 kSps.
RF bandwidth: 12 kHz.
16-bit I/Q.
AGC: automatic.
Slicer operation: automatic.
Gain: compensated/corrected (tried both).
I measured the RSSI with a 50 Ω termination at the input of the ADRV9003.
In AGC mode the gain index stays constant at 255, because the input power is low.
With the API function I got an RSSI of about -108 dBFS, which corresponds to 8.6 dBm - 108 dB = -99.4 dBm.
Does that make sense? Shouldn't the RSSI be lower when no signal is applied at the input?
I assume the RSSI is measured over the configured bandwidth, which is 12 kHz in my application.
The noise density quoted in the datasheet is -142 dBm/Hz. Integrated over 12 kHz, that gives -142 dBm/Hz + 10·log10(12,000 Hz) ≈ -101.2 dBm.
The difference is 1.8 dB; is that acceptable?
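For reference, the arithmetic above can be reproduced with a short script (the 8.6 dBm value is my assumed 0 dBFS reference level for this gain index; the -142 dBm/Hz figure is from the datasheet):

```python
import math

DATASHEET_NOISE_DBM_HZ = -142.0   # noise density from the datasheet
BW_HZ = 12_000                    # configured RF bandwidth
FULL_SCALE_DBM = 8.6              # assumed 0 dBFS reference level
RSSI_DBFS = -108.0                # value reported by the RSSI API

# Noise density integrated over the channel bandwidth.
integrated_noise_dbm = DATASHEET_NOISE_DBM_HZ + 10 * math.log10(BW_HZ)

# Measured RSSI converted from dBFS to absolute power.
measured_dbm = FULL_SCALE_DBM + RSSI_DBFS

print(f"expected noise floor: {integrated_noise_dbm:.1f} dBm")  # -101.2
print(f"measured RSSI:        {measured_dbm:.1f} dBm")          # -99.4
print(f"difference:           {measured_dbm - integrated_noise_dbm:.1f} dB")  # 1.8
```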
When I observe the I/Q data, only the LSB is toggling, with the slicer at +18 dB.
That looks like quantization noise, but as mentioned in the datasheet, the analog noise should be greater than the
quantization noise when the slicer is at +18 dB.
If I lower the slicer gain (+12 dB, +6 dB, ...), the I/Q data becomes all zeros.
Is something configured incorrectly?
I would expect to see the analog noise in any case, especially with the slicer at +18 dB.
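As a rough back-of-envelope check, here is how I estimated the noise amplitude in LSBs. This assumes the slicer simply scales the signal by its gain before 16-bit truncation, and takes the measured -108 dBFS as the noise power; it is a simplified model, not the actual ADRV9003 data path:

```python
FULL_SCALE = 2**15   # max code magnitude for 16-bit signed I/Q
RSSI_DBFS = -108.0   # measured noise power relative to full scale

def noise_rms_lsb(slicer_gain_db: float) -> float:
    """Approximate RMS noise amplitude in LSBs after applying the
    slicer gain (simplified model: slicer = pure scaling)."""
    amp_rel_fs = 10 ** ((RSSI_DBFS + slicer_gain_db) / 20.0)
    return amp_rel_fs * FULL_SCALE

for gain in (18, 12, 6):
    print(f"slicer +{gain} dB -> ~{noise_rms_lsb(gain):.2f} LSB RMS")
```

With +18 dB this gives roughly 1 LSB RMS, which matches the toggling LSB I see, and below +18 dB it drops under half an LSB, which could explain the all-zero data. But that would mean the quantization noise is masking the analog noise, which contradicts my reading of the datasheet.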
Thank you in advance.
Avner