I'm currently using the ADI IIO Oscilloscope application with the AD9361, and I'm attempting to convert the RX 1 RSSI value to an actual dBm value on the RX 1A input. My approach is to inject a signal of known level and use the reported RSSI to determine a correction factor for my calculations. However, as I increase the input signal level, the RSSI (dB) value displayed in the RX 1 Receive Chain box decreases. This is the opposite of what I expected.
Shouldn't the RSSI value go up and down as the RX 1 input signal goes up and down, respectively? Or does the RSSI displayed in the RX 1 box of the IIO Oscilloscope GUI actually represent the hardware gain and the associated RSSI seen by the internal AGC circuitry?
Am I following the wrong method for determining the correction factor between the RSSI value reported by the IIO Oscilloscope GUI and the actual signal level at the input of the AD9361?
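For reference, this is roughly the one-point calibration I had in mind. It's a minimal sketch, and the sign convention is exactly what I'm unsure about: here I assume the reported RSSI acts like "dB below some internal reference", which would explain why it falls as the input level rises. The function names and the example numbers are mine, not from the GUI or driver.

```python
# One-point RSSI-to-dBm calibration sketch.
# Assumption under test: reported RSSI behaves as dB below a fixed
# reference, so estimated input power = correction - rssi.

def correction_factor(known_input_dbm: float, reported_rssi_db: float) -> float:
    """Correction factor from a single known injected signal level."""
    return known_input_dbm + reported_rssi_db

def rssi_to_dbm(reported_rssi_db: float, correction: float) -> float:
    """Estimate input power (dBm) from a reported RSSI reading."""
    return correction - reported_rssi_db

# Example (made-up numbers): inject -30 dBm, GUI reports RSSI = 52.25 dB.
c = correction_factor(-30.0, 52.25)
print(rssi_to_dbm(52.25, c))  # recovers -30.0 at the calibration point
print(rssi_to_dbm(42.25, c))  # RSSI dropped 10 dB -> input rose to -20.0
```

If the RSSI instead tracked the input directly (going up as the signal goes up), the conversion would be `rssi + correction` rather than `correction - rssi`, which is why I'd like to confirm what the displayed value actually represents.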
Any feedback would be appreciated and I look forward to your response.