I'm trying to find out how to adjust the AGC on the AD9361 to keep noise from driving down the signal level.
Presently, without noise the signal sits at approx 1/4 of full scale (-12 dB), which is the level I want to maintain when noise is introduced. When I use an Eb/N0 of 10 dB (approx 5 dB SNR), the AGC reacts to the noise peaks and reduces the gain by approx 15 dB, putting the signal at approx 27 dB below full scale. I would strongly prefer to clip the noise peaks and keep the signal as close to 12 dB below full scale as possible. I looked through the AGC documentation and could not tell which parameters to adjust to achieve this.
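For context, the approx 5 dB SNR figure is consistent with the Eb/N0 scaled by the ratio of data bandwidth to noise bandwidth, assuming the 300 kHz IF Butterworth filter sets the noise bandwidth seen by the AGC. A minimal sketch of the arithmetic:

```python
import math

# Values taken from the settings below; the noise-bandwidth assumption
# (300 kHz IF filter) is mine.
ebn0_db = 10.0        # Eb/N0 of the test condition
signal_bw_hz = 100e3  # single-sided signal bandwidth
noise_bw_hz = 300e3   # assumed noise bandwidth (IF Butterworth filter)

# SNR in the IF bandwidth: Eb/N0 reduced by the bandwidth ratio
snr_db = ebn0_db + 10 * math.log10(signal_bw_hz / noise_bw_hz)
print(f"SNR in IF BW: {snr_db:.1f} dB")  # approx 5 dB, matching the observation

# Gain budget: nominal -12 dBFS signal, AGC backs off approx 15 dB on noise peaks
nominal_dbfs = -12.0
agc_backoff_db = 15.0
print(f"Resulting signal level: {nominal_dbfs - agc_backoff_db:.0f} dBFS")  # -27 dBFS
```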
Could you advise me on how to modify the AGC settings to reduce this noise-induced signal compression?
Current settings:
Signal BW: 100 kHz (single-sided). Due to our transmitter requirements, we operate with the 20 MHz clock, and our logic accumulates the Rx samples into a slower-rate data stream.
IF BW: 300 kHz, set by the AD9361 analog Butterworth filter.
RF signal level: the effect happens over a wide range of RSSI.
AGC settings: defaults, with mode = 1, 2, or 3.
Script sent to the AD9361 using the Analog Devices Driver app:
#,"Command (r,w)",Address (hex),Data (hex),Delay (ms),Comment
1,aw,1,230000000,100, tx frequency
1,aw,2, 20000000,100, data rate
1,aw,3, 5000000,100, tx bw
1,aw,4, 12000,100, tx attenuator
1,aw,5, 1,100, tx FIR
1,aw,7,210000000,100, rx RF frequency
1,aw,8, 20000000,100, rx sample freq
1,aw,9, 230000,100, rx bw
1,aw,10, 3,100, rx agc mode (hybrid)
1,aw,11, 90,100, rx rf gain
1,aw,12, 1,100, rx FIR enb
1,aw,13, 1,100, RX1, rssi
1,aw,14, 0,100, loopback bb
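For reference, on platforms that also expose the AD9361 through the Linux IIO driver, I could fall back to manual gain control as a stopgap while the AGC question is open. This is only a sketch; the iio:device index is an assumption that will differ per system:

```shell
# Assumed device index; find yours with: grep -l ad9361-phy /sys/bus/iio/devices/*/name
PHY=/sys/bus/iio/devices/iio:device1

# Put RX1 into manual gain control instead of AGC
echo manual > $PHY/in_voltage0_gain_control_mode

# Pin the RX gain (dB); pick a value that keeps the clean signal near -12 dBFS
echo 45 > $PHY/in_voltage0_hardwaregain
```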