I am using the Analog Devices AD603 variable-gain amp in an HF receiver. There are two in the IF amp, providing both gain and AGC. I run the amps from a single regulated 10 V supply: the V− pin is connected to the +10 V return, and the common (COMM) terminal is biased at half the supply voltage.
From the data sheet, the only constraint on the gain-control voltage is that it be kept within the common-mode range (−1.2 V to +2.0 V, assuming ±5 V supplies) of the gain-control interface.
My question involves the gain-control interface. On the spec sheet it looks like an op-amp input, and the spec says the common-mode range is −1.2 V to +2.0 V. The AGC operates sequentially: the output stage's gain is decreased first, then the input stage's. The AGC control-voltage range is 1 V at each stage, so the bias (AGC reference) on the output amp is at +5 V and on the input stage at +6 V. As the AGC voltage goes from 4.5 to 5.5 V the output-stage gain is decreased, and as it increases from 5.5 to 6.5 V the first-stage gain decreases. This works as designed, and I get over 80 dB of AGC control. But for voltages below approximately 4 V and above 7 V I see some strange results, as if breakdown or something is occurring that keeps the circuit from working correctly. The no-signal input to the AGC amp is supposed to be 4.2 – 4.4 V.
I don’t quite understand the common-mode range spec. I assumed I could set the AGC reference voltages anywhere between, say, 2 and 8 V, staying at least 2 V away from ground and the supply rail, with the AGC voltage swinging over a 2 V range. Is this affected by running from a single supply? Most of the examples use ±5 V supplies. Is the spec a limit on the differential between the AGC-amp input pins, and is it also tied to an issue with a single supply?
If referenced to the common terminal (ground), these inputs will sit several volts above ground, and differentially the inputs could see voltages that exceed the common-mode spec.
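For what it's worth, here is the arithmetic I keep coming back to. If I assume the −1.2 V / +2.0 V limits are referenced to the COMM pin (which sits at 0 V with ±5 V supplies but at +5 V in my circuit), translating them to my single-supply frame is just an offset:

```python
# Translate the datasheet common-mode limits (specified with COMM at 0 V,
# i.e. +/-5 V supplies) into my single-supply frame, where COMM is biased
# at +5 V above ground. This assumes the limits track the COMM pin.

COMM = 5.0                 # COMM pin bias in my circuit, volts above ground
CM_LO, CM_HI = -1.2, 2.0   # datasheet common-mode range, relative to COMM

lo = COMM + CM_LO          # lower limit, volts above ground
hi = COMM + CM_HI          # upper limit, volts above ground
print(f"Allowed gain-control voltage: {lo:.1f} V to {hi:.1f} V")
```

That works out to roughly 3.8 V to 7.0 V above ground, which is suspiciously close to the ~4 V and ~7 V points where I start seeing the strange behavior, so maybe the spec really is referenced to COMM rather than to ground. Is that the right reading?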