My PCB is designed so the LT5537 detectors on the board can run from either a 3.3 V or a 5.0 V supply. I have been using the 3.3 V supply for a year, but recently found a spur on that rail. I would like to switch to the 5.0 V supply, but I can't find any information on how the detector transfer response changes as a function of supply voltage. I ran a test with each supply and saw the detected output voltage rise by roughly 6-13% (about 10% on average, largest at low signal levels) over the 3 to 35 MHz frequency range and input power levels from -60 to 0 dBm. Is this expected?
I put 120 ohms in series with the 5.0 V supply so that, after the drop created by the roughly 15 mA supply current, the detector sees about 3.3 V at its supply pin. This brought the response back to the original performance with the 3.3 V supply taken directly from the voltage regulator. Is there any downside to running the detector with a series resistor on the supply line in this manner? The resistor sits before the 0.01 uF and 1 nF bypass capacitors, so those are still right at the supply pin. But I'm unsure whether the increased source resistance in series with the 5.0 V regulator (an LT3045) would create any problems.
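For reference, here is the Ohm's-law math behind the resistor value as a quick Python sketch (it assumes the detector draws a roughly constant 15 mA, per the figure above, and ignores any tolerance on the LT3045 output; the resistor values are just the ideal, as-populated, and nominal choices):

# Series dropper sizing: V_pin = V_supply - I_detector * R_series.
# Assumes a roughly constant 15 mA detector supply current.
V_SUPPLY = 5.0    # LT3045 output, volts
I_DET = 0.015     # LT5537 supply current, amps (~15 mA)

# (5.0 - 3.3) / 0.015 ~= 113 ohm would give exactly 3.3 V at the pin.
for r_series in (113, 118, 120):
    drop = I_DET * r_series
    print(f"R = {r_series} ohm -> drop = {drop:.2f} V, pin voltage = {V_SUPPLY - drop:.2f} V")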
Here is the data I measured under the three conditions, in CSV format:
Freq (MHz),Pin (dBm),"Vout, 3.3 V (V)","Vout, 5.0 V (V)","Delta, 3.3 vs. 5.0","Vout, 5 V + 118 ohm series (V)"
3,-60,0.71,0.80,13%,0.72
3,-40,1.07,1.16,9%,1.07
3,-20,1.46,1.56,7%,1.47
3,0,1.83,1.93,6%,1.84
6,-60,0.72,0.81,13%,0.72
6,-40,1.08,1.17,9%,1.08
6,-20,1.47,1.57,7%,1.47
6,0,1.83,1.94,6%,1.84
14,-60,0.73,0.82,12%,0.73
14,-40,1.09,1.19,9%,1.10
14,-20,1.48,1.57,7%,1.48
14,0,1.84,1.94,6%,1.84
22,-60,0.73,0.82,12%,0.73
22,-40,1.09,1.19,9%,1.10
22,-20,1.48,1.58,7%,1.48
22,0,1.84,1.95,6%,1.85
32,-60,0.72,0.81,13%,0.72
32,-40,1.08,1.17,9%,1.08
32,-20,1.47,1.57,7%,1.48
32,0,1.84,1.95,6%,1.85
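And a short Python snippet to recompute the Delta column, along with the absolute offset in volts, from the data above (the file name detector_vout.csv is just a placeholder for wherever the pasted CSV is saved):

import csv

# Recompute the Delta column from the pasted measurements:
#   delta = (Vout at 5.0 V - Vout at 3.3 V) / Vout at 3.3 V
# and also print the absolute offset in volts.
# "detector_vout.csv" is a placeholder name for the CSV pasted above.
with open("detector_vout.csv", newline="") as f:
    for row in csv.DictReader(f):
        v33 = float(row["Vout, 3.3 V (V)"])
        v50 = float(row["Vout, 5.0 V (V)"])
        delta_pct = (v50 - v33) / v33 * 100.0
        print(f'{row["Freq (MHz)"]} MHz, {row["Pin (dBm)"]} dBm: '
              f'+{v50 - v33:.2f} V ({delta_pct:.0f}%)')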