I have an evaluation board (model PCB 105809) for a Hittite HMC439QS16G digital phase-frequency detector.
The board behaves differently from what the datasheet specifies, and I would like to ask what the problem might be.
In particular, I'm testing it by feeding two input signals, LO and RF, one at 5 dBm and the other at 7 dBm. LO is fixed at 350 MHz, while RF can be tuned from about 250 MHz to 500 MHz with a resolution of 1 kHz, so I can set RF to, e.g., 350.001 MHz. My problem is that NU stays at 5 V while ND oscillates between roughly 3.5 V and 4.5 V. These levels are independent of the frequency difference, while the oscillation period of course depends on it.
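To make the expected behaviour concrete, here is a minimal sketch of the relationship I'm assuming between the LO/RF offset and the period of the output oscillation: with a constant frequency difference the relative phase slips through a full cycle once every 1/|f_RF − f_LO| seconds, so a 1 kHz offset should give a 1 ms period at the PFD output.

```python
# Sketch: expected beat period at the PFD outputs for a given LO/RF offset.
# Frequencies taken from the setup described above.
f_lo = 350e6      # LO frequency, Hz (fixed)
f_rf = 350.001e6  # RF frequency, Hz (tuned 1 kHz above LO)

delta_f = abs(f_rf - f_lo)   # frequency offset: 1 kHz
beat_period = 1.0 / delta_f  # period of the phase slip: 1 ms

print(delta_f, beat_period)
```

This is just the arithmetic behind "the period of the oscillation depends on the frequency difference"; it doesn't model the detector's output levels.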
My questions are:
- Shouldn't the two outputs swap roles at some point? That is, shouldn't NU start to oscillate while ND stays at 5 V?
- Shouldn't the oscillations span from 3 V to 5 V? In principle, because of the frequency difference, the phase should be swept through a full 2π.
To investigate further, I swapped the two inputs, LO and RF, and in that case both NU and ND sit at 5 V with no oscillation at all. Does this mean something is broken?
I have also noticed something else that seems odd. The current the board draws from the power supply depends on the input termination I select on the oscilloscope observing NU and ND. With 1 MΩ terminations I see the signal levels mentioned above and the board draws 87.9 mA, whereas with 50 Ω terminations on both outputs the current rises to 155 mA. (The board has two 5 V supply terminals; this is the total current into both.) Why does this happen?
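As a sanity check on those numbers, here is a short sketch assuming the entire extra supply current flows through the two 50 Ω terminations (i.e. ignoring any change in the chip's internal dissipation). Under that assumption the measured 67 mA delta implies an average loaded output level of about 1.7 V per output, which would also explain why the levels seen at 1 MΩ cannot be taken at face value once the outputs are terminated.

```python
# Sanity check: does the supply-current difference match current delivered
# into the two 50-ohm scope terminations? Numbers are from the measurements
# above; the detector's output source impedance is ignored, so this is only
# an order-of-magnitude estimate.
R_LOAD = 50.0       # scope termination per output, ohms
i_1meg = 87.9e-3    # total supply current with 1 Mohm loads, A
i_50 = 155e-3       # total supply current with 50 ohm loads, A

i_extra = i_50 - i_1meg               # extra current, presumed to go into the loads
v_loaded_avg = i_extra * R_LOAD / 2   # implied average output level into 50 ohms

print(i_extra * 1e3, v_loaded_avg)
```

If the two outputs really sat at 4-5 V into 50 Ω, the extra current would be closer to 180 mA, so the smaller measured delta suggests the output levels sag significantly under a 50 Ω load.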
Thank you very much.