I'm going to use the AD8302 chip to measure the phase difference between voltage and current at the output of the radio transmitter.
I found in the ABSOLUTE MAXIMUM RATINGS that the input signal level should be no more than 0.707 V.
How strict is this limit?
Can I, for example, just limit the input current, or only make sure that the input voltage does not exceed the supply voltage?
Are there any examples or recommendations for protecting the inputs of the AD8302 in real hardware?
I have looked through various application examples, and none of them include any protective elements.
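For context, here is my rough estimate of how much attenuation a divider or coupler in front of the chip would need. The transmitter power and load impedance below are example values I picked, not anything from the datasheet; the only figure taken from my reading of the datasheet is the 0.707 V input limit mentioned above:

```python
import math

# Example values (assumptions, substitute your own):
R_LOAD = 50.0     # ohms, system impedance
V_IN_MAX = 0.707  # V rms, input limit quoted above
P_TX_W = 10.0     # W, example transmitter output power

# rms voltage at the transmitter output into 50 ohms
v_tx = math.sqrt(P_TX_W * R_LOAD)

# attenuation in dB needed to bring that down below the input limit
atten_db = 20 * math.log10(v_tx / V_IN_MAX)

print(f"V_tx = {v_tx:.1f} V rms, need at least {atten_db:.1f} dB attenuation")
```

So even a modest 10 W transmitter needs on the order of 30 dB of attenuation before the signal is safe for the chip, which is why I am asking whether additional clamping or current limiting at the pins is still advisable.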