Q.
The typical supply current of the ADL5902 is specified as 73 mA at 25°C and 90 mA at 125°C. What is the worst-case value?
--------------------------------------------------------------------------------------------------------
A.
The attached plot gives a sense of the part-to-part variation in supply current.
These numbers do not line up exactly with the data sheet typicals for 25°C or 125°C; the sample here is a smaller population of devices than the one used during full device characterization. As a guideline, if you are planning to use the device up to 125°C, I would suggest budgeting at least 100 mA for the ADL5902.
Yes, that is correct.
Not really. The ADL5902 is more stable over temperature and is also rms-responding. If you put two different signals into the AD8307 (e.g., two signals with different modulation) at the same rms power level (say, -20 dBm), you get two different output voltages from the AD8307. An rms detector such as the ADL5902 would give you the same answer in both cases.
I assume that you are digitizing the output voltage from the detector, so the conversion from dBm to watts should be easy to implement in software, right?
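For what it's worth, a minimal sketch of that conversion (the function name is illustrative):

```python
def dbm_to_watts(p_dbm: float) -> float:
    """Convert a power level in dBm to watts: P(W) = 10**((P(dBm) - 30) / 10)."""
    return 10 ** ((p_dbm - 30.0) / 10.0)

# Example: -20 dBm corresponds to 10 microwatts.
print(dbm_to_watts(-20.0))  # 1e-05
```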
I read an article about an RF digital wattmeter built with an AD8307, and it seems very good. Is it still available? Is it better than the ADL5902?
Could it perhaps work for my application?
I plotted your data. Your error calculations look correct, but I think their polarity is inverted; that is not important, though. Looking at the data, I believe that some of the error is due to the ADL5902 and some may be due to inaccuracies in your signal source. If you look at the ADL5902 data sheet plots, you can see that between 0 dBm and -40 dBm there is some nonlinearity in the transfer function. For example, in Figure 3 you can see that the error is -0.5 dB at -20 dBm if you calibrate at 0 dBm and -45 dBm. So I think this partially explains the dip you are seeing at around -20 dBm. However, I suspect that the ripple that takes the error at -25 dBm back up to around 0.2 dB may be due to your signal source.
If we put the question of source inaccuracy aside and consider your question about how to improve the error, the best option is to add more calibration points. For example, if you calibrate at, say, 0 dBm, -20 dBm, -45 dBm, and -60 dBm, you will remove some of the natural nonlinearity of the ADL5902. The downside is that more calibration work is required: you need to collect four Pin and four Vout values and calculate and store three slope and three intercept values. Then, when the system is in operation, you need to retrieve the appropriate slope/intercept pair based on which region you are in (see the sketch below).
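Here is a rough sketch of that bookkeeping; the calibration values are placeholders, not measured ADL5902 numbers:

```python
import bisect

# Hypothetical calibration data: (Pin in dBm, Vout in V) measured at four points.
cal_points = [(-60.0, 0.55), (-45.0, 0.95), (-20.0, 1.60), (0.0, 2.10)]

# Precompute one (slope, intercept) pair per region between adjacent points:
# Vout = slope * Pin + intercept, so Pin = (Vout - intercept) / slope.
regions = []
for (p0, v0), (p1, v1) in zip(cal_points, cal_points[1:]):
    slope = (v1 - v0) / (p1 - p0)   # V per dB
    intercept = v0 - slope * p0     # V
    regions.append((v0, slope, intercept))  # v0 = lower Vout bound of the region

def vout_to_dbm(vout: float) -> float:
    """Find the calibration region containing vout and invert that region's line."""
    idx = bisect.bisect_right([r[0] for r in regions], vout) - 1
    idx = max(0, min(idx, len(regions) - 1))  # clamp at the ends of the range
    _, slope, intercept = regions[idx]
    return (vout - intercept) / slope

print(vout_to_dbm(1.60))  # about -20 dBm with these placeholder values
```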
Another option to consider is the ADL5906. This device has less fundamental nonlinearity in its transfer function and is pin-compatible with the ADL5902.
Below you will find the latest values from the ADL5902 RF detector; from these you can see that the error (in dB) was reduced.
However, I will use the RF detector to measure in watts. For example, 1000 watts converts to 60 dBm, and an error of +0.5 dB (60.5 dBm) corresponds to 1122 watts, so the error in watts is too large (122 watts, 12%). Is it possible to reduce the error in dB further so that the percentage error in watts is as small as possible?
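(As a side note on that arithmetic, the percentage error in watts follows directly from the dB error, independent of the power level; a quick check:)

```python
# Fractional error in watts implied by a dB error: 10**(err_dB / 10) - 1.
for err_db in (0.5, 0.25, 0.1):
    pct = (10 ** (err_db / 10.0) - 1.0) * 100.0
    print(f"+{err_db} dB -> +{pct:.1f}% in watts")
# +0.5 dB -> +12.2%, +0.25 dB -> +5.9%, +0.1 dB -> +2.3%
```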
Thank you
Hugo