RMS Detection is Frequency Dependent?

I'm trying to sense a sudden change in impedance using an LTC6268 transimpedance amplifier and an RMS averaging component. So far I've tried the ADL5511 and the ADL5904. Both work as intended, but only at frequencies upward of 10 MHz. I'm driving a 300 kHz signal across my input impedance (~30 kΩ, with a ~50 Ω variation) into the transimpedance amplifier, which then feeds the TruPwr RMS detector.

The thought process is as follows: measure the AC current through the variable impedance, convert it to an AC voltage with the transimpedance amp, then convert that to a DC signal with the RMS detector, and monitor the RMS detector's output for a sudden change.

The problem is that I can't detect the dip with the RMS detector at anything below 10 MHz on either board. I modified the ADL5511 board to lower its minimum operating frequency, with no improvement; neither RMS averaging chip responds to the changing impedance at 300 kHz. Yet when running at 25 MHz the impedance change shows up beautifully, and it becomes even more apparent at higher frequencies. The RMS value should be independent of frequency, so what am I missing here? Why do these boards require such high frequencies to see changes in impedance?
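
For reference, here's a minimal sketch of the signal-chain arithmetic I'm relying on. The drive level and TIA feedback resistance are placeholder assumptions, not my actual circuit values; only the fractional change at the detector output matters:

```python
# Ideal model of the chain: current through the impedance -> TIA -> RMS detector.
# V_DRIVE_RMS and R_FEEDBACK are assumed placeholder values, not measured ones.

V_DRIVE_RMS = 1.0    # assumed source drive (V RMS)
R_FEEDBACK = 10e3    # assumed TIA feedback resistance (ohms)


def rms_detector_out(z_input, v_drive=V_DRIVE_RMS, r_f=R_FEEDBACK):
    """Ideal RMS level at the TIA output for a given input impedance.

    I_rms = V / Z, then V_out_rms = I_rms * R_f. A true RMS detector
    should report this value regardless of the drive frequency.
    """
    i_rms = v_drive / z_input
    return i_rms * r_f


baseline = rms_detector_out(30e3)       # ~30 kohm nominal impedance
dipped = rms_detector_out(30e3 + 50)    # impedance stepped by ~50 ohm

print(f"baseline RMS out:  {baseline * 1e3:.3f} mV")
print(f"dipped RMS out:    {dipped * 1e3:.3f} mV")
print(f"fractional change: {(baseline - dipped) / baseline:.4%}")
```

By that model the expected dip at the detector output is only a fraction of a percent, but it should be the same fraction at 300 kHz as at 25 MHz.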
