
How does the CMRR of difference amplifier ICs change over time?

This is a question I recently asked elsewhere (here is a link), but it has not received a good answer so far. I am reposting it here because I have now settled on the AD8276, and it was suggested that I ask the manufacturer directly.

When designing precision analog circuits, I often come across parts which seem to be more than accurate enough for my purposes, but where the datasheet does not specify how key parameters will change over time.

Right now, I am looking at datasheets for difference amplifiers, and their CMRR looks better than what I could achieve with affordable matched resistor dividers (e.g. the MAX5490). However, the resistor ratios will drift over time, which degrades the CMRR.
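To make the connection between resistor matching and CMRR concrete, here is a small sketch using the standard worst-case approximation for a four-resistor difference amplifier, CMRR ≈ (G + 1) / (4·t), where G is the gain and t the fractional tolerance of each resistor. The numbers below are illustrative, not taken from any specific datasheet:

```python
import math

def worst_case_cmrr_db(gain: float, ratio_tol: float) -> float:
    """Worst-case CMRR (dB) of a four-resistor difference amplifier.

    Uses the common approximation CMRR ~ (G + 1) / (4 * t),
    where t is the fractional tolerance of each resistor.
    """
    return 20 * math.log10((gain + 1) / (4 * ratio_tol))

# Illustrative values at unity gain (G = 1):
print(round(worst_case_cmrr_db(1.0, 0.001), 1))   # 0.1% resistors -> ~54 dB
print(round(worst_case_cmrr_db(1.0, 0.0001), 1))  # 0.01% matching -> ~74 dB
```

This is why discrete 0.1% resistors cannot reach the 80+ dB that trimmed monolithic difference amplifiers specify, and why ratio drift eats directly into CMRR.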

Resistor dividers often give a typical value for this drift in ratio, so I can estimate how long my circuit can go without recalibration. However, while some of the difference amplifiers I looked at specify input offset drift over time, I have not yet seen one that specifies the change in CMRR or in resistor ratio matching over time.

I would assume that the parameters won't drift far beyond their initial limits, and this seems to hold e.g. for the offset voltage of many op amps. On the other hand, I remember seeing 0.1% resistors that were only specified to drift by less than 2% (or something of that magnitude) within a few thousand hours.

Now I'm wondering: is there some rule of thumb for estimating how the CMRR (or similar parameters without an aging specification) will change over time? Can I assume that it will remain above the "minimum" specification even after some years of use? If not, for how many hours of use does the datasheet specification actually remain valid?

If there is no good general answer, how long can I expect the AD8276 to stay above its minimum CMRR of 80 dB / 86 dB (depending on grade)?
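For context on how little drift those specs tolerate, the worst-case approximation CMRR ≈ (G + 1) / (4·t) can be inverted to estimate the largest internal ratio mismatch consistent with each grade's CMRR spec. This is my own back-of-envelope sketch, not a figure from the AD8276 datasheet (the AD8276 is a unity-gain difference amplifier, so G = 1):

```python
def max_ratio_mismatch(gain: float, cmrr_db: float) -> float:
    """Largest fractional resistor mismatch still meeting a CMRR spec,
    inverting the worst-case approximation CMRR ~ (G + 1) / (4 * t)."""
    return (gain + 1) / (4 * 10 ** (cmrr_db / 20))

# The two AD8276 minimum-CMRR grades from the datasheet:
for spec_db in (80, 86):
    print(f"{spec_db} dB -> mismatch <= {max_ratio_mismatch(1.0, spec_db):.2e}")
# 80 dB corresponds to roughly 0.005% mismatch, 86 dB to roughly 0.0025%,
# so even tiny aging of the on-chip resistor ratio matters.
```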