This has been a perennial problem for me.
Yesterday the production engineer on this design (to whom I've now lost 3 out of 3 wrestling matches) called Analog Devices and was told that the AD8302 was designed to be extra-stable over temperature, even though the data sheet's own drift curves suggest the chip should not be used where the output DC level needs to be temperature-stable.
My measurement chain uses the chip's 1.8V reference, a separate 3.0V reference, and the phase output to shift the measurement to 1.5V DC before doing a 12-bit conversion. I had assumed that the two chip outputs track each other somewhat over temperature, and I am only using 0.9 +/- 0.066 volts at the phase output.
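For scale, here is a back-of-envelope sketch of what that output window means in degrees and per ADC code. I'm assuming the 12-bit converter's full scale equals the 3.0V reference and using the AD8302's nominal VPHS slope of 10 mV/degree; neither assumption is stated above, so adjust to your actual ADC configuration.

```python
# Rough resolution check for the measurement chain described above.
# Assumptions (not from the post itself): the 12-bit ADC full scale
# equals the 3.0 V reference, and the AD8302 phase slope is its
# nominal 10 mV/degree.

ADC_BITS = 12
VREF_ADC = 3.0            # V, assumed ADC full scale (REF3330 output)
PHASE_SLOPE = 0.010       # V per degree, AD8302 nominal VPHS slope

lsb = VREF_ADC / 2**ADC_BITS       # one ADC code, in volts
deg_per_lsb = lsb / PHASE_SLOPE    # phase resolution per ADC code

span_v = 0.066                     # +/- swing around 0.9 V at VPHS
span_deg = span_v / PHASE_SLOPE    # +/- phase swing, in degrees

print(f"ADC LSB: {lsb * 1e3:.3f} mV -> {deg_per_lsb:.4f} deg/code")
print(f"0.9 V +/- {span_v} V corresponds to +/- {span_deg:.1f} degrees")
```

So under these assumptions the window is roughly +/- 6.6 degrees around quadrature, and each ADC code is worth about 0.07 degrees, which is the level at which the AD8302's own drift starts to matter.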
My 3.0V reference is a REF3330AIDBZT, and my two phases are generated by an AD9958BCPZ, whose smallest phase adjustment is 0.022 degrees. There are other sources of phase drift (in the transformers, in the gain stage, from humidity on the cables), but these are dwarfed by the drift in the AD8302 itself, and that drift is all right there in its data sheet.
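The 0.022-degree figure follows directly from the AD9958's 14-bit phase offset word, which can be verified in one line:

```python
# The AD9958's phase offset word is 14 bits wide, so the smallest
# phase adjustment is 360 degrees divided into 2**14 steps.
PHASE_BITS = 14
step_deg = 360.0 / 2**PHASE_BITS
print(f"AD9958 phase step: {step_deg:.5f} degrees")
```

That works out to about 0.02197 degrees, which rounds to the 0.022 quoted above.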
Will using a better 1.8V reference make the circuit more temperature-stable?
Also, is there any relationship between the power-supply voltage input and the 1.8V reference output?