I am trying to develop a package which contains an IQ mixer (probably around the 4-8 GHz band) and automatically calibrates it, i.e. corrects the IQ plane for offset, skew, and rotation.
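For concreteness, the correction I have in mind is an affine map on the IQ plane: a DC offset (which sets the LO leakage) plus a 2x2 matrix absorbing the gain imbalance (skew) and quadrature error (rotation). A minimal sketch, with parameter names that are mine rather than from any particular library:

```python
import numpy as np

def correct_iq(iq, offset, phase_err, gain_ratio):
    """Predistort ideal (I, Q) samples into DAC values.

    offset     : (2,) DC offsets in volts, chosen to null LO leakage
    phase_err  : quadrature error in radians (rotation of the IQ plane)
    gain_ratio : Q-channel gain relative to I (skew)
    All parameter names are illustrative.
    """
    iq = np.atleast_2d(iq).T  # shape (2, N)
    # 2x2 predistortion: I passes through, Q is sheared and rescaled
    # to compensate the mixer's quadrature and gain errors.
    m = np.array([
        [1.0, 0.0],
        [-np.tan(phase_err), 1.0 / (gain_ratio * np.cos(phase_err))],
    ])
    return (m @ iq).T + offset

# With zero errors the correction reduces to the identity:
ideal = np.array([[0.3, -0.7], [1.0, 0.0]])
out = correct_iq(ideal, offset=np.zeros(2), phase_err=0.0, gain_ratio=1.0)
```

The offset term is the only part relevant to the leakage nulling below; the matrix handles the image sideband instead.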
The calibration needs to be quite good, ideally >60 dB on/off contrast, so off really needs to be off.
I am currently working on the first part: the offset, which causes LO leakage when the device is nominally off.
We currently do this with a suite of test equipment: a spectrum analyzer measures the leakage, and we adjust the 1 GS/s DAC driving the IQ voltages to minimize the measured power. This works well enough to achieve the desired contrast, but it ties up >$50k worth of test equipment. A one-time calibration works to some extent but is not good enough due to drift.
My plan was to lock a VCO or other cheap tunable frequency source (I'd be interested to hear what might work best for a wideband tunable source) at a 10 MHz offset from the LO, use it to mix the leakage down, bandpass the 10 MHz product, and feed it into a log power detector. From there, a gradient-descent-type algorithm would null the leakage.
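The nulling loop itself is the easy part; here is a sketch of the finite-difference gradient descent I have in mind, with the detector reading stubbed out by a simulated leakage function (`read_leakage_dbm` and the "true null" values are placeholders for the real hardware and would equally stand in for a spectrum analyzer reading). One detail worth noting: the log detector reports dB, but descending on the linear power keeps the gradient well-behaved near the null, whereas the gradient of a dB reading blows up as the null is approached.

```python
import numpy as np

TRUE_NULL = np.array([0.0123, -0.0456])  # made-up "unknown" DAC offsets (V)

def read_leakage_dbm(v):
    """Placeholder for the real reading (log detector after the 10 MHz BPF).
    Simulated: power rises 20 dB/decade with distance from the true null."""
    err = np.linalg.norm(v - TRUE_NULL)
    return 20 * np.log10(err + 1e-9)

def null_offsets(v0=(0.0, 0.0), step=1e-3, rate=0.2, iters=100):
    """Finite-difference gradient descent on the linear leakage power."""
    v = np.array(v0, dtype=float)
    power = lambda x: 10 ** (read_leakage_dbm(x) / 10)  # dB -> linear
    for _ in range(iters):
        grad = np.zeros(2)
        for k in range(2):
            d = np.zeros(2)
            d[k] = step  # dither one DAC channel at a time
            grad[k] = (power(v + d) - power(v - d)) / (2 * step)
        v -= rate * grad
    return v

found = null_offsets()
residual_db = read_leakage_dbm(found)  # far below the starting leakage
```

In practice the step size would be bounded by the DAC resolution and the detector noise floor, and a simplex search (e.g. Nelder-Mead) would be a reasonable alternative if the dithering proves too noisy.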
I've tested a relatively cheap and easy solution for the detection when provided with an appropriate LO, but at the moment that LO is a $30k generator, so I haven't gained much. The next step is to build the oscillator that can lock at a specific offset, or alternatively to develop a better path for measuring the leakage power.
I'm very interested in any ideas on how to improve the approach, or on how to implement a cheap slave oscillator at a fixed offset from a provided reference.