Working on a custom board with several AD5535Bs driving an array of devices under test, with each chip operating in the 0-50 V range (VREF = 1.0 V). With a chip in its power-on-reset state and all channels zeroed, I observe a steady-state DC output of roughly 100-300 mV, varying apparently at random from channel to channel. The offset appears to persist as output voltages are commanded on the DAC channels. I see this both on my own board and on the AD5535B eval board with jumpers set per UG-730 Fig. 2.
Is this an expected level of channel-to-channel offset? I'd assume the behavior of the eval board indicates it's inherent to this high-voltage chip, but it's always possible I've misconfigured the eval board as well as my custom one.
If it's normal, any recommendations for a calibration procedure beyond compiling a brute-force lookup table? Should a per-channel linear correction (gain plus offset) on my inputs be sufficient, or will I need a nonlinear curve fit? This is being driven via SPI from a Xilinx Zynq-7000 with SPI cores in the FPGA fabric; if there are any drop-in logic cores that could help with calibration, I'd love to hear about them.
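For reference, here is the kind of per-channel linear correction I have in mind, as a host-side sketch. The measurement points, offset, and gain values are hypothetical, and I'm assuming the AD5535B's nominal 14-bit code range over the 0-50 V span with VREF = 1.0 V; the actual transfer function should be taken from the datasheet.

```python
# Sketch of a per-channel two-point/least-squares linear calibration.
# All measured values below are hypothetical illustration data.
N_BITS = 14                    # AD5535B DAC resolution
MAX_CODE = (1 << N_BITS) - 1   # 16383

def fit_channel(measured):
    """Least-squares fit v = m*code + b from (code, measured_volts) pairs."""
    n = len(measured)
    sx = sum(c for c, _ in measured)
    sy = sum(v for _, v in measured)
    sxx = sum(c * c for c, _ in measured)
    sxy = sum(c * v for c, v in measured)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b

def volts_to_code(v_target, m, b):
    """Invert the fitted line and clamp to the DAC's valid code range."""
    code = round((v_target - b) / m)
    return max(0, min(MAX_CODE, code))

# Example: a channel showing ~150 mV zero-code offset and slightly low gain.
m, b = fit_channel([(0, 0.150), (8192, 24.90), (16383, 49.65)])
print(volts_to_code(25.0, m, b))
```

If the INL is small relative to the offset/gain errors, a fit like this from two or three measured points per channel may be all that's needed; if not, the same structure extends to a piecewise-linear table.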
If this isn't normal, any recommendations for tracking down the root cause?