I have kind of a strange product to design, which I think starts with a multiplying DAC.
I'm given a DC voltage somewhere in the range of +5 to +10V. When the input is +10V I have to put out a signal which can range from -60mV to +60mV which is scaled from the DC input voltage. In other words, if the DC signal drifts a little, my scaling needs to drift with it, and if it drops to 5V the output would be half of what it was at 10V.
I need resolution on the output of about 1uV. The output change rate can be very slow, 1Hz update rate for example.
My thinking was to scale the input down before going into the DAC since otherwise I'm wasting a lot of the range.
Can you point me in a direction to start looking? Are there DACs that can multiply with such small input voltages?
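A rough count of the levels that spec implies (my own arithmetic, assuming the full +/-60mV span has to be covered in uniform 1uV steps):

```python
# Levels needed to cover -60mV..+60mV in 1uV steps (assumes uniform steps).
import math

span_v = 0.120   # 120mV total span at the 10V input
step_v = 1e-6    # 1uV target resolution

levels = span_v / step_v
bits = math.ceil(math.log2(levels))
print(f"{levels:.0f} levels -> {bits} bits")
```

So on paper the full span at a 10V input calls for about 17 bits of code space, which is worth keeping in mind when picking a DAC.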
Just to be extra clear: What I meant when I said "The code value written to the DAC sets what the output voltage is" is this: When the input voltage is +10VDC, I need to be able to program the DAC to generate any voltage between -60mV and +60mV. When the input voltage is +5VDC, I need to be able to program the DAC to generate any voltage between -30mV and +30mV. Between +10V and +5V input voltage, the output needs to scale, so that at +7.5V input, my possible output range is +/-45mV. The output voltage needs to be a programmable fraction of the input voltage.
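The ideal transfer function described above can be sketched as follows (a hypothetical model, not a specific part; the 16-bit bipolar code and the names are my own assumptions):

```python
# Hypothetical ideal transfer function: the output is a programmable
# fraction of the input. End codes give +/-0.6% of Vin (so +/-60mV at
# a 10V input, +/-30mV at 5V); mid-code gives roughly 0V.

N_BITS = 16                  # assumed DAC resolution
FULL_SCALE = 2 ** N_BITS     # 65536 codes
MAX_FRACTION = 0.006         # 60mV / 10V

def vout(vin, code):
    """Map code 0..65535 onto a fraction of -0.006..+0.006, times Vin."""
    fraction = (2 * code / (FULL_SCALE - 1) - 1) * MAX_FRACTION
    return vin * fraction
```

For example, `vout(10.0, 65535)` gives +60mV, `vout(5.0, 0)` gives -30mV, and at a 7.5V input the end codes give +/-45mV, matching the scaling described above.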
You may want to look at the AD5543 for your application. We were thinking along the same lines about scaling down the input first to maximize the DAC's resolution.
If you have a schematic that we could take a look at, we could provide more suggestions/recommendations as needed.
By the way, what is the end application of your circuit?
Rainer,

Thanks, I took a look at the AD5543 and it looks good.

Do you think it would be better to scale the input voltage down by a factor of 10 (Vref would be between 0.5V and 1V) and then use the MDAC? Is it practical to work with such a small reference voltage? Or would it be better to scale it down after the MDAC?

There are no schematics yet, it's just an idea. Generally speaking, it is a simulator for a ratiometric sensor. Think of a volatile gas sensor, for instance. The instrument supplies a voltage, and this device will return a variable, small fraction of the input voltage, simulating sensor operation.

Thanks for your help,
Lloyd
Generally speaking, you don't want to attenuate input signals, especially if they are very small to begin with. But since the signal is not that small (5V to 10V), placing the attenuation at the reference input should be fine.
The critical thing here is to choose well-matched resistors for the gain/attenuation stages as well as low-error, low-noise amplifiers, so that you won't degrade the DAC's performance and can still achieve the 1uV steps at the output.
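As a back-of-envelope check on that last point (my own arithmetic, assuming a 16-bit MDAC whose full code span is mapped onto the +/-60mV range at a 10V input):

```python
# LSB size if a 16-bit code span covers -60mV..+60mV at a 10V input
# (assumed architecture: input divided down to the reference, then the
# MDAC output attenuated to the +/-60mV range).

N_BITS = 16
SPAN_V = 0.120   # -60mV..+60mV at a 10V input

lsb = SPAN_V / (2 ** N_BITS)
print(f"LSB at 10V input: {lsb * 1e6:.2f} uV")
```

That works out to about 1.83uV per step at the full 10V input; at a 5V input the span halves to +/-30mV, so the step shrinks proportionally to roughly 0.92uV. Whether that is close enough to the 1uV target depends on how tightly the spec has to be met.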
Thanks very much. Issue closed.