This Analog Dialogue paper from 2005 on weigh-scale design is excellent:
The authors highlight the typical signal strength from a load cell:
"A typical load cell’s electrical sensitivity, defined as the ratio of the full-load output to the excitation voltage, is 2 mV/V. With 2-mV/V sensitivity and 5-V excitation, the full-scale output voltage is 10 mV. Often, in order to use the most linear portion of the load cell’s span, only about two-thirds of this range would be used. The full scale output voltage would thus be about 6 mV. The challenge thus posed is to measure small signal changes within this 6-mV full-scale range in such a way as to get the highest achievable performance—not an easy task in the industrial environments where weigh scales would typically be used."
They then give an example of the right kind of chip to use:
"The ADC should also contain a low-noise programmable-gain amplifier (PGA) with high internal gain to magnify the small output signal from the load cell. An integrated PGA can be optimized to give low temperature drift, as compared to a discrete amplifier with external gain resistors. With a discrete configuration, any errors due to temperature drift will get amplified through the gain stage. The AD7799, specifically designed for weigh-scale applications, has an excellent noise specification (27 nV/rt-Hz) and a front-end gain stage with a maximum gain of 128 mV/mV. The load cell can be directly interfaced to this ADC."
My question is: why did Analog Devices choose a gain of 128 for this application?
Given the millivolt-level input (6 mV full scale), a gain of 256, 512, or even 1024 would seem a more natural choice to take full advantage of the ADC's input range.
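To make my reasoning concrete, here is a quick back-of-envelope sketch (my own assumption: a ratiometric setup where the ADC reference equals the 5 V excitation, and the amplified signal must stay within ±Vref, i.e. the AD7799-style input range of ±Vref/gain referred to input):

```python
# Back-of-envelope check: amplified full-scale signal at each candidate
# gain versus an assumed 5 V ratiometric reference.
SENSITIVITY_MV_PER_V = 2.0   # load cell sensitivity, 2 mV/V (from the paper)
EXCITATION_V = 5.0           # excitation voltage (from the paper)
USABLE_FRACTION = 2.0 / 3.0  # linear portion of the span (from the paper)
VREF = 5.0                   # ASSUMED: reference tied to excitation

full_scale_mv = SENSITIVITY_MV_PER_V * EXCITATION_V   # 10 mV at full load
usable_mv = full_scale_mv * USABLE_FRACTION           # ~6.7 mV usable span

for gain in (128, 256, 512, 1024):
    amplified_v = usable_mv / 1000 * gain
    verdict = "fits within" if amplified_v <= VREF else "exceeds"
    print(f"gain {gain:4d}: amplified full scale = {amplified_v:.2f} V, "
          f"{verdict} the {VREF:.0f} V reference")
```

By this arithmetic only 1024 actually runs out of headroom against a 5 V reference, so headroom alone doesn't explain stopping at 128, which is what prompts my question.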
Presumably there are other factors....