I'm thinking through the analog front end for a data acquisition system interfacing to a differential sensor that outputs a very small voltage; the sensor's peak amplitude is on the order of 100-200 uV. My question is more of a systems/architecture question about the difference between analog and digital gain. I see that some people apply digital gain by multiplying the data samples by some value after the sensor signal is sampled. Another approach is to design in an amplifier with enough gain to bring the signal magnitude up so the ADC sees a wider-swing signal to sample.
I'm wondering which approach is better and what trade-offs are associated with each, since digital gain seems easier to do in firmware (just multiplies) than adding more amplifiers and circuitry to the analog front-end signal chain. I'm a little hesitant to go the digital route, though, because the analog gain preserves more dynamic range, which seems beneficial, but I don't have a great understanding of why that is or what the other trade-offs are.
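To put some numbers on my dynamic-range concern, here's a quick Python sketch of what I think happens (the numbers are all made up for illustration: a 12-bit ADC with a 3.3 V full-scale range, a 150 uV peak sine from the sensor, and a gain of 10 000). It just models an ideal quantizer and counts how many distinct ADC codes each path actually exercises:

```python
import math

FS = 3.3                         # assumed ADC full-scale voltage
BITS = 12                        # assumed ADC resolution
LSB = FS / (2 ** BITS)           # ~0.8 mV per code

# Hypothetical sensor signal: 150 uV peak, 1 kHz sine, 1000 samples over 1 ms
signal = [150e-6 * math.sin(2 * math.pi * 1e3 * n / 1e6) for n in range(1000)]

def adc(v):
    """Ideal signed quantizer, clipped to the ADC code range."""
    code = round(v / LSB)
    return max(-(2 ** (BITS - 1)), min(2 ** (BITS - 1) - 1, code))

GAIN = 10_000

# Digital gain: quantize the raw signal first, multiply the codes after.
digital_codes = {adc(v) for v in signal}

# Analog gain: amplify first, then quantize.
analog_codes = {adc(v * GAIN) for v in signal}

print("distinct ADC codes, digital-gain path:", len(digital_codes))
print("distinct ADC codes, analog-gain path:", len(analog_codes))
```

If I've modeled this right, the raw 150 uV signal stays below one LSB, so the digital-gain path sees essentially one code (multiplying afterward just scales that, quantization error and all), while the amplified signal spans hundreds of codes. But this ignores the noise and offset the extra analog stage would add, which is exactly the trade I'm asking about.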
Could someone comment on or shed some light on these two approaches from an architecture perspective?