When digitizing a signal, is there a limit to the number of bits you should choose for the ADC? For example, if I digitize a signal with 12 bits and get a good enough result, what happens if I then digitize it with a higher number of bits (say 24 bits or even more)?
There is a limit beyond which increasing the number of ADC bits (which reduces quantization noise) will not improve performance. It depends on the other noise sources that are present and on whether those sources dominate the quantization noise level. Here are some basic qualitative examples.
In some of our 14-bit high-speed converters, because of the presence of thermal noise, going to 16 bits, with all other factors being the same, would yield only a very minimal performance improvement. In these cases thermal noise, not the bit resolution (quantization noise), is the limiting factor.
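As a rough sketch of why extra bits stop helping: the ideal quantization-limited SNR of an N-bit ADC is 6.02N + 1.76 dB, and uncorrelated noise powers add. The 72 dB thermal-noise floor below is a made-up number for illustration, not a figure for any specific part.

```python
import math

def quantization_snr_db(bits):
    # Ideal full-scale sine-wave SNR for an N-bit ADC: 6.02*N + 1.76 dB
    return 6.02 * bits + 1.76

def combined_snr_db(snr_q_db, snr_thermal_db):
    # Uncorrelated noise powers add: convert dB -> power, sum, convert back
    total_noise = 10 ** (-snr_q_db / 10) + 10 ** (-snr_thermal_db / 10)
    return -10 * math.log10(total_noise)

THERMAL_SNR_DB = 72.0  # hypothetical thermal-noise floor of the signal chain

snr_14 = combined_snr_db(quantization_snr_db(14), THERMAL_SNR_DB)
snr_16 = combined_snr_db(quantization_snr_db(16), THERMAL_SNR_DB)

print(f"14-bit combined SNR: {snr_14:.2f} dB")
print(f"16-bit combined SNR: {snr_16:.2f} dB")
print(f"improvement from 2 extra bits: {snr_16 - snr_14:.2f} dB")
```

With these assumed numbers, two extra bits buy well under 1 dB of combined SNR, because the thermal floor (72 dB) already sits far above the 14-bit quantization floor (about 86 dB).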
Another factor is sample clock jitter. In the presence of sample clock jitter, SNR will reduce as the frequency of the input signal increases. Depending on the magnitude of other noise sources, the frequency of your input signal, and the magnitude of clock jitter, clock jitter could be the limiting factor in performance above and beyond quantization noise.
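The jitter effect described above can be sketched with the standard jitter-limited SNR bound, SNR = -20·log10(2π·f_in·t_j). The 100 fs rms jitter value and the frequency sweep below are illustrative assumptions, not data for any particular converter.

```python
import math

def jitter_limited_snr_db(f_in_hz, jitter_rms_s):
    # SNR ceiling imposed by sample clock jitter: -20*log10(2*pi*f_in*t_j)
    return -20 * math.log10(2 * math.pi * f_in_hz * jitter_rms_s)

JITTER_S = 100e-15  # hypothetical 100 fs rms clock jitter

# SNR ceiling falls as the input frequency rises, independent of bit count
for f_mhz in (10, 100, 250):
    snr = jitter_limited_snr_db(f_mhz * 1e6, JITTER_S)
    print(f"f_in = {f_mhz:4d} MHz -> jitter-limited SNR = {snr:.1f} dB")
```

Note that at 100 MHz this assumed jitter already caps SNR near 84 dB, below the ~86 dB quantization floor of an ideal 14-bit converter, so adding bits would not help at that input frequency.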
There are resources on analog.com that go into further depth on these topics.
Do you mean digitizing the signal twice? Why not choose the higher resolution directly? The choice of ADC resolution depends on the application and the type of data you want to capture. SAR ADCs are best suited to certain applications, and the same goes for sigma-delta ADCs. May I ask what application you want to build?