Hello,
I am trying to drive an ADC that has a high-impedance differential input by using two single-ended ADL5535 amplifiers, one per leg of the differential input. A few questions:
1. Is it wise to use two amplifiers in this manner, or should I stick with either a single differential amplifier or a 1:1 transformer to convert from single-ended to differential?
2. Is it better to implement the anti-aliasing filtering before or after the single-ended-to-differential conversion?
3. I am trying to drive a 16-bit ADC from TI, the ADC16DX370EVM, and I believe I may have a common-mode voltage issue between the two amplifiers. I have read that it is wise to place the amplifier output as close as possible to the ADC input, with the filtering in between. For testing purposes I am currently making the connection with a 1-inch jumper soldered to the board: a 30-gauge twisted-pair cable, plus a third wire that ties the ground of the ADC board to the ground of my amplifier test board. Could this also be hurting performance, or will the degradation be minimal for testing purposes? (See the sanity-check sketch just after this list for how I am estimating the common-mode mismatch.)
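For what it's worth, here is the rough back-of-the-envelope check (Python) I have been using for questions 2 and 3. All component values, the amplifier output DC levels, and the ADC's required input common-mode voltage are placeholders I made up for illustration, not datasheet numbers; the real figures would come from the ADL5535 and ADC16DX370 datasheets.

```python
import math

# --- Question 3: common-mode at the ADC inputs ---------------------------
# DC level each single-ended amplifier output presents to the ADC input
# (after any coupling/bias network). Placeholder values, not measured:
v_out_p = 2.5   # V, amplifier A output DC level
v_out_n = 2.4   # V, amplifier B output DC level

v_cm_seen   = (v_out_p + v_out_n) / 2   # common-mode voltage the ADC sees
v_cm_offset = abs(v_out_p - v_out_n)    # differential DC offset between legs

V_CM_REQUIRED  = 0.95   # V, placeholder for the ADC's required input VCM
V_CM_TOLERANCE = 0.05   # V, placeholder allowed deviation

print(f"ADC sees VCM = {v_cm_seen:.3f} V, "
      f"differential offset = {v_cm_offset * 1e3:.0f} mV")
if abs(v_cm_seen - V_CM_REQUIRED) > V_CM_TOLERANCE:
    print("-> VCM out of range: re-bias (e.g. AC-couple to the ADC's VCM "
          "reference) or level-shift the amplifier outputs.")

# --- Question 2: first-order RC anti-alias filter check -------------------
R = 50.0      # ohms, placeholder series resistance per leg
C = 47e-12    # farads, placeholder shunt capacitance
f_c = 1.0 / (2 * math.pi * R * C)   # -3 dB cutoff of a single-pole RC

f_s      = 370e6        # Hz, sample rate implied by the ADC16DX370 part number
f_alias  = f_s - 10e6   # Hz, example out-of-band tone that folds to 10 MHz
atten_db = 10 * math.log10(1 + (f_alias / f_c) ** 2)   # single-pole roll-off

print(f"RC cutoff = {f_c / 1e6:.1f} MHz, "
      f"attenuation at {f_alias / 1e6:.0f} MHz = {atten_db:.1f} dB")
```

If the real numbers show a common-mode mismatch, I assume AC coupling with a bias network tied to the ADC's common-mode reference would be the usual fix, but I would appreciate confirmation on that as well.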
Any considerations, feedback, or notes that would help push me along would be greatly appreciated.