I have searched the internet for hours upon hours to find answers to my questions, with no luck. Below are two circuits that are part of a project I am working on. These are the interface circuits between my system and a balanced-signal TDM box with 600 Ohm termination (an old-school voice-frequency channel to T1 multiplexer). The problem is that when I run voice-grade tests through the system I get poor results, and I am almost 100% sure an impedance mismatch is causing the problems. I don't have much experience with telephony interface circuits (or with impedance matching between op amps and transformers), so I am seeking help.
Figure 1: The input side of the circuit
Figure 2: The output side of the circuit
These circuits were derived from the ADAU1442-EZ board, which carries the DSP we are using for this application.
On the input side of the circuit a balanced signal comes into the transformer, which has a 1:1 ratio. I have the 680 Ohm resistor to ground, but I think I need to put a 2.16 uF capacitor in series with that 680 Ohm resistor to match the line impedance. Is that right?
On the output side I think I should remove the 22k Ohm resistor along with the 3.3 nF and 4.7 uF caps (replacing the 4.7 uF cap with a 2.16 uF), again to match the line impedance, but I am not 100% sure about this.
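To put rough numbers on that idea, here is a minimal Python sketch (my own, not from the schematics; it assumes the line really is a flat resistive 600 Ohm, whereas real lines are often complex) of what the proposed 680 Ohm + 2.16 uF series termination looks like across the voice band. At the low end the capacitor adds a large reactive term, so the termination is far from 600 Ohm there:

```python
import math

# Impedance of the proposed 680-ohm + 2.16 uF series termination
# across the voice band (component values from the question above).
def series_rc_impedance(r_ohm, c_farad, f_hz):
    return complex(r_ohm, -1.0 / (2 * math.pi * f_hz * c_farad))

for f in (304, 1004, 3004):
    z = series_rc_impedance(680, 2.16e-6, f)
    print("%5d Hz: Z = %4.0f %+5.0fj ohm, |Z| = %4.0f ohm" % (f, z.real, z.imag, abs(z)))
```

At 3004 Hz the capacitor is nearly transparent and the network looks like the bare 680 Ohm resistor; at 304 Hz it contributes a couple hundred ohms of reactance.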
Is this problem a lot bigger than I think? Part of our circuit also has a built-in telephone hybrid (dual-transformer design) which, when activated, bypasses the 1:1 isolation transformers (not shown in the figures above, but the op-amp circuits are exactly the same). That is where I see the worst voice-grade test results (with the hybrid in the circuit).
I appreciate any feedback provided.
The intermodulation distortion test with a 23-tone signal at -15 dBm is returning a 3rd-order result of ~+50 dB, which is only 1 dB above the failure threshold; we want it to be >+60 dB. The 2nd-order result is also low (~+55 dB). That is the main test I am concerned with. The results also fluctuate quite a bit (+/-5 dB, where we usually see +/-2 or 3 dB), which I think is a "flanging" effect caused by the bad impedance matching.
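Not the actual 23-tone test, but a toy model (hypothetical tone frequencies and a made-up cubic nonlinearity, nothing from the real hardware) showing where 3rd-order products land relative to the tones:

```python
import numpy as np

fs = 48000            # sample rate (Hz)
n = fs                # 1 second of samples -> 1 Hz FFT bin spacing
t = np.arange(n) / fs
f1, f2 = 1000, 1100   # two in-band tones (hypothetical choices)
x = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

# A weak cubic nonlinearity (made up here) creates 3rd-order
# intermod products at 2*f1 - f2 and 2*f2 - f1.
y = x + 1e-3 * x ** 3

spec = np.abs(np.fft.rfft(y)) / (n / 2)  # amplitude spectrum

def level_db(f_hz):
    return 20 * np.log10(spec[int(f_hz)])

print("tone @ %4d Hz: %6.1f dB" % (f1, level_db(f1)))
print("IM3  @ %4d Hz: %6.1f dB" % (2 * f1 - f2, level_db(2 * f1 - f2)))
```

The point is that 3rd-order products fall right back into the voice band (here at 900 Hz and 1200 Hz), which is why they dominate a narrowband intermod test.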
The board is a 4-layer board with the inner two layers being power and ground. I worked on the circuit design, but we had a vendor do the actual PCB layout and production (prototypes). The primary frequencies we are interested in are 304 Hz to 3004 Hz at -13 dBm, with the exception of the 23-tone test.
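For anyone more comfortable in volts than dBm, the test levels above convert as follows (0 dBm = 1 mW; here into the 600 Ohm line impedance):

```python
import math

# Convert a dBm power level to RMS volts into a given impedance (0 dBm = 1 mW).
def dbm_to_vrms(dbm, r_ohm=600.0):
    p_watts = 1e-3 * 10.0 ** (dbm / 10.0)
    return math.sqrt(p_watts * r_ohm)

for level in (0, -13, -15):
    print("%+d dBm into 600 ohm = %.3f Vrms" % (level, dbm_to_vrms(level)))
```

So the -13 dBm test tones are only about 0.17 Vrms on the line.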
As for the test setup, we run a TIMS test set to a 66 block, which connects to the card/cage via a 60-pin telco cable. We've never quantified our circuits by THD before, but maybe I should look into it...
On the card itself, after passing through the op-amp circuits, the signal goes to an AD1938 codec and then to the ADAU1442 for processing.
Could the internal filters of the DSP be causing the bad harmonics in the Intermod distortion test?
Thanks for responding to my post.
" The problem is when I run voice grade tests through the system I am getting poor results" What does this
mean?? 10% THD at 1 kHz?? 0.01% THD at 20 kHz?? 2.3 mW of audio into the line?
Quantify your test setup and results.
I am assuming this is your own pc board?
How many layers?
I'm more of an amps-and-volts guy, so I have trouble thinking in dBm. As for THD, I can relate to that; I simply used it as an example.
A couple of comments:
-- For maximum power transfer, you want the TX/RX impedances matched. For maximum voltage transfer, you want the source impedance to be zero and the receive impedance to be infinite. I don't see how a mismatch at 3 kHz would affect IMD.
-- Most rail-to-rail inputs are a P-pair in parallel with an N-pair, so they have crossover distortion. See figure 10 in the rev. N data sheet; also see MT-035.
-- Sometimes the groups at semiconductor companies don't talk to each other enough. It looks like the DSP guys didn't check with the op-amp guys. For precision audio I would not use a quad. See the article(s).
Also, in the above article, see figures 10 and 11. A quarter of an inch of pc board trace made a 50 dB difference in crosstalk. If you had a vendor do the pc board layout, you may have several problems.
If you want the ultimate in performance, especially below 100 Hz, bipolar-input op amps have a much lower 1/f corner.
I would also allow for +/-5V or +/-15V if necessary.
-- If you google "capacitor distortion", you will find some interesting material.
So on a new board, you may have one big problem or ten little ones.
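The power-transfer point above is easy to put numbers on. A small sketch (illustrative values only, not from the poster's circuit) showing that even a 2:1 resistive mismatch costs only about half a dB of power:

```python
# Fraction of the maximum available source power that reaches the load.
# Available power is V^2/(4*Rs); delivered power is V^2*Rl/(Rs+Rl)^2.
def power_transfer_fraction(r_source, r_load):
    return 4.0 * r_source * r_load / (r_source + r_load) ** 2

for rl in (300, 600, 1200):
    frac = power_transfer_fraction(600, rl)
    print("Rl = %4d ohm: %.3f of available power" % (rl, frac))
```

Matched 600/600 gives 1.0; both 300 and 1200 Ohm loads give 8/9, i.e. about 0.5 dB down, which supports the comment that a modest mismatch at 3 kHz should not by itself explain the IMD results.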
The DSP, if running a pass-through program, should be essentially transparent from an analog-to-analog performance perspective.
If you are actively implementing filters during your test, however, then it's possible that they are having a negative impact on the performance, depending on their configuration. Did you try running the test with the DSP program bypassed?
Thanks for posting links; I will spend some time reading through those and looking further into possibly using different op amps.
Thank you for the reply. I created a pass-through program and uploaded it to the DSP; the 2nd-order intermod seemed to improve, but the 3rd-order intermod was the same, if not a little worse... I also figured out how to do an echo return loss measurement on our TIMS: with both ends terminated 2-wire at 600 Ohms I get ~ -20 dB ERL, but if I unplug the far end it drops to ~ -10 dB (which I expected to be worse). I found a document online with the schematics for a radio-to-telephone interface circuit, and I plan on taking bits and pieces from it and adapting them into my design to see if it improves anything...
If you are curious as to what I found: http://www.sunairelectronics.com/web/workspace/uploads/rtu-200-1326998987.pdf
Again, thank you for your feedback.
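As a rough sanity check on those ERL numbers (my own sketch, assuming a purely resistive 600 Ohm reference; transformers and any line build-out will shift the real figures): the reflection against the reference impedance sets the return loss, and an open far end drives it toward 0 dB, which matches the direction of the measurement above.

```python
import math

# Return loss of a load measured against a 600-ohm reference; higher is better.
def return_loss_db(z_load, z_ref=600.0):
    gamma = abs((z_load - z_ref) / (z_load + z_ref))
    return -20 * math.log10(gamma) if gamma > 0 else float("inf")

print(return_loss_db(550))   # slight mismatch: good return loss
print(return_loss_db(1e9))   # open far end: reflection ~1, RL near 0 dB
```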