We need your help eliminating or cleaning up subcarrier leakage on the composite output. On our board, with a 100% flat-field NTSC SD source, I measured about 14 mVpp of subcarrier (3.58 MHz) leakage, which in turn degrades the SNR.
All register settings follow the ADI recommended scripts.
We also did an interesting experiment:
We injected 80h on the input of the ADV7343 (simulating monochrome, which is what we see with an SDI input). The output was beautiful, with no subcarrier leakage. We then offset the input by 1 LSB to 81h, which produced 14 mV of subcarrier (3.58 MHz) leakage. To verify our findings, we offset the input by 2 LSBs and, sure enough, got 28 mV of leakage.
The question is:
Why do we get so much leakage from an error of one LSB on the chroma bus? We would expect around 3 to 4 mV per count.
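For reference, here is the back-of-envelope arithmetic behind that expectation. The full-scale voltage and bus width below are assumptions for illustration (an 8-bit chroma bus, suggested by 80h being mid-scale, and a nominal 1.3 Vpp composite swing), not ADV7343 datasheet values:

```python
# Rough per-LSB step size on the composite output.
# ASSUMPTIONS: 8-bit chroma bus, ~1.3 Vpp composite full-scale.
FULL_SCALE_VPP = 1.3   # assumed composite full-scale, volts
BITS = 8               # assumed chroma bus width

step_v = FULL_SCALE_VPP / (2 ** BITS)   # volts per input code, ~5.1 mV
measured_vpp = 14e-3                    # measured leakage at a 1-LSB offset

print(f"expected per-LSB step: {step_v * 1e3:.2f} mV")
print(f"measured / expected:   {measured_vpp / step_v:.1f}x")
```

Under these assumptions a single code step is on the order of 5 mV, so the measured 14 mV is roughly 3x what simple DAC-step scaling would predict.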
We have the ADV7343 evaluation board and do not see this leakage issue there, so I'm looking into every avenue of our design:
- The first thing I notice is that pins 45 (COMP1) and 35 (COMP2) differ between the eval board and our card. The eval board measures 2 V and 3.3 V respectively, while our card measures 2.5 V and 2 V. What would make them differ? Is it because we have different RSET values (all of our DACs drive into 300 ohm loads)?
- Our chip markings show "-3" after the part number; the eval board's do not.
- We have the test pins (2, 3, 14, 15, 51, 52) pulled down to ground via 100 ohm resistors. On the eval board they appear to be driven by the 7403.
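On the RSET question in the first bullet: for a generic current-steering DAC, the full-scale current scales as 1/RSET, and the output swing is that current into the load, so a different RSET/load combination changes the output swing (and plausibly internal bias node voltages such as COMP). This is a hedged sketch of that generic relationship; the constant K below is hypothetical and is not the ADV7343's actual scaling factor:

```python
# Generic current-output DAC model: I_fs = K / RSET, Vout = I_fs * Rload.
# K is a HYPOTHETICAL constant for illustration, not a datasheet value.
K = 4.8  # hypothetical volts (so that I_fs = K / RSET)

def full_scale_vpp(rset_ohms, rload_ohms):
    """Full-scale output swing for a current-steering DAC into rload."""
    i_fs = K / rset_ohms          # full-scale current, amps
    return i_fs * rload_ohms      # peak output swing, volts

# Same RSET, different loads: the swing (and internal operating point)
# changes, which could explain different COMP pin voltages.
print(full_scale_vpp(4120, 300))   # 300 ohm load, as on our card
print(full_scale_vpp(4120, 37.5))  # doubly terminated 75 ohm, typical eval load
```

The point is only that RSET and the load resistance together set the DAC's operating point, so it would be worth confirming both against the eval board schematic rather than RSET alone.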
Any guidance on how to determine the cause of this issue would be appreciated.