We are experiencing an issue with the ADV7850 where the output video signal can freeze if the input CVBS signal from an analog TV tuner is noisy. This doesn't happen when the TBC feature is enabled: with TBC on, playback isn't interrupted even by a very noisy signal. However, we have had to disable TBC because it introduces an artifact with PAL inputs. Setting that aside, here I want to focus on finding a reliable way to measure CVBS signal quality.
In the SDP block, sdp_synctip_noise and burst_power appear to be separate indicators of signal quality. I'm having more success with burst_power measurements, but the readings are not very consistent. Could you provide guidance or best practices on using these metrics, or others, to measure the quality of the input composite signal? Our goal is to selectively enable TBC only when signal quality degrades below a certain threshold.
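For reference, this is roughly the kind of gating logic we have in mind, sketched here in Python for clarity rather than as driver code. Everything hardware-specific is stubbed out: the raw burst_power values would come from the ADV7850 over I2C, and the window size and thresholds below are placeholder assumptions, not values from the datasheet. The sketch smooths the inconsistent readings with a moving average and applies hysteresis so TBC doesn't toggle rapidly near the threshold.

```python
from collections import deque

# Assumed tuning parameters -- not ADV7850 datasheet values.
WINDOW = 16            # number of readings to average
ENABLE_BELOW = 60      # enable TBC when smoothed quality drops below this
DISABLE_ABOVE = 80     # disable TBC only after quality recovers past this

class TbcController:
    """Decide TBC on/off from a stream of raw burst_power readings."""

    def __init__(self):
        self.samples = deque(maxlen=WINDOW)
        self.tbc_on = False

    def update(self, burst_power):
        """Feed one raw reading; return the desired TBC state."""
        self.samples.append(burst_power)
        if len(self.samples) < WINDOW:
            return self.tbc_on          # not enough data to decide yet
        avg = sum(self.samples) / len(self.samples)
        if not self.tbc_on and avg < ENABLE_BELOW:
            self.tbc_on = True          # signal degraded: enable TBC
        elif self.tbc_on and avg > DISABLE_ABOVE:
            self.tbc_on = False         # signal recovered: disable TBC
        return self.tbc_on
```

The two-threshold hysteresis band is the main point: with a single threshold, the noisy readings we see would cause TBC to flap on and off around it, so TBC is only disabled again once the smoothed quality clears a higher mark.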