I'm working with the ADV7441A, and I'm finding that the nominal gain needed on the analog input path is ~x2.
This can also be seen in the hardware datasheet (HW Rev J): Figure 41 (page 141) and Figure 46 (page 155).
When the AGC is used and the SSPD does not detect syncs, the gain is set to x1.96 or x2.29, depending on the OP_656_RANGE selection.
Why is this not 0.859 or 1.164, as it is set for an HDMI input?
This implies that the analog samples are multiplied by ~x2 prior to the CSC, and I'm not sure why that is needed.
Is this simply to adjust the fixed-point value so it has the same precision later in the pipeline, or is the ADC attenuating the video by x0.5?
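For reference, here is the quick arithmetic behind the "~x2" observation. The 0.859/1.164 HDMI gains look like the standard limited/full range scale factors (219/255 and 255/219); the pairing of each analog gain with a particular OP_656_RANGE setting is my assumption, not from the datasheet:

```python
# Gains quoted above; which analog gain pairs with which range
# setting is an assumption for the sake of the comparison.
hdmi_limited, hdmi_full = 0.859, 1.164   # HDMI input path gains
analog_lo, analog_hi = 1.96, 2.29        # AGC defaults when SSPD sees no syncs

# The HDMI gains match the standard limited<->full range factors:
print(round(219 / 255, 3))               # -> 0.859
print(round(255 / 219, 3))               # -> 1.164

# The analog-path gains are roughly twice the HDMI-path gains,
# which is the ~x2 factor the question is about:
print(round(analog_lo / hdmi_limited, 2))  # -> 2.28
print(round(analog_hi / hdmi_full, 2))     # -> 1.97
```

So whichever range is selected, the analog path ends up roughly a factor of 2 above the corresponding HDMI gain, consistent with either a deliberate x0.5 attenuation ahead of the ADC or a fixed-point headroom adjustment.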