Hi, we have a product with one DVI port connected to the HDMI input and Analogue input of an ADV7604. HDMI works perfectly, and I'm now trying to get analogue to work.
We have had analogue working if we know what the incoming resolution / frame rate is. So my current task is to figure out what the incoming video is.
So far I have got the SSPD and the STDI modules working correctly and telling me whether the incoming signal is analogue or digital (or potentially both). What I'm stuck on is converting the STDI outputs into the values for prim_mode and vid_std.
My calculation from FCL gives me the correct frame rate, and I can use block_len to get what I think is the number of lines including blanking:
frame rate = XTAL / (FCL * 256)
lines per frame = LCL, or = (FCL * 256 * 8) / block_len
So from this I know the frame rate, and can estimate (is there a better way?) the number of lines per frame. However, all that tells me is ?x768@60, which leaves three candidates: XGA (vid_std = 0x0C), WXGA (vid_std = 0x10), and WXGA (vid_std = 0x12). How do I make that decision?
On top of all this we want to support 50 Hz, which none of the prim_mode = GR settings support, so we'd have to use auto-graphics mode. And in auto-graphics mode there are potentially infinitely many resolutions that could give the same STDI results.
So all in all, my question boils down to: how do you determine the resolution of an incoming analogue signal?