I am using the IMAGEON HDMI Kit reference design for Xilinx, with an ML605 (Virtex-6) board.
For the time being, I wish to grab the HDMI output from my PC, running at 1920 x 1080.
On the back-end, I wish to grab 16-bit, SDR, 4:2:2, BT.656 with embedded syncs, as this is the format shown in the IMAGEON FMC card schematics.
My question: my video card can only run at 'high-color' (16-bit) or 'true-color' (32-bit). The ADV7611 looks like it can take in several different color depths, adhering to the HDMI v1.4 standard, which allows 30/36/48-bit deep color. So can I run with the 32-bit setting on my graphics card, or do I have to set it to 16-bit?
One other newbie question I have:
The output of the ADV7611 can be set to various bit depths on the back-end, such as 16-bit. I assume that over I2C I just program that back-end setting to 16 bits and the chip will do the conversion for me; is that correct?
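To check my understanding, I'd expect the back-end configuration to boil down to a handful of I2C register writes to the ADV7611's IO map. Here's a rough sketch of what I mean; the device address (0x4C 7-bit, i.e. 0x98 in 8-bit notation) and the register/value pair (IO map 0x03 selecting a 16-bit SDR ITU-656 mode) are my assumptions from reading around, and I'd verify them against the datasheet and Analog Devices' recommended register scripts before using them:

```python
# Sketch only: build the list of I2C writes that would (I believe) put the
# ADV7611 back-end into 16-bit SDR 4:2:2 ITU-656 output mode.
# All addresses/values below are ASSUMPTIONS to verify against the
# ADV7611 datasheet and the AD recommended settings scripts.

IO_MAP_7BIT_ADDR = 0x4C  # assumed IO map address (0x98 in 8-bit notation)

# (register, value) pairs -- meanings per my (unverified) reading
OUTPUT_16BIT_SDR_656 = [
    (0x03, 0x80),  # assumed: IO map reg 0x03 = output mode, 16-bit SDR ITU-656
]

def build_writes(dev_addr, pairs):
    """Expand (reg, value) pairs into (dev_addr, reg, value) I2C writes."""
    return [(dev_addr, reg, val) for reg, val in pairs]

if __name__ == "__main__":
    for dev, reg, val in build_writes(IO_MAP_7BIT_ADDR, OUTPUT_16BIT_SDR_656):
        print(f"i2c write: dev=0x{dev:02X} reg=0x{reg:02X} val=0x{val:02X}")
```

In other words, am I right that the format conversion itself happens inside the chip, and the I2C traffic is only there to select which output mode it uses?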
Are there any details I'm missing, or misunderstandings anyone can correct?
Thanks for the help,