I'm in the preliminary stage of figuring out the best way to capture video from an HDMI camera and bring it into the board we use, which has an 8-bit video input port (TI AM3517 processor). The AM3517 supports a maximum 75 MHz PCLK and up to 16-bit video, but unfortunately we can only use 8 bits because the other pins are taken by Ethernet. So we are limited to 8-bit at 75 MHz.
We're currently using an NTSC decoder to capture BT.656 8-bit 720x480 video, which works great, but it's not HD. We'd like to get at least 720p into this board. My question is:
Can we use the ADV7612 to get an HD resolution as 8-bit without DDR? 720p60 has a 74.25 MHz pixel clock, and over an 8-bit bus the interleaved luma/chroma byte rate would be double that, which is far too fast. The AM3517 cannot sample on both the rising and falling edges of the pixel clock, so DDR won't work either. My first thought is to find a camera that can output 720p30 with a 37.125 MHz pixel clock, and then configure the ADV7612 to output a BT.656-like format at 74.25 MHz (pixels still arrive at 37.125 MHz, but the clock runs at 74.25 MHz so that both the luma and chroma bytes fit on the 8-bit bus).
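To sanity-check the numbers, here's a quick sketch of the clock budget. The only assumption is BT.656-style 4:2:2 over an 8-bit bus, i.e. two byte-clocks per pixel (Cb Y Cr Y ...):

```python
BUS_LIMIT_HZ = 75e6  # AM3517 maximum PCLK

def byte_clock_hz(pixel_clock_hz):
    # BT.656-style 4:2:2 on an 8-bit bus: two bytes per pixel (Cb Y Cr Y ...)
    return 2 * pixel_clock_hz

def fits(pixel_clock_hz):
    return byte_clock_hz(pixel_clock_hz) <= BUS_LIMIT_HZ

for name, pclk in [("720p60", 74.25e6),
                   ("720p30, half-rate timing", 37.125e6)]:
    verdict = "fits" if fits(pclk) else "too fast"
    print(f"{name}: {byte_clock_hz(pclk)/1e6:.2f} MHz byte clock -> {verdict}")
```

So 720p30 at 37.125 MHz lands at exactly 74.25 MHz on the bus, just under the 75 MHz limit, while 720p60 would need 148.5 MHz.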
Is the ADV7612 capable of converting a 720p30 HDMI input signal into a BT.656-like 8-bit signal? Will I have trouble finding HDMI cameras that will actually output 720p30 if my EDID requests it? Any other suggestions for getting an HD video signal of at least 30 fps into the 8-bit port without DDR? BGA parts are currently outside our assembly capabilities, so a solution that makes use of something like the ADV7612 is preferred.
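For the EDID side of the question: to steer a camera toward 720p30 you'd advertise that mode in a detailed timing descriptor in the sink EDID. Below is a rough sketch of building the 18-byte DTD. The 1650x750 totals are the standard 720p60 timing run at half clock, which is an assumption about what a given camera would accept; note also that EDID stores the pixel clock in 10 kHz units, so 37.125 MHz can't be represented exactly and gets rounded:

```python
import struct

def dtd(pclk_hz, ha, hbl, hfp, hsw, va, vbl, vfp, vsw):
    """Build an 18-byte EDID Detailed Timing Descriptor (image size
    and border bytes left at zero for this sketch)."""
    pc = round(pclk_hz / 10_000)  # pixel clock in 10 kHz units (rounds 37.125 MHz)
    d = bytearray(18)
    d[0:2] = struct.pack("<H", pc)          # pixel clock, little-endian
    d[2] = ha & 0xFF                        # H active, low 8 bits
    d[3] = hbl & 0xFF                       # H blanking, low 8 bits
    d[4] = ((ha >> 8) << 4) | (hbl >> 8)    # high nibbles of H active/blank
    d[5] = va & 0xFF                        # V active, low 8 bits
    d[6] = vbl & 0xFF                       # V blanking, low 8 bits
    d[7] = ((va >> 8) << 4) | (vbl >> 8)    # high nibbles of V active/blank
    d[8] = hfp & 0xFF                       # H sync offset (front porch)
    d[9] = hsw & 0xFF                       # H sync pulse width
    d[10] = ((vfp & 0xF) << 4) | (vsw & 0xF)
    d[11] = ((hfp >> 8) << 6) | ((hsw >> 8) << 4) | ((vfp >> 4) << 2) | (vsw >> 4)
    d[17] = 0x1E  # digital separate sync, +HSync, +VSync (as used by 720p)
    return bytes(d)

# 720p geometry with standard 1650x750 totals at 37.125 MHz -> 30 fps
desc = dtd(37.125e6, 1280, 370, 110, 40, 720, 30, 5, 5)
```

Whether a camera honors a non-standard half-clock timing is exactly the open question, though: many sources only pick from their fixed CEA mode list regardless of what the EDID asks for.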
Thanks for any input/suggestions/advice, and happy new year!