We are using an ADV212 in 32-bit HIPI mode for encoding.
It appears that the ADV212 requires every input burst to be complete (i.e., to meet the configured burst size), even when the last burst of a frame carries less than a full burst of data. Our camera produces 1000x1000-pixel frames, so the frame size is neither a power of two nor a multiple of 128 dwords. What is the proper way to handle this while still getting valid pixel data?
We could zero-pad the final transfer out to a complete 128-dword burst, but my concern is whether the chip consumes every pixel it receives for an image (i.e., whether the padding pixels would be treated as the first pixels of the next frame). That would be bad for us, since we really need the chip to finish processing a frame before data for the next frame becomes available.
The best outcome for us would be if the chip cannot carry multiple images in one burst and simply discards the extra (invalid) padding bytes, or if it would accept a shorter final burst, knowing that the image ends in the middle of a burst.
Can an engineer explain what the ADV212 actually expects, and what it will do, in this situation?