I am using the ADSP-BF561 EZKit
I would greatly appreciate any assistance I can get. I am working on an ongoing hobbyist project: we have remote analog video working wonderfully and would now like to convert the analog video to digital to maximize range and video clarity.
My first goal is to prove that the Blackfin 561 can sample a frame of video on Core A and encode it to JPEG, while Core B decodes the JPEG buffer and sends a frame of video out the PPI to be displayed. I understand this approach is not optimized for bandwidth, but I need to get as close to real-time video (no perceptible delay) as possible. Reaching this first goal would demonstrate that the Blackfin 561 is capable of the task, and I believe it is. A future design could be adapted to transmit packetized video via 802.11n, but only if this first stage is possible.
I downloaded the JPEG_Encoder Rel 3.1.1 and have been working with it for weeks now in my spare time with no progress. The example code and documentation have left me confused about how the library is designed to function. It seems very straightforward, yet I still cannot get it to work; that is, I cannot encode a frame of sampled video.
I know there are other possible solutions for this project, but I prefer using the Blackfin if I can make it work. H.264 has been a consideration, but I cannot find out how many frames of delay it introduces. I am willing to sacrifice bandwidth to minimize the delay introduced by the encoding/decoding process (2-4 frames behind is tolerable).
Does anyone have any working sample code that you wouldn't mind sharing with me? I have a strong desire to learn, which is the reason I participate in hobby projects.