Good day, everyone!
I'm interested in interfacing a camera to the BF537 EZ-KIT Lite. For this, I have an AV EZ-Extender and an OmniVision OV7720 camera module. Analog Devices provides a sample program named ezVideo Sensor that demonstrates how to capture an image with the OV6630 sensor (and two other sensors). In main.c, the settings for each sensor are defined, for example:
#define POL_C 0x4000
#define POL_S 0x0000
#define PIXEL_PER_LINE 352 // all = 566; act = 352
#define LINES_PER_FRAME 288 // all = 313; act = 288
#define DATALEN 0x3800
#define DataPacking 0x0000
#define PPICOUNT 351 // all = 0; act = 351
Since I have an OV7720 camera module, I tried changing some of the settings (PIXEL_PER_LINE and LINES_PER_FRAME). However, I don't know where the values for POL_C, POL_S, DATALEN, DataPacking, and PPICOUNT come from or what they control. I've built the program and loaded it onto the board, but all I get is a solid-colored image (sometimes green, sometimes black, depending on the pixel format selected in the image configuration window under Debug Windows > Image Viewer).
To summarize, can anyone help me with the following questions?
1.) Which parts of the camera's datasheet are these settings based on, and what do they control?
2.) The datasheet (attached) says the output format is YUV/YCbCr 4:2:2 or RGB565/555/444. How do you choose the output format? (Is it based on the output timing diagram?) Also, YUV/YCbCr 4:2:2 is not available in the Image Viewer debug window, so I'm wondering whether I can use that format at all...
3.) I'm a bit confused by the code. I see ezErrorCheck() called everywhere, but the function body appears to be empty, and I'm not sure what it's for. Can anybody guide me through the code?
Also attached is the sample code provided with the VisualDSP++ installation.