Now that I've got the AD9649 ADC running nicely with IIO, I arrive at the next challenge.
The data is essentially video: a small logic block processes the raw incoming samples, detects the frame-start sequence, and cuts away the "gutters" in the signal.
What remains of the data are frames of constant size.
The problem I now have: when I start the transfer for the first time and make the IIO buffers just large enough for a single frame (about 200k), I get exactly one frame per buffer. But when the system underruns, or when I stop and restart, the residual data in the FIFOs causes the data to no longer align, and synchronization with the frames is lost.
The underrun case can be solved by flushing the pipeline and starting over. But it's impossible to re-align, because when the DMA restarts it just "jumps in" at a random point in the data stream.
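For context, here is the kind of software-side re-alignment I'd like to avoid needing. It only works if the logic block could be made to pass the frame-start sequence through to the host (which it currently doesn't, since it strips it). The marker pattern and its length below are invented placeholders, not anything the AD9649 or my logic block actually emits:

```c
#include <stddef.h>
#include <stdint.h>

/* Placeholder frame-start pattern -- purely illustrative. */
#define FRAME_MARKER_LEN 4
static const uint16_t frame_marker[FRAME_MARKER_LEN] = {
    0xA5A5, 0x5A5A, 0xFFFF, 0x0000
};

/* Scan a freshly refilled buffer for the frame-start marker.
 * Returns the index of its first sample, or -1 if not found;
 * everything before that index would be discarded to re-align. */
ptrdiff_t find_frame_start(const uint16_t *buf, size_t nsamples)
{
    for (size_t i = 0; i + FRAME_MARKER_LEN <= nsamples; i++) {
        size_t j = 0;
        while (j < FRAME_MARKER_LEN && buf[i + j] == frame_marker[j])
            j++;
        if (j == FRAME_MARKER_LEN)
            return (ptrdiff_t)i;
    }
    return -1;
}
```

The downside is obvious: the host wastes CPU scanning every buffer, and the first partial frame after each restart is thrown away, which is why a hardware-side flush appeals to me more.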
Is there some way that I can synchronize the startup?
The logic block has an AXI-slave interface; I could make it mimic the axi-adc-core memory map and handle the start/stop commands by flushing the FIFO.
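The behavior I have in mind could be modeled like this. Everything here is a mock for illustration: the register offset, bit positions, and the simulated FIFO are invented and do not correspond to the real axi-adc-core register map:

```c
#include <stdint.h>
#include <stdbool.h>

#define REG_CONTROL 0x0044u        /* made-up control register offset */
#define CTRL_ENABLE (1u << 0)      /* start/stop capture */
#define CTRL_FLUSH  (1u << 1)      /* self-clearing FIFO flush */

static uint32_t ctrl_reg;          /* simulated control register */
static unsigned fifo_level = 123;  /* simulated stale samples in FIFO */

/* Simulated AXI-slave write: on FLUSH, drop residual FIFO contents so
 * that the next enable starts exactly at a frame boundary. */
void axi_write(uint32_t addr, uint32_t value)
{
    if (addr != REG_CONTROL)
        return;
    if (value & CTRL_FLUSH)
        fifo_level = 0;            /* discard stale data */
    ctrl_reg = value & ~CTRL_FLUSH; /* flush bit self-clears */
}

bool capture_enabled(void) { return (ctrl_reg & CTRL_ENABLE) != 0; }
unsigned fifo_fill(void)   { return fifo_level; }
```

The idea is that the driver's existing start command would set both bits in one write, so the DMA always begins on a clean, frame-aligned stream, with no changes needed on the libiio side.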
If this can't be done with IIO, I can always revert to a full-custom solution, but I like the libiio approach because it allows me to quickly develop applications on a networked PC.