I need some help on the following. This question was moved from another post to this one as the other post was answered and finalized.
A Blackfin (BF514F) processes a bunch of measurements and then transmits them at 3125000 bps to an ARM. The ARM then replies with an ACK message. The TX to the ARM works fine, but receiving from the ARM not so much. At 3125000 bps we have picked up that the DSP misses messages (CRC fail). With further investigation we have determined that some of the bytes received are corrupt, but only half of the byte (a nibble), and only a few bytes at a time, not all, and only sometimes. I have set the DMA to receive the whole packet to make sure it is not my 1-byte DMA setup that is causing this problem. We have also checked the clock rates on the scope and everything seems fine. Can you think of anything else that could cause this?
I have set breakpoints on UART errors to determine if it's a break error, a frame error, or even a DMA error, but I get no error breaks. We have also accidentally found that setting the DSP to 921600 bps and the ARM to 912600 bps makes the communication work fine (like I said, this worked by accident, or by typo, haha). But for our application to finish everything in time we will need 3125000 bps. We have used the exact same DSP and ARM on a previous project and it worked at 3125000 bps. The only difference was that the UART RX driver on the DSP was interrupt driven, not DMA driven.
Is there something in the DMA receive setup I'm missing? Could another variable be corrupting the DMA UART RX buffer? I have checked for that but can't trap or find it.
I think I know why the 921600/912600 bps combination works. I have seen that the DSP UART set to 921600 bps has an actual baud rate closer to 912600, so that is probably why it worked. But at 3125000 bps the DSP UART has almost no baud rate error. As far as we can see the ARM has a very good baud rate generator (64 times oversampling).
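To show where those numbers come from, here is a quick sketch of how I estimate the achievable baud rates, using the standard Blackfin UART divisor relation baud = SCLK / (16 × divisor). The 100 MHz SCLK below is just an assumed value for illustration; the real figure depends on our PLL setup.

```python
def actual_baud(sclk_hz, target_bps):
    """Nearest achievable Blackfin UART baud rate for a target rate.

    Blackfin UART divisor relation: baud = SCLK / (16 * divisor),
    where divisor is the 16-bit value in UARTx_DLL/DLH.
    """
    divisor = max(1, round(sclk_hz / (16 * target_bps)))
    actual = sclk_hz / (16 * divisor)
    error_pct = (actual - target_bps) / target_bps * 100.0
    return divisor, actual, error_pct

# Assumed 100 MHz SCLK -- substitute the real system clock.
for bps in (921600, 3125000):
    div, actual, err = actual_baud(100_000_000, bps)
    print(f"{bps} bps: divisor={div}, actual={actual:.0f}, error={err:+.2f}%")
```

With a 100 MHz SCLK the divisor for 3125000 bps comes out as an exact integer (zero error), while 921600 bps lands a few percent off target, which is the kind of asymmetry I am describing above.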
Any ideas would be welcome, thanks.