Pixel level timing jitter caused by non-deterministic timing
Prior to this point, the idle loop in main() was a simple loop, so the timing was deterministic and there was no interrupt jitter. Now that I have added additional code, the exact moment an interrupt handler can start is no longer deterministic. Jitter of a couple of clock cycles can cause a display like this.
Most other similar projects place their code in SRAM to avoid these types of issues, but I am low on SRAM and only want to write everything in standard C without compiler hackery.
I thought I had it figured out last night: I could use the timer-triggered DMA to write to a DMA register and enable another channel. Must be my lack of sleep. Of course the DMA caught this and flagged it as a Transfer Error, as it wasn't expecting itself to be a target. It is kind of too smart at being stupid.
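For the record, the doomed idea looked roughly like this. This is a sketch with mock register structs so it compiles and runs off-target; the real code would use the DMA1_Channelx definitions from the STM32F1 device header, and the channel/variable names here are stand-ins.

```c
#include <assert.h>
#include <stdint.h>

/* Mock of an STM32F1 DMA channel register block (CCR/CNDTR/CPAR/CMAR),
 * so this sketch compiles and runs off-target. */
typedef struct {
    volatile uint32_t CCR, CNDTR, CPAR, CMAR;
} DMA_Channel_Mock;

#define DMA_CCR_EN (1u << 0)

static DMA_Channel_Mock timer_ch;  /* channel paced by the timer       */
static DMA_Channel_Mock pixel_ch;  /* channel that should feed the SPI */

/* The word the timer-paced channel was meant to deposit into
 * pixel_ch.CCR (the enable bit, plus whatever configuration bits). */
static uint32_t ccr_enable_word = DMA_CCR_EN;

static void setup_doomed_transfer(void) {
    /* Source: the enable word in RAM.  Target: another DMA channel's CCR,
     * i.e. the DMA controller's own register space.  On the real chip the
     * controller refuses this with a Transfer Error instead of performing
     * the write. */
    timer_ch.CMAR  = (uint32_t)(uintptr_t)&ccr_enable_word;
    timer_ch.CPAR  = (uint32_t)(uintptr_t)&pixel_ch.CCR;
    timer_ch.CNDTR = 1;   /* one single write per timer event */
}
```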
¯\_(ツ)_/¯
I have something else to try, but it is going to be very ugly.
I didn't have much luck with the WFI (Wait For Interrupt) instruction, which puts the chip to sleep at the horizontal sync interrupt, only to wake up to start the SPI DMA.
The cunning plan was that Sleeping Beauty would see the SPI DMA as she opens her eyes, and no one else, after the eternal sleep. Unfortunately, the spell wasn't broken.
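The plan, sketched with the CMSIS __WFI() intrinsic stubbed out as a plain function so the flow can be followed (and run) off-target:

```c
#include <assert.h>

/* Stub for the CMSIS __WFI() intrinsic: on hardware the core sleeps here
 * until *any* enabled interrupt fires. */
static int woke_on_irq;
static int pending_irq;            /* 1 = hsync timer, 2 = PS/2, 3 = UART */

static void wfi_stub(void) { woke_on_irq = pending_irq; }

/* The hoped-for idle loop: park the core so that waking into the hsync
 * Timer IRQ always takes the same fixed latency.  The flaw that kept the
 * spell unbroken: PS/2 and UART interrupts also wake the core, so hsync
 * can still arrive while another handler is running -- jitter remains. */
static int idle_step(void) {
    wfi_stub();            /* sleep until an interrupt (ANY interrupt) */
    return woke_on_irq;    /* whichever IRQ woke us gets serviced */
}
```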
Somewhere along the line of rearranging code, the pixels aligned and the text was right again when things were idle. But the screen glitches when a PS/2 key is pressed, and merely rearranging code messes up the display at idle again. This is not a workable solution.
I tried using TIM3 Channel 3 Output Compare to trigger a single transfer on DMA channel 2. For some reason, it started the transfer without waiting for the compare. :( So it looks like I can't even use this to start the first transfer and follow up with IRQ-driven DMA block transfers to make things less timing sensitive.
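What the attempt looked like, with mock structs standing in for the STM32F1 TIM3/DMA1 register blocks (on the F103, TIM3_CH3 is one of the request sources for DMA1 channel 2; the bit masks below match the F1 reference manual). One hedged guess at the premature start, not verified here: if a CC3 request/flag is already pending when the channel is enabled, the transfer can fire immediately instead of waiting for the next compare match.

```c
#include <assert.h>
#include <stdint.h>

/* Mock TIM and DMA register blocks so the sketch compiles and runs
 * off-target. */
typedef struct { volatile uint32_t DIER, SR; } TIM_Mock;
typedef struct { volatile uint32_t CCR, CNDTR; } DMA_Ch_Mock;

#define TIM_DIER_CC3DE (1u << 11)  /* capture/compare 3 DMA request enable */
#define TIM_SR_CC3IF   (1u << 3)   /* capture/compare 3 match flag */
#define DMA_CCR_EN     (1u << 0)

static TIM_Mock   tim3;
static DMA_Ch_Mock dma1_ch2;

static void arm_oc_triggered_transfer(void) {
    tim3.SR  &= ~TIM_SR_CC3IF;     /* clear any stale compare flag first */
    tim3.DIER |= TIM_DIER_CC3DE;   /* route CC3 events to DMA1 channel 2 */
    dma1_ch2.CNDTR = 1;            /* a single transfer */
    dma1_ch2.CCR  |= DMA_CCR_EN;   /* only now enable the channel */
}
```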
Might have to come back to it, as I am out of crazy ideas right now.
Jitter free video

SUCCESS!

I used my sleeping beauty routine, but implemented it differently. Basically, I trigger a PendSV (pendable service) exception right after all the registers for the SPI DMA are set up. PendSV is given the second-highest interrupt priority.

Upon entering the PendSV handler, all the lower-priority IRQs, e.g. UART and PS/2, are disabled. Then:
- It clears the interrupt and goes to sleep. The Timer IRQ for the SPI DMA is the only one that can wake it up and get serviced. Sleep is pretty deterministic. :)
- The Timer IRQ that wakes up the ARM has the highest priority and will be serviced next.
- The lower-priority IRQs get re-enabled. These probably execute after the Timer IRQ returns.
The IRQs for the peripherals still get a chance to run once the Timer IRQ finishes. This means the rest of the code no longer interferes with the VGA timing, and there are no glitches when a key is pressed like before.
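Put together, the sequence looks roughly like this. It is a sketch with the CMSIS/NVIC calls replaced by plain variables so the control flow can be followed (and run) off-target; on the real chip these would be NVIC_DisableIRQ()/NVIC_EnableIRQ() and the __WFI() intrinsic, and the variable names are stand-ins, not real registers.

```c
#include <assert.h>

static int uart_irq_enabled = 1;   /* stand-ins for NVIC enable bits */
static int ps2_irq_enabled  = 1;
static int spi_dma_started  = 0;
static int slept_with_irqs_masked = 0;

/* Stubbed WFI: on hardware the core sleeps here and, with the jittery
 * sources masked, only the hsync Timer IRQ can wake it -- so the wake-up
 * (and the SPI DMA start inside that IRQ) happens a fixed number of
 * cycles after the timer event. */
static void wfi_stub(void) {
    slept_with_irqs_masked = !uart_irq_enabled && !ps2_irq_enabled;
    spi_dma_started = 1;   /* simulate the Timer IRQ kicking the DMA */
}

/* PendSV handler body: pended at the 2nd-highest priority right after
 * the SPI DMA registers are set up. */
static void pendsv_handler_sketch(void) {
    uart_irq_enabled = 0;  /* mask the jitter sources */
    ps2_irq_enabled  = 0;
    wfi_stub();            /* deterministic sleep until the Timer IRQ */
    uart_irq_enabled = 1;  /* peripherals get serviced after this */
    ps2_irq_enabled  = 1;
}
```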
Reference material:
A Beginner’s Guide on Interrupt Latency - and Interrupt Latency of the ARM® Cortex®-M processors
EmbeddedGurus: What’s the state of your Cortex? explains what PendSV is used for.