A normal motion looks like this:
Most PC gamers know that with V-sync off, tearing may occur in some frames:
The tearing artifact is caused by framebuffer swapping: if the buffers are swapped while the monitor is still scanning out the front framebuffer, old content is mixed with new content on screen, and a tear appears.
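To make the mechanism concrete, here is a toy Swift simulation, not real graphics code: buffer contents are just frame numbers, and the "display" reads the front buffer row by row. A swap that lands mid-scan-out leaves the top rows showing the old frame and the bottom rows the new one.

```swift
// Toy model of scan-out: the display reads the front buffer one row at a
// time; a swap in the middle of scan-out mixes two frames on screen.
func scanOut(frontBuffer: [Int], swapAtRow: Int?, nextFrame: [Int]) -> [Int] {
    var front = frontBuffer
    var screen: [Int] = []
    for row in 0..<front.count {
        // The framebuffer swap happens while rows are still being read.
        if row == swapAtRow { front = nextFrame }
        screen.append(front[row])
    }
    return screen
}

let oldFrame = Array(repeating: 1, count: 8)   // frame 1 in every row
let newFrame = Array(repeating: 2, count: 8)   // frame 2 in every row

// Swap mid-scan-out: top rows come from frame 1, bottom rows from frame 2.
print(scanOut(frontBuffer: oldFrame, swapAtRow: 5, nextFrame: newFrame))
// [1, 1, 1, 1, 1, 2, 2, 2]  <- the visible "tear" at row 5
```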
Recently I encountered another type of synchronization problem in a work project. I was developing an animation module for a mobile map app, which involves these steps (sketched in code after the list):
- A C++ animation timer fires callbacks at 100 Hz, updating the map camera's state
- The iOS adaptation layer also receives the callback and sets a flag requesting a refresh
- The app's CADisplayLink render method, running at a fixed 60 Hz, checks the flag and renders a new map frame
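Here is a minimal Swift sketch of this pipeline as I understand it; the names (`MapAnimator`, `drawMap`, the `Double` camera stand-in) are hypothetical, not the project's actual API. The key detail is that the render method reads whatever camera state the 100 Hz timer last wrote, which may be anywhere from 0 to 10 ms old.

```swift
import Foundation
import QuartzCore

// Sketch of the original (stuttering) pipeline.
final class MapAnimator: NSObject {
    private var cameraState: Double = 0      // stand-in for the map camera
    private var needsRefresh = false
    private var displayLink: CADisplayLink?
    private var timer: Timer?

    func start() {
        // Steps 1 and 2: a 100 Hz timer updates the camera and flags a refresh.
        timer = Timer.scheduledTimer(withTimeInterval: 1.0 / 100.0, repeats: true) { [weak self] _ in
            self?.cameraState += 0.01        // advance the animation
            self?.needsRefresh = true        // adaptation layer sets the flag
        }
        // Step 3: CADisplayLink fires at the display's refresh rate (60 Hz here).
        displayLink = CADisplayLink(target: self, selector: #selector(render))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc private func render(_ link: CADisplayLink) {
        guard needsRefresh else { return }
        needsRefresh = false
        // Renders whatever camera state the 100 Hz timer last wrote;
        // its age varies from frame to frame.
        drawMap(camera: cameraState)
    }

    private func drawMap(camera: Double) { /* issue the actual draw call */ }
}
```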
While debugging I found the animation stuttered slightly, much like the example above but without the tearing:
The problem is that the timer in step 1 and the rendering in step 3 are not synchronized, so by the time the map is rendered, the camera state is stale by a varying amount: equal render intervals display unequal amounts of motion, which reads as stutter. To fix this, simply defer the camera update to render time, and the animation becomes really smooth.
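The fix, sketched below under the same hypothetical names: drop the 100 Hz timer's write entirely and make the camera a pure function of time, sampled inside the CADisplayLink callback at the frame's own presentation time. `AnimationCurve` and its linear motion are illustrative assumptions, not the project's real animation model.

```swift
import Foundation
import QuartzCore

// Camera state as a pure function of time; sampling it at render time
// guarantees each frame shows exactly the right amount of motion.
struct AnimationCurve {
    let startTime: CFTimeInterval
    let speed: Double
    func cameraState(at time: CFTimeInterval) -> Double {
        return speed * (time - startTime)
    }
}

final class SmoothMapAnimator: NSObject {
    private var curve: AnimationCurve?
    private var displayLink: CADisplayLink?

    func start() {
        curve = AnimationCurve(startTime: CACurrentMediaTime(), speed: 0.5)
        displayLink = CADisplayLink(target: self, selector: #selector(render))
        displayLink?.add(to: .main, forMode: .common)
    }

    @objc private func render(_ link: CADisplayLink) {
        guard let curve = curve else { return }
        // No 100 Hz timer and no stale flag: the camera is evaluated for the
        // exact time this frame will be presented (link.targetTimestamp).
        drawMap(camera: curve.cameraState(at: link.targetTimestamp))
    }

    private func drawMap(camera: Double) { /* issue the actual draw call */ }
}
```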
Similarly, turning V-sync on in games introduces its own problem: a frame that misses a refresh deadline is held until the next one, causing stutter and input lag. This is where techniques like Nvidia G-Sync, which let the monitor refresh as soon as a frame is ready, come in handy.