Currently there are two options when playing video games: use vsync, which causes frame rate stuttering if the game renders more slowly than the monitor's refresh rate, or don't use vsync, which causes image tearing (unless the game happens to render at exactly the same framerate as the monitor's refresh rate, which is practically impossible without vsync).
Frame stuttering happens when the game keeps jumping between, e.g., a 60 Hz and a 30 Hz frame rate, because it actually renders at something in between those two while syncing to the monitor's 60 Hz refresh rate. The same thing happens, only worse, if the rendering speed drops below 30 Hz. Frame tearing, on the other hand, causes annoying horizontal artifacts when the image is updated while the screen is still refreshing. The position of the tear tends to jump around the screen on each refresh, which makes the effect even worse. (And if the game renders faster than the monitor's refresh rate, you'll get multiple tears per refresh.)
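To make the quantization concrete, here's a minimal sketch (plain Python, not tied to any graphics API, and assuming a constant render time per frame) of how vsync rounds the effective frame rate on a 60 Hz monitor:

```python
# Rough sketch of vsync frame-rate quantization on a 60 Hz monitor.
# With vsync a frame can only appear on a refresh boundary, so its effective
# display time is its render time rounded *up* to a multiple of 1/60 s.
import math

REFRESH = 1.0 / 60.0  # refresh interval of a 60 Hz monitor (~16.7 ms)

def effective_fps(render_time_s):
    # How many whole refresh intervals the frame ends up occupying on screen.
    refreshes = max(1, math.ceil(render_time_s / REFRESH))
    return 1.0 / (refreshes * REFRESH)

for fps in (75, 60, 55, 45, 31, 29):
    print(f"renders at {fps:2d} FPS -> displayed at {effective_fps(1.0 / fps):.0f} FPS")
```

Anything between 30 and 60 FPS gets rounded down to 30, anything between 20 and 30 down to 20, and so on, which is why the output bounces between those rates as the real render time fluctuates around a boundary.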
The optimal situation, of course, would be the game rendering at least as fast as the monitor's refresh rate (usually 60 Hz nowadays) with vsync turned on. That way you get neither problem: the game is smooth and free of tearing artifacts. However, this is hard to achieve with modern resource-intensive games, at least if you want to bump up the rendering quality (or, for that matter, with older games if your computer is too slow).
An even better solution to this problem is coming in the near future with newer monitors. The idea is that rather than the graphics card having to sync with the monitor, the dependency is reversed: the monitor syncs with the graphics card. This way the monitor displays each frame immediately when it's ready (regardless of how long the graphics card took to render it), so you never get screen tearing, and frame stuttering is minimized (it comes down solely to how fast the graphics card can render the image). In other words, it will be exactly like playing with vsync turned off, but without the screen tearing. However, we are not there yet.
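Here's a tiny illustration of the difference (just Python timing arithmetic, not any real API; the render times are made up, and renderer back-pressure as well as the minimum/maximum refresh limits of real panels are ignored): with a fixed 60 Hz refresh a finished frame has to wait for the next refresh boundary, whereas a variable-refresh monitor shows it the moment it's done.

```python
# Toy comparison: when does a finished frame actually appear on screen?
import math

REFRESH_MS = 1000.0 / 60.0            # ~16.7 ms refresh interval of a 60 Hz panel
render_ms = [14, 18, 22, 15, 19]      # made-up per-frame render times (ms)

ready = 0.0
prev_fixed = prev_var = 0.0
for i, ms in enumerate(render_ms, 1):
    ready += ms                                          # moment the frame is finished
    fixed = math.ceil(ready / REFRESH_MS) * REFRESH_MS   # next refresh boundary after that
    print(f"frame {i}: ready at {ready:5.1f} ms | "
          f"fixed 60 Hz shows it at {fixed:6.1f} ms (gap {fixed - prev_fixed:4.1f} ms) | "
          f"variable refresh shows it at {ready:5.1f} ms (gap {ready - prev_var:4.1f} ms)")
    prev_fixed, prev_var = fixed, ready
```

On the fixed-refresh side the 22 ms frame turns into a 33 ms gap between displayed frames (a visible hitch), while on the variable-refresh side the only variation the viewer sees is the actual variation in render times.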
What puzzles me, however, is most people's attitude towards the stuttering vs. tearing dilemma.
Personally, I have a hard time even noticing frame stuttering. If I look really closely I can see it, but if I'm not paying attention I don't even notice it. Even when I do notice it, it hardly bothers me at all, for some reason. Screen tearing, however, annoys me a lot. I just can't stand it. (This is true both in games and in movies, if the media player happens not to vsync for some reason.) It annoys me so much that I have a hard time playing a game that has no vsync option (fortunately that's very rare).
However, if you search the internet, the majority view seems to be the exact opposite: screen tearing hardly seems to bother anybody, while frame stuttering annoys the hell out of people. (Some people even write comments along the lines of "why do games even offer vsync anymore? It makes games run like crap. Who cares about tearing, it doesn't bother me.")
I don't really understand why I seem to be in the minority on this. I actually have a hard time telling whether a game is frame-stuttering, but tearing is so glaringly obvious that it's irritating to the point of making such a game difficult to play.
There is also a third option, a technique that can alleviate (if not completely remove) stuttering, even without support for G-Sync or the like: triple buffering.
This technique is most useful if the game runs on average at around 60 FPS, maybe a bit higher, but regularly drops below that critical line. With basic vsync this means that the framerate likewise regularly drops to 30 FPS, while at other times it sits at 60 FPS. This can introduce visible stuttering that's exacerbated by being intermittent rather than constant.
If the average framerate is slightly over 60 FPS, and the frame-to-frame variance in framerate is not enormous, triple buffering can get rid of the stuttering completely and make the game run at a constant 60 FPS. It's a simple trick, but it can work like magic.
In essence, rather than showing frames immediately when they have been rendered, triple buffering gives the game a "queue" of three frames: the one currently shown on screen, the next frame (which has usually already been fully rendered), and a third frame currently being rendered. At each vsync the game starts showing that second frame and begins rendering a new third frame at the end of the "queue".
What this does is give the display a bit of leeway: if the rendering temporarily drops below 60 FPS, there is still a fully rendered frame to show while the game renders the next one. In other words, if the rendering speed drops slightly below 60 FPS, the "queue" shrinks to two frames, but that's still enough to keep the output at 60 FPS as long as the display doesn't catch up with the renderer. As long as the rendering gets back above 60 FPS soon enough, the displayed frame rate is never interrupted. (If the rendering drops only very slightly below 60 FPS, say to 58 or 59 FPS, this can keep going for around a second before the display catches up with the renderer; at 55 FPS it's more like a few tenths of a second.) Once the rendering goes back above 60 FPS, the renderer starts "filling up" the queue again and gets ahead of the display once more.
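Here's a toy simulation of the scheme as described above (plain Python, not any real graphics API; the bookkeeping is my own sketch of the three-buffer FIFO queue): one frame is on screen, up to two more can be queued, the renderer works ahead whenever a buffer is free, and a "repeated frame" is counted whenever a refresh arrives with nothing new to show.

```python
# Toy timing simulation of a three-buffer FIFO queue feeding a 60 Hz display.
REFRESH = 1.0 / 60.0  # refresh interval (~16.7 ms)

def repeated_frames(render_time, refreshes=120):
    ready = [0.0]            # one finished frame already waiting in the queue
    in_flight = render_time  # completion time of the frame currently being rendered
    repeats = 0              # refreshes where the previous frame had to be shown again

    for k in range(1, refreshes + 1):
        now = k * REFRESH
        # Let the renderer finish (and start) frames up to this refresh.
        while in_flight is not None and in_flight <= now:
            ready.append(in_flight)
            if len(ready) < 2:               # a spare buffer is still free
                in_flight += render_time     # start rendering the next frame right away
            else:                            # both spare buffers occupied
                in_flight = None             # renderer has to wait for a flip
        # Vertical refresh: flip to the oldest queued frame, if any.
        if ready:
            ready.pop(0)
            if in_flight is None:
                in_flight = now + render_time  # renderer resumes on the freed buffer
        else:
            repeats += 1
    return repeats

for fps in (65, 58, 55, 45):
    print(f"renders at {fps} FPS -> {repeated_frames(1.0 / fps)} repeated frames "
          f"out of 120 refreshes (2 seconds at 60 Hz)")
```

In this model the queue drains at roughly (60 − R) frames per second when the game renders at R FPS, and there's only about a frame or two of slack to drain, so a dip to 58 or 59 FPS is absorbed for around a second, while a drop to 55 FPS only buys a few tenths of a second before the first repeated frame.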
This is not just theoretical or extremely niche. It actually works, and it works beautifully. (Of course the game has to support this.)
The downside of this technique is that it introduces a latency of two frames (about 33 ms at 60 Hz) between the input and its effect on screen. For normal players this is practically unnoticeable; only pro FPS players and other hardcore players of their caliber will notice the latency.