
Misconceptions about (video game) framerates

For some time now there has been a kind of "mini-controversy" in video games related to frame rates.

Most "hard-core" PC gamers (and to an extent some console gamers as well) are not happy with anything less than 60 frames per second. It's kind of the absolute minimum. If the framerate drops below that, it's quite a big deal. Many people immediately notice the framerate drop, especially when using vsync. (Personally I don't notice it that much, nor does it bother me all that much, but I can certainly understand.)

On consoles, especially in the 7th generation (i.e. the Xbox 360 and the PS3), 30 frames per second was (well, still is, as of writing this, since the generation has not yet completely died out) pretty much the universal standard, with only a few exceptions. The reason for this is that most game developers want to enhance the visual quality of their games, which naturally takes longer to render, so it's done at the cost of framerate. Thus 30 FPS (which is half of the most common TV/monitor refresh rate of 60 Hz) has been seen as a completely acceptable compromise between rendering speed and quality.

Sometimes the 30 FPS limit "leaks" to the PC, in the form of games that have been primarily developed for consoles, with a PC port being just a secondary goal. The thing is that some games have been internally designed (in their implementation) to run on a 30 FPS system, and might not even work properly on a faster system. For example, in some games the physics engine goes haywire if the game is run at 60 FPS, because it has been designed for 30. In others, e.g. character animations are designed for (or captured at) 30 FPS and won't work properly at higher framerates. (There are actual examples of both, but I don't remember names right now.)
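To illustrate how this kind of frame-rate dependence typically creeps in, here is a minimal, hypothetical C++ sketch (the names and numbers are made up for illustration, not taken from any real engine): one update function bakes in the assumption of 30 frames per second, the other scales by the elapsed time and behaves the same at any framerate.

```cpp
// Hypothetical sketch of frame-rate-dependent vs. frame-rate-independent logic.
// Struct and function names are illustrative only.

struct Player {
    float position = 0.0f;
    float velocity = 0.0f;
};

// Tuned under the assumption that it runs exactly 30 times per second.
// At 60 FPS the "per-frame" gravity is applied twice as often, so the
// character falls roughly twice as fast and the physics feels wrong.
void updateAssuming30Fps(Player& p) {
    p.velocity += 0.3f;      // gravity per frame, tuned for 30 FPS
    p.position += p.velocity;
}

// Scales everything by the time elapsed since the last frame (dt, in seconds),
// so it behaves the same at 30, 60, or 144 FPS.
void update(Player& p, float dt) {
    const float gravity = 9.0f;   // units per second squared
    p.velocity += gravity * dt;
    p.position += p.velocity * dt;
}
```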

The easiest way to "fix" the problem when porting a game to the PC is to simply artificially cap the frame rate to 30 FPS. This annoys many PC gamers, whose computers would be more than powerful enough to run the game at well over 60 FPS, and who can't stand the lower framerate.
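Such a cap is trivial to implement, which is part of why it's such a tempting shortcut. A minimal sketch, assuming a simple single-threaded game loop (real engines use more precise timing or just rely on vsync rather than sleeping):

```cpp
// Minimal sketch of an artificial 30 FPS frame cap.
#include <chrono>
#include <thread>

void runGameLoop() {
    using clock = std::chrono::steady_clock;
    const auto frameBudget = std::chrono::milliseconds(33); // ~1/30 of a second

    while (true) {                         // runs until the game exits
        const auto frameStart = clock::now();

        // updateGame();  renderFrame();   // placeholders for the actual work

        // If the frame finished early, idle away the rest of the budget
        // so the game never runs faster than ~30 FPS.
        const auto elapsed = clock::now() - frameStart;
        if (elapsed < frameBudget)
            std::this_thread::sleep_for(frameBudget - elapsed);
    }
}
```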

The framerate discussion has only heated up with the new generation of consoles. There's great dispute over whether games for them should aim for full 60 frames per second, or whether 30 is still ok. Some game companies have announced that they will aim for the former, while others have kept the speed/quality tradeoff even on the newer consoles and aim for 30. Basically they want the game to look as good as possible, at the cost of the framerate.

Anyway, after this really long introduction to the subject, I can finally get to the actual point, which is the misconceptions surrounding framerates. People (even some game developers) who defend the 30 FPS limit sometimes have all kinds of misconceptions about it.

There's a widely held misconception that 24 frames per second is the maximum the human eye can discern, and that anything higher adds nothing. Thus, they say, 30 FPS is more than enough, and anything higher is useless.

This misconception comes from movies using 24 frames per second. The thing is, that number was not chosen because anything higher makes no difference. It was chosen because it's approximately the minimum frame rate that gives a convincing-enough illusion of continuous movement, especially when combined with motion blur. Originally the frame rate had to be kept as low as possible, both because film stock was very expensive and because a single reel could only hold so much of it. If, for example, 10 FPS had been enough to give the illusion of smooth motion, they would have used that.

However, the human brain is perfectly capable of clearly seeing the difference between 24 FPS and higher framerates. For example, if you watch a video at 30 FPS and then the same video at 60 FPS, there is a very clear difference. (In fact, if you watch the 60 FPS video for a while and then switch back to the 30 FPS video, the latter will actually start looking a bit stuttery in comparison.)

The difference is only accentuated with crisp, high-resolution images, which is exactly what computer games produce (even when they artificially add motion blur).

Another misconception (or rather, excuse) that some game developers have is that 30 FPS makes a game feel more "cinematic". However, this seems completely spurious. I don't think any gamer would describe a lower-framerate gaming experience as "cinematic". It's just stuttery. And in any case, a video game is not a movie.
