
Why are so many computer programs so slow?

Launch almost any program installed on your computer. What's the first thing you notice? It takes at the very least a significant fraction of a second to show on screen; with some programs it may take several seconds. In some cases this happens even with programs that are extremely simple and small. A few are so slow to start that they actually have loading screens. Yeah, loading screens for applications.

Why do they take so long to launch, rather than appearing literally instantaneously on screen the millisecond you launch them?

How about using them? Some programs are really responsive, with no discernible lag between a keypress or mouse click and the action it's supposed to trigger. Others, however, lag noticeably. In the worst cases it's so bad that if you, for example, type text fast enough, the program falls behind and can't keep up with your typing. Clicking on a menu option that opens a dialog may have an inexplicable delay before it shows up. In many drawing programs, if you free-hand draw something (with the mouse or a drawing tablet) fast enough, the program might not be able to keep up, and the result won't correspond to what you drew. (In fact, this is one of the tests that drawing program reviewers perform to see how responsive a program is.)

There is no reason why any of these programs should be so laggy. Most programs should launch instantaneously, and be responsive in the millisecond range, not in the tenth-of-a-second range.

Consider video games. Most of them do take a significant amount of time to load (which in many cases is justified, given the literal gigabytes of data they need to load from disk to RAM and the amount of processing they have to do immediately afterwards), but the gameplay itself is most often extraordinarily efficient. Sure, graphics hardware helps them draw things on screen really fast, but there are tons of other things the game has to do between every frame, and it has only an extremely limited amount of time to do them: if the game is running at 60 frames per second, it has only about 16 milliseconds to perform all the calculations needed to render the next frame. Many games have to do extraordinary amounts of work, on the CPU, every frame: physics simulation, updating moving scene geometry (which may require changing a rather large amount of data, e.g. in the case of particle effects), AI, and so on. And they have only those 16 milliseconds (or even less, if your framerate is higher) to do it. Yet they usually manage, and even have time to spare (with many games and modern CPUs, the CPU may not even get warm, as the game only uses something like 10-20% of it for all its calculations).
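To make that arithmetic concrete, here's a minimal sketch (in C++, not taken from any particular engine; the function names are made up for illustration) of the frame budget a 60 FPS game loop works against:

```cpp
#include <chrono>
#include <cstdio>

// Hypothetical per-frame work; stands in for physics, AI, geometry updates, etc.
void simulateFrame() { /* ... */ }

int main()
{
    using clock = std::chrono::steady_clock;
    // At 60 frames per second the whole frame budget is 1000 ms / 60 ≈ 16.7 ms.
    const double frameBudgetMs = 1000.0 / 60.0;

    for (int frame = 0; frame < 600; ++frame)  // roughly 10 seconds of gameplay
    {
        auto start = clock::now();
        simulateFrame();
        auto end = clock::now();

        double elapsedMs =
            std::chrono::duration<double, std::milli>(end - start).count();

        // If the work doesn't fit in the budget, the player sees a stutter.
        if (elapsedMs > frameBudgetMs)
            std::printf("frame %d missed its %.1f ms budget (%.2f ms)\n",
                        frame, frameBudgetMs, elapsedMs);
    }
}
```

Everything the game does on the CPU has to fit inside that loop body, every single frame.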

If video games are able to do so much calculation in 16 milliseconds or less, what's the deal with utility applications sometimes showing considerable lag for things that ought to be much simpler than what a game has to do?

We have supercomputers on our desktops: machines that are literally, without exaggeration, millions of times faster and more efficient than what people used in the 80s. Yet most programs in the 80s felt more responsive than many programs today. Sure, screen resolution and color depth have increased enormously, but that doesn't explain it. Video games cope with your 4K display at 60 Hz just fine. Why can't utility applications?

The reason is that the vast majority of software out there is not written with any sort of optimization in mind. Most developers, especially at big software companies, don't care about optimization and efficiency. They prefer the easy and safe route: they use all kinds of scripting languages because they are easier and safer, and don't care how inefficient the language may be. And even within those languages they don't care one iota about optimization. Things that could be done faster and with less memory, they don't even bother optimizing. As long as it works, that's enough for them, no matter how slow it might be.

The inefficiency runs at every level. They don't use the most efficient algorithms for a given task; they don't use the most efficient data containers (in terms of speed and/or memory) for a given task; they don't care how much memory or disk space something takes and never optimize it to take less, even when it would be perfectly possible; and they are extremely wasteful with memory and processing time. Rather obviously, they don't care about low-level optimization either, such as CPU cache usage efficiency, as the sketch below illustrates.
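As one small, made-up illustration of what "the most efficient data container" and "CPU cache usage" mean in practice: summing the same numbers from a contiguous std::vector is typically several times faster than from a node-based std::list, simply because the vector's elements sit next to each other in memory and the cache can prefetch them. This is only a sketch, but it's the kind of choice that gets ignored all the time:

```cpp
#include <chrono>
#include <cstdio>
#include <list>
#include <numeric>
#include <vector>

// Times how long it takes to sum every element of a container.
template <typename Container>
double sumTimeMs(const Container& c)
{
    auto start = std::chrono::steady_clock::now();
    volatile long long sum = std::accumulate(c.begin(), c.end(), 0LL);
    (void)sum;  // keep the compiler from optimizing the work away
    auto end = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(end - start).count();
}

int main()
{
    const int n = 10'000'000;

    // Same data, two layouts: contiguous (cache-friendly) vs node-based.
    std::vector<int> contiguous(n, 1);
    std::list<int> nodes(n, 1);

    std::printf("vector: %.2f ms\n", sumTimeMs(contiguous));
    std::printf("list:   %.2f ms\n", sumTimeMs(nodes));
}
```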

There are probably many things that could be done in one tenth of the RAM and ten times faster... but they don't bother. Many developers working for these companies might well not even be aware that the thing can be done much more efficiently. They are extremely wasteful in memory usage and in the way things are calculated, and they don't care. They might not even know any better. It's the easy and lazy way.
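For a contrived but representative example of the kind of "ten times faster" win that often goes unclaimed: removing duplicates by rescanning everything seen so far is quadratic, while doing the same job with a hash set is roughly linear. Both versions below produce the same result; for any realistically sized input, the difference in speed is dramatic.

```cpp
#include <algorithm>
#include <unordered_set>
#include <vector>

// Naive approach: for each element, scan everything kept so far -> O(n^2).
std::vector<int> uniqueNaive(const std::vector<int>& input)
{
    std::vector<int> result;
    for (int value : input)
        if (std::find(result.begin(), result.end(), value) == result.end())
            result.push_back(value);
    return result;
}

// Same result using a hash set for lookups -> roughly O(n), typically orders
// of magnitude faster once the input grows past a few thousand elements.
std::vector<int> uniqueFast(const std::vector<int>& input)
{
    std::vector<int> result;
    std::unordered_set<int> seen;
    for (int value : input)
        if (seen.insert(value).second)
            result.push_back(value);
    return result;
}
```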

Game engines can't afford this kind of laziness and inefficiency, which is why they are constantly being optimized with the most efficient algorithms, data containers and low-level techniques, squeezing every single clock cycle to hit that golden 16-milliseconds-or-less budget for calculating everything in every frame. That's also why game studios usually hire programmers who actually know what they are doing.

Your average word processor or spreadsheet? Not so much. They hire whoever they feel like, and don't care if the resulting code is literally thousands of times slower and more memory-hungry than it would need to be.
