Why is RTX experiencing abysmally slow adoption?

For some time now I have been contemplating purchasing an RTX card. The thing is... Why should I, really? At the time of writing this there are, in practice, only three games with decent RTX support.

I'm not counting Shadow of the Tomb Raider, because the only thing RTX is used for in that game, in any way, shape or form, is shadows, and they don't make a particularly prominent difference. I'm also not counting old games with remade engines that use RTX, such as Quake II or Minecraft. Nor am I counting Final Fantasy XV, because it uses RTX even less prominently than the Tomb Raider game.

That leaves, in practice, only three games so far with actually decent RTX support: Battlefield V, which I'm not really interested in (from what I hear it has a short and uninteresting single-player campaign, I'm not interested in multiplayer, and it's an SJW game, so fuck them); Control, which is an Epic Store exclusive (so fuck them too; I'm not buying it from there, they can go fuck themselves); and Metro Exodus.

So that leaves only one game with decent RTX support that I would actually be interested in playing. Does it really make any sense to spend upwards of 1000€ for one single game?

Of course there are upcoming games with RTX support. A few of them. A couple I'm actually looking forward to, such as Cyberpunk 2077 and Doom Eternal. But those are still some way off. It may make more sense to wait for the next generation of Nvidia cards (which are rumored to arrive by the end of next year) and contemplate a purchase then.

But this raises a rather puzzling question: Why has RTX adoption been so abysmally slow?

The GeForce 20 series cards were released in September 2018. That's over a year ago now. There are, essentially, just three games so far with any kind of actually decent support. (The rest offer either extremely minor and mostly inconsequential support, remade rendering engines for a couple of existing old games, or a few tech demos.)

Three. At the very most five, if we are really lenient in our definitions.

I can't think of any new piece of mainstream gaming technology that has had an adoption rate this slow. No game console has had adoption this slow. If a game console had five games and a few tech demos by the one-year mark, it would be considered a total failure and a disaster. Even the VR headsets had more actual games by the one-year mark than RTX does.

The only saving grace of the RTX cards is that they are "backwards compatible": they can still be used to play regular old (and new) games, and are a bit faster than their previous-gen counterparts.

Comments

  1. But it's kind of niche, isn't it?

    It's not generally available as an OEM option*, except in PCs advertised as gamer machines, so it has to be bought after market.

    Worse still, most people don't even bother with any card and just run the games they can on integrated graphics - and buy a console for the ones they can't.

    And even for those gamers that might be interested, does the improvement justify the cost?

    If there aren't games coming with RTX support, it's because gamers aren't demanding it.

    *from a short perusal of HP and Dell

