This is all just rumors at this point, but they are the sort of rumors that have historically tended to be right. Anyway, it appears that AMD has, at least for the near future, shown no interest in adding hardware raytracing support to their graphics chips and cards. I suppose they don't want to simply jump on the bandwagon and copy what their biggest rival is doing, and instead want to wait and see how the technology develops, and how well the gaming industry starts supporting it.
I think this might be something of a mistake. Perhaps not so much from AMD's own point of view, but directly or indirectly from the point of view of Microsoft and Sony.
You see, for one reason or another (I don't really know why; perhaps because of getting very good business deals) both Microsoft and Sony have partnered with AMD to provide the graphics chips for their consoles. (Both the PS4 and the Xbox One lines of consoles have slightly customized AMD graphics chips.) It's almost completely certain that any upcoming consoles (about which there are already strong rumors) will once again have AMD graphics chips in them.
Now, one aspect for which consoles are often derided is that they always seem to be behind the curve in graphical prowess compared to the average gaming PC, not even just the most expensive top-of-the-line ones. When these consoles are released, in terms of graphical and computational prowess they tend to sit in the mid-to-upper tier compared to typical gaming PCs, but they fall below average in just a year or two.
Well, here there would be a golden opportunity to make a leap ahead. If AMD were to implement strong hardware raytracing support in their upcoming graphics chips, especially the ones to be used in the next generation of consoles, it could be a real selling point.
Currently there are rumors (likewise quite likely to be true) that Nvidia's RTX 20 series of cards will have hardware raytracing support only on the high-end cards, i.e. the 2070, 2080 and 2080 Ti, while the mid-to-low-end cards (2060, 2050 and lower) will either have very limited raytracing support (i.e. only a fraction of the RT cores found in the high-end cards) or no support at all (i.e. no RT cores whatsoever).
If this is true, it means that for years to come hardware raytracing will be pretty much exclusive to enthusiasts, not mid-range gamers, and that only a small fraction of all gaming PCs out there will have raytracing support.
So what a golden opportunity for consoles! Imagine the PlayStation 5, and whatever the next Xbox console might be, having very strong hardware raytracing support from the get-go! Suddenly consoles would offer something that your average gaming PC doesn't!
Suddenly games would actually look better on console than on PC! Probably for years to come.
But no. AMD will miss the opportunity, and with them, Sony and Microsoft. Such a golden opportunity wasted. What a shame.