Maybe Skip This Generation

Yesterday was the big keynote from GTC… a conference that Nvidia essentially made up in order to have a venue in which to sell its graphics cards. One of the hot debates from yesterday was whether or not CEO Jensen Huang was an AI character rendered in real-time… seeing as last generation he did the entire demo in a virtual environment. The larger talking point, however, was the price tag associated with this generation. As is often the case, Nvidia focused entirely on the highest end of its graphics card lineup, namely the 4080 and 4090. For those who don’t remember, the “90” series came on board last generation and has effectively replaced the Titan nomenclature for extremely high-end cards that are not necessarily targeted at gaming. The products announced yesterday:

  • GeForce RTX 4090 24GB – $1599.99 MSRP
  • GeForce RTX 4080 16GB – $1199.99 MSRP
  • GeForce RTX 4080 12GB – $899.99 MSRP

From there you can expect board partners to release variants, ranging from lower-RAM versions that will likely be cheaper to cards with extra features that will cost more than the Founders Edition cards. One thing that will be interesting to see, however, is how the lineup of third-party cards shakes out now that EVGA has decided to stop producing Nvidia cards… and graphics cards entirely. Based on some very terse comments released around that news… it seems that board partners often lose money on the higher-end graphics cards due to the chip costs set by Nvidia and the price ceiling placed on each product family.

If you compare the pricing of the last several generations, you can see that the lowest-end version of the 4080 is an almost 30% increase over the MSRP of the previous generation. Unfortunately, as we all know too well, it was almost impossible to find a graphics card during most of that generation for anywhere close to that price point. The pandemic happened and made the market go wonky… with supply chain issues followed by increased demand brought on by a boom in gaming. This was only compounded by the fact that so many sat out the 2000 series completely due to a similarly steep 16% price increase over the previous generation.
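For concreteness, those generational jumps can be checked with a quick back-of-the-envelope calculation. The prices below are the widely reported US launch MSRPs for the lowest-priced xx80-class card of each generation (Founders Edition premiums set aside), with the 4080 12GB treated as the entry-level 4080 per yesterday's announcement:

```python
# Launch MSRPs (USD) for the entry-level xx80-class card, per generation.
msrps = [
    ("GTX 1080", 599),
    ("RTX 2080", 699),
    ("RTX 3080", 699),
    ("RTX 4080 12GB", 899),
]

# Percentage increase from each generation to the next.
for (prev_name, prev_price), (name, price) in zip(msrps, msrps[1:]):
    increase = (price - prev_price) / prev_price * 100
    print(f"{prev_name} -> {name}: {increase:+.1f}%")
```

That works out to roughly a 16.7% jump for the 2000 series and 28.6% for the 4000 series, which matches the "16%" and "almost 30%" figures above.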

I lucked into buying a reasonably priced prebuilt system for my birthday last year. It was a good call for me personally because I needed to completely refresh my system, as I was still on a 5th-gen Intel platform. However, at least part of my logic behind the purchase was that if anything went really wrong, I could at a minimum flip the graphics card and make more money than I paid for the system. At that point, I had checked eBay and 3080s were selling through at around $2500 each. However, a lot of things have changed since then. Firstly, the supply chain issues have cleared up a bit, and demand for chips has lessened to the point where most card manufacturers have cards in stock. Combine this with some very public crashes in cryptocurrency and the recent move of Ethereum from proof of work to proof of stake… and the third-party market is deluged with used cards. If I shopped carefully, I could probably pick up a 3080 right now for under $500, which is a significant change in the market.

There is also the problem that a lot of the features being added to these new RTX cards are not actually used by the bulk of games. Ray tracing has yet to really take the world by storm, and during the 2000 series Nvidia banked on gamers favoring higher-resolution gaming over higher-framerate gaming. In February of 2021, Steam passed 50,000 games listed on the platform and available for sale. There is a curated list of all of the games that feature “RTX On” support, and currently that list contains only 132 games. While Nvidia keeps pioneering new AI features on their cards… it is highly unlikely that we are going to see the benefit of them anytime soon. Sure, I love the AI ability to knock out background noise on my microphone or clip out the background when I am on a video call, but I am not running any heavy processing routines on my card. Instead, I am still spending most of my time running games at 1080p or 1440p, at which point I favor framerate over raw rendering detail.

Don’t get me wrong… I think a lot of the things demonstrated in the keynote were extremely cool. However, I also think that most of those things don’t really factor into my usage pattern for the cards. Nvidia has gone hard on AI research and simulations, and the vast majority of their presentation was focused on that market. Gamers are no longer the key demographic that they are chasing as a company, and likely have not been for a very long time. So my advice would be that unless you are one of those folks who just have to have the newest and shiniest thing… maybe you should skip this generation of graphics cards entirely. The price point is tied to an artificial anchor of demand that is not going to hold up in the long run. That price is anchored to the eBay highs of the pandemic and a desire to squeeze more profit from the consumer as a result.

You can snap up some pretty reasonable deals on the aftermarket right now on 3000 series cards, and that is honestly more card than is needed to get you through to the next major graphical update. If you follow the trends, 4k gaming has not really taken off as anyone had hoped either. As I said before, gamers tend to favor running games at a lower resolution but at 144hz or higher frame rates. Right now mining cards are flooding the market because it is no longer profitable in the least to mine on a graphics card setup. There have been numerous videos covering the fact that, so long as the cooler is still functioning properly, it is perfectly fine to buy a used mining graphics card for gaming.

The 2000 series was the last time that gamers largely gave a generation a hard pass, and it was not necessarily for the same reasons. While there was a much larger jump in price point, it was more a case that the performance increase was not all that significant over the 1000 series. The 4000 series, on the other hand, seems to be a pretty massive leap in performance over the 3000 series… but it isn’t performance that we really need yet. The price of 4k high-refresh gaming is still pretty steep, with monitors largely still in the $700 range. Whereas you can pick up a 1440p panel for around $200, and at the most popular size of around 27 inches… there isn’t much noticeable difference between the two. You really need to get up into the 40-60 inch display range before 4k has a clear advantage over 1440p.

Basically, I think the 4000 series is really cool, but way too costly for what it is giving us. Get a cheap/used 3000 series card, call it good, and wait this generation out.

Update – 9/21 4 pm

When I made my post this morning I did not have all of the information, or at least I took some things for granted. If you have two cards that are both 4080s… and announce them as the 16GB version and the 12GB version, I go into that assuming memory is the key difference. They are apparently completely different cards, and today there has been a lot of speculation that the 4080 12GB was originally intended to be announced as the 4070. Why this matters is that the 4080 12GB is essentially a worse card than the existing 3080 cards. While the boost clock is higher, the RTX 4080 12GB has only 7680 CUDA cores, whereas the existing 3080 series has 8960. That is a difference of 1280 CUDA cores, which seems at least on paper to be a significant loss in horsepower compared to the current generation. I am not sure if the clock and memory speed differences make up for it, but it does not look great.

That also means that the true generational price comparison is not that 12GB thing being called a 4080, but instead the 16GB model with the much higher CUDA core count of 9728. That also means that the price difference between a 3080 and a 4080 is an increase of over 70%, as opposed to the 30% mentioned earlier. This honestly just keeps looking like a worse deal, and I again stand by my statement that you really should be looking at getting a 3000 series card while they are dropping in price with the incoming wave of new cards, instead of looking at the 4000 series.
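To put numbers on that revised comparison, here is a quick sketch, again assuming the 3080's widely reported $699 launch MSRP and using the CUDA core counts quoted above:

```python
# Revised comparison: 3080 vs the 4080 16GB (the model whose core count
# actually represents a generational step up), not the 12GB model.
msrp_3080, msrp_4080_16 = 699, 1199
cores_3080, cores_4080_16 = 8960, 9728  # 3080 12GB variant vs 4080 16GB

price_jump = (msrp_4080_16 - msrp_3080) / msrp_3080 * 100
core_gain = (cores_4080_16 - cores_3080) / cores_3080 * 100

print(f"Price increase: {price_jump:.1f}%")     # 71.5%
print(f"CUDA core increase: {core_gain:.1f}%")  # 8.6%
```

So at those MSRPs you are paying roughly 71.5% more for roughly 8.6% more CUDA cores, before accounting for clock and architecture differences.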

4 thoughts on “Maybe Skip This Generation”

  1. I have weird rubberbanding issues with my setup that my wife does not have (newer rig), which I can’t pin down, but suspect are due to my aging 1070.

  2. This generation feels like Nvidia is absolutely hoping to dump the 30 series stock by making it look like a steal in comparison, but at the same time, some of the stuff is interesting. For gaming, I have little interest in upgrading from my 3080 (not to mention that custom watercooling means an upgrade is a mega-hassle) but for some other stuff, the new features of the 40 series are compelling. If the mega-increase in RT performance works through the Optix renderer in Blender, for example, it would be a good upgrade for my 3D rendering.

    But the pitch they made yesterday is pretty unappealing for pretty much everyone it feels like, and the stripped down specs of the “4080” 12 GB is such a dumb thing that Nvidia loves doing (they had a barebones GT 1030 card that had two different types of RAM depending on which model you bought, and one was WAY slower but there was little or no marking on the package to tell you which one you even got). My hunch is that they are going to flush the 30 series stock by making it look like a steal in comparison and then finally when we reach mid-gen refresh status in a year and change, they’ll launch high-margin (to Nvidia at least) Ti cards and slash the prices on yesterday’s announced cards.

    • I was still on the fence about grabbing this generation even after the 4080 12GB aka 4070 cynical cash-grab of a name change, but the info I’ve seen today on the cable woes stemming from the ATX 3.0 jump is concerning: connectors rated for only ~30 connect/disconnect cycles, after which melting, if not outright fire, becomes an issue… eesh.

      Still very much a watching brief at this point. I was lucky enough to get an RTX 3080 and ‘only’ pay RRP for it on launch day of the cards, so realistically I have no need to jump on this gen. Every 2nd gen (or even 3rd, on occasion) is my usual approach anyway…

      …But still…

      I want it. 😉

  3. The xx80 series have always been way more than I’ve been willing to spend, and even the xx70s were pushing it, so I’m much more interested in the 4060, and how it compares to the AMD performance equivalent. My RX470 has done just fine over the past 5 years, but it really is getting on a bit now, and doesn’t do great on my new 1440p 27″ monitor so it might well be time to upgrade. Given I spent £190 on that card, I feel like I got a great deal 😀

Comments are closed.