
Nvidia: The Cards, The Hype, and the Cost-Effectiveness.

I spent part of the winter shopping around for a good computer setup that could handle the withering demands of Microsoft PowerPoint and Word plus the occasional dabbling in AI business. Wisdom gleaned from the darkest corners of Al Gore’s internet suggested that Nvidia’s cards do indeed have some advantages over their AMD or Intel counterparts. But it’s a peculiar market, so I figured I’d dig into some numbers to try and make sense of it.

An Nvidia RTX 4090 GPU, in all its precision dressing. Photo by ZMASLO, © CC BY 3.0.

Market: Symmetrical Enough Information, Supply-Demand Mismatch

To the uninitiated, we live in the age of the GPU, or Graphics Processing Unit. This is not just about teenagers getting their wealthy parents to buy them sick gaming rigs that light up in 17 million colors through addressable RGB power cables and fans that color-coordinate to software algorithms. It’s also, of course, a part of the hype around cryptocurrency and artificial intelligence. Both involve huge volumes of calculation-intensive, parallel computing work that a regular old CPU cannot handle nearly as effectively as a GPU can. While the GPU was designed primarily for graphics, à la the sick-ass gaming rig, it is now suitable for a variety of applications well beyond that.

This has led to a market that is completely dominated by one company: Nvidia. The company has been making chips for decades, but it is only in the cryptocurrency and AI boom that it has become a household name, and a company with a multi-trillion-dollar market cap. Fanboys (and it is definitely a sausagefest) argue that it’s fine for one company to effectively have a monopoly (as many as 9 in 10 discrete GPUs sold are Nvidia’s) because Nvidia doesn’t actually manufacture the chips; it contracts with someone else to make them, which means it’s, you know, theoretically possible for other participants to be involved in the market. Or something. I don’t know. I was always, as a good capitalist, raised to believe that competition is good.

AMD’s chips are thought to be good, but popular discourse suggests that they’re better at rasterization than at ray tracing. For the average user, this doesn’t matter much. Still, AMD cannot keep up and ends up with 10% or less of the market. Intel, meanwhile, which has a huge share of the CPU market, has a new line of GPUs that are supposed to be the bomb dot com. But no one can seem to find them anywhere, since they were just announced, like, last month. So we have supply issues for those, the same way we have supply issues for Nvidia’s top-of-the-line chips, or, you know, other chips.

Why Does Economic Competition Matter?

Tech companies are bad, generally. They lie to the public and they manipulate information. They also enable the manipulation of information through things like AI-generated content. I’m not writing here to say that Jensen Huang has blood on his hands for the way disinformation was wielded to basically rig the past few elections; that would be reductive, and I don’t have the ability to support such a bold statement. But I do believe that the interests of the working class have very little in common with those of multi-trillion-dollar corporations.

There’s also a less Marxist, more pro-capitalist argument to be made here: the trillions allocated to Nvidia via market capitalization could be better allocated elsewhere. As a company’s size and market share grow, its capacity to innovate rapidly diminishes. After all, how many recent innovations have come out of the biggest companies, compared with much smaller startups? Add to that diminishing marginal returns, plus the fact that so much effort is spent on continuing to aggrandize wealth in a society that can’t be bothered to, you know, regulate giant companies or tax the wealthy.

I could go on forever, baby! But it’s not about this; it’s really about the fact that putting all of your eggs in one basket is never a great idea. Everyone who told me to buy Nvidia when it was already well over $100 a share was probably flummoxed today, as Wall Street analysts were, when the stock lost 6.22% of its value after a positive announcement at CES. I guess this is what happens when a stock is trading at around 30x price-to-sales: any hiccup can turn into a major correction. In other words, you may think the stock is fairly valued, but it’s priced as though it’s going to keep earning at its current rate of profit for something like the next 45 years. Even if it doubles its profit in the next couple of years (thoroughly unlikely), it’s just as unlikely to double its market capitalization accordingly.
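For the curious, here’s a minimal back-of-the-envelope sketch of that valuation math in Python. The price-to-sales multiple is the one cited above; the net margin is purely an assumed placeholder, so swap in your own estimate. With a margin anywhere in that neighborhood, you land in the same multi-decade ballpark as the 45-year figure above.

```python
# Back-of-the-envelope valuation sketch. The P/S multiple is the one cited
# above; the net margin is an assumed placeholder, not a reported figure.
price_to_sales = 30.0   # price-to-sales multiple from the post
net_margin = 0.55       # assumed net profit margin

# P/S divided by net margin gives an implied price-to-earnings multiple,
# i.e. roughly how many years of *current* profit the market cap represents.
implied_pe = price_to_sales / net_margin
print(f"Implied P/E: ~{implied_pe:.0f}x (about {implied_pe:.0f} years of current profit)")

# Doubling profit only halves that figure if the share price stays put.
print(f"If profit doubled: ~{implied_pe / 2:.0f} years")
```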

In the era of miniaturization, personal gaming computers have become more elaborate and elegant, typically built from several components that can be color-coordinated with addressable RGB lights and other effects. (Midjourney AI).

Which GPU Do You Need, Then?

Back to the point here. Most people can get by with a far lower-powered unit than they think. Gamers are typically obsessed with maximizing frame rates, for example, because higher frame rates make gameplay look smoother. But if you look at the Tom’s Guide list, you can get pretty respectable frame rates out of “older” tech! Of course, frame rate is but one metric, and it butts up against other factors, ranging from network latency to screen refresh rate. Virtual reality is extraordinarily demanding, technologically. Instead of chasing whatever All The Cool Kids Were Talking About, I decided to make a purchasing decision based on a balance between what I could afford and maximizing performance per dollar spent.

The great thing about the fact that every nerd loves talking about their GPU is that we have bazillions of benchmark results available to us from websites like GamersNexus or PassMark. These allow relatively consistent comparisons. I divided each card’s benchmark score by its cost and then compared different models.

AMD Radeon models ($699-799): points per dollar in the 30s to 40s.

Intel models: Arc A770 (13,075; 50 pts/$).

Nvidia models: 3060 (16,959; 58 pts/$); 4060 (19,851; 58 pts/$); 3080 Ti & 4070 (27,094 & 26,967; 49 pts/$); 4070 Super (30,164; 48 pts/$).

By the time I got to the 4080, 4090, and even some of the higher-end 30-series models like the 3090, the number of points per dollar went from the 40s or 50s into the 30s or lower (17 points per dollar for the 4090 if you’re paying $2k for it). Translation? Not worth the price. Not remotely.
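If you want to run the same comparison yourself, here’s a minimal sketch of the calculation. The benchmark scores are the ones quoted above; the prices are rough street-price assumptions backed out of those ratios, so substitute whatever retailers are actually charging when you read this.

```python
# Points-per-dollar comparison. Scores are the ones quoted in this post;
# prices are rough street-price assumptions -- plug in current prices.
gpus = {
    # name: (benchmark score, assumed price in USD)
    "Arc A770":       (13_075, 260),
    "RTX 3060":       (16_959, 290),
    "RTX 4060":       (19_851, 340),
    "RTX 4070":       (26_967, 550),
    "RTX 4070 Super": (30_164, 630),
}

# Sort from best to worst value (points per dollar).
for name, (score, price) in sorted(
    gpus.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True
):
    print(f"{name:<16} {score:>7,} pts  ${price:>4}  {score / price:5.1f} pts/$")
```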

Another thing missing from the points-per-dollar calculation? The higher-powered units draw insanely more power. The physics behind this are actually quite interesting! But suffice it to say that with current microprocessor architecture, there are steeply diminishing returns from simply running more power through a unit, whether by overclocking or by packing more cores into roughly the same architecture. The 4090, for example, is rated for 450W TDP (Thermal Design Power), which is more than twice the 4070 Super’s maximum rating. But the 4090 only cranks out about 25% more performance than the 4070 Super. Imagine that you have a 1,200-watt power supply plus two monitors: that’s perilously close to the continuous capacity of a standard residential outlet, so this quickly becomes impractical.
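To put rough numbers on that efficiency gap, here’s a quick sketch comparing performance per watt. The 4070 Super’s commonly cited 220W TDP and an approximate 4090 benchmark score are assumptions here; the rest comes from the figures above.

```python
# Rough performance-per-watt comparison. TDP is the rated board power;
# the 4090 benchmark score is an assumed ballpark, not a measured result.
cards = {
    "RTX 4070 Super": {"score": 30_164, "tdp_w": 220},
    "RTX 4090":       {"score": 38_000, "tdp_w": 450},
}

for name, c in cards.items():
    print(f"{name:<16} {c['score'] / c['tdp_w']:.0f} pts/W")

perf_gain = cards["RTX 4090"]["score"] / cards["RTX 4070 Super"]["score"] - 1
power_gain = cards["RTX 4090"]["tdp_w"] / cards["RTX 4070 Super"]["tdp_w"] - 1
print(f"~{perf_gain:.0%} more performance for ~{power_gain:.0%} more rated power")
```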


Conclusions

There are niche use cases where one might need the highest-priced unit, I suppose. But most consumers certainly do not. And if we’re thinking about the possibility of parallel processing, which may involve not just one GPU but multiple GPUs working on a set of instructions, a card like the 4090 starts to look like a poor value: you can get more computing power out of two cheaper chips drawing roughly the same total power as one less efficient flagship. Choosing the cheaper option saves on power draw as well as thousands of dollars up front.

Of course, I still have to contend with oodles of Nvidia fanboys who think that These Chips Will Change The World. They might be right! It’s just unclear exactly how they will change the world. More AI-generated personas on social media that no one asked for? Disinformation from the fascist regime? We can only hope it’s, you know, maybe something that will actually be useful for society.

Nvidia controls about 88% of the GPU market, which is great for Nvidia. It’s not necessarily great for consumers in the long run, but that doesn’t mean they don’t make some great products that are also cost-effective for consumer use. (Midjourney AI).

Disclosure: the author does not own any stock in Nvidia, nor does he have any intention of opening long or short positions in the stock in the coming days.
