In keeping with my blog’s name (Total Rip-Off), I am yet again plagiarizing myself and reposting a forum message that I wrote on (that’s actually a pretty good site for finding Canadian deals).

So without further ado, my thoughts on NVIDIA’s launch last Friday of the GTX 480 and 470 graphics cards, from the perspectives of performance, technology, and business.

NVIDIA’s GTX 480/470 vs. AMD/ATI’s options

I think the general consensus amongst review sites is that the GTX 480/470 series is a disappointment. The GTX 470 is better positioned as a product (due to its lower price), I think, than the GTX 480. However, they both produce a LOT of heat and noise, and they draw too much power. Even though they’re meant for high-end enthusiast use, they’re going to require careful case, airflow, and power supply selection – especially if you want to run two cards together in SLI.

More importantly, however, I think that the performance increase of the GTX 480 (~10-15%+ in typical gaming benchmarks) over the Radeon HD 5870 is a lot less than what NVIDIA needed it to be. Right now, you can get the Radeon HD 5870 for as low as $435 CDN from NCIX, while the MSRP of the GTX 480 is supposed to be $499 USD. If the GTX 480 sells at its MSRP, I think it’s a reasonable option vs. the HD 5870 if you want to pay for the extra performance. However, it seems initial availability of the GTX 480 is going to be very low, which I think will lead to higher initial prices. Furthermore, AMD/ATI has room to cut their prices quite a bit – the HD 5870 launched 6 months ago with an MSRP of $379 USD – while I think NVIDIA is backed against a wall (price-wise), for reasons I will outline next…

The GTX 480 and 470 are HUGE chips and very expensive to fabricate. They’ve got almost 50% more transistors than the HD 5870, and since both are fabbed at TSMC @ 40nm, I estimate that translates directly into about 50% more chip area. So, already, the GTX 480 and 470 could be expected to cost perhaps 50% more to make. Couple that with the fact that NVIDIA probably has more serious yield problems than AMD/ATI, which has been fabbing at 40nm through TSMC since the Radeon HD 4770 launched, not to mention the last 6 months of producing the HD 58xx series. And add to THAT the fact that a 50% larger chip can drive your yield percentage down even further: you get fewer chips per wafer, so each defect ruins a larger fraction of them (assuming a constant number of defects per wafer).
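To make that chip-area argument concrete, here’s a back-of-the-envelope sketch in Python. The wafer cost, defect density, and die areas below are all illustrative assumptions on my part (not actual TSMC or NVIDIA figures); the yield formula is the standard Poisson zero-defect model, and the dies-per-wafer estimate is a crude area calculation with a simple edge-loss correction.

```python
import math

# Illustrative assumptions only -- not real foundry numbers.
WAFER_DIAMETER_MM = 300          # standard 300 mm wafer
DEFECT_DENSITY_PER_MM2 = 0.001   # assumed average defect density
WAFER_COST = 5000.0              # assumed cost per processed wafer ($)

def dies_per_wafer(die_area_mm2):
    """Crude estimate: wafer area over die area, minus an
    edge-loss correction for partial dies at the rim."""
    radius = WAFER_DIAMETER_MM / 2
    wafer_area = math.pi * radius ** 2
    edge_loss = math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def yield_rate(die_area_mm2):
    """Poisson model: probability a die lands with zero defects.
    Bigger dies collect more defects, so yield falls off
    exponentially with area."""
    return math.exp(-DEFECT_DENSITY_PER_MM2 * die_area_mm2)

def cost_per_good_die(die_area_mm2):
    good_dies = dies_per_wafer(die_area_mm2) * yield_rate(die_area_mm2)
    return WAFER_COST / good_dies

# Hypothetical dies: one baseline, one 50% larger in area.
small = cost_per_good_die(330)
large = cost_per_good_die(495)
print(f"cost ratio (large/small): {large / small:.2f}")
```

With these assumed numbers, the 50% larger die costs noticeably more than 50% extra per good chip, because the fewer-dies-per-wafer penalty and the yield penalty multiply together. That compounding is exactly why a huge die leaves so little room to cut prices.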

So, again, NVIDIA’s chips are huge and expensive to fabricate. That’s why I believe that while AMD/ATI has lots of room to cut prices, NVIDIA probably doesn’t.