When the Smoke Clears & Prices Settle: GeForce 9600 GT vs. Radeon HD 3870
by Anand Lal Shimpi on February 22, 2008 12:00 AM EST - Posted in GPUs
NVIDIA GeForce 9600 GT vs. ATI Radeon HD 3870
Across 9 games and 12 benchmarks, the Radeon HD 3870 and the GeForce 9600 GT trade blows. If you want specifics, the 3870 wins 7 benchmarks while the 9600 GT wins 5. However, when the 9600 GT wins it usually does so by a larger margin: an average of 24% across our benchmarks, compared to a 9.9% average margin of victory for the Radeon HD 3870.
The 9600 GT's average margin of victory is so large mainly because of two titles: Quake Wars and Call of Duty 4. The strange thing is that we've seen ATI GPUs do better in both games, only to see performance go down in the last driver update. Quake Wars was also recently updated to the 1.5 patch, so it's possible that the new patch slowed things down for the Radeon HD 3870, but we suspect that both of these performance outliers are driver related and can be remedied. If ATI could achieve performance parity in these two titles, the 9600 GT's average margin of victory would fall to 8.9%, very close to the 3870's current advantage in the benchmarks it wins.
However, we must base our recommendation on presently available data, and right now the GeForce 9600 GT looks like the better buy. It's cheaper than the Radeon HD 3870 and makes a stronger overall performance case thanks to its larger margins of victory in the games it wins.
If you look at the cheapest available Radeon HD 3870 ($184.99 from Newegg), the 9600 GT's price advantage all but disappears, and if you don't play Quake Wars or CoD4, the 3870 ends up being just as good an option as the 9600 GT.
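For reference, the averaging above works like this. The sketch below uses hypothetical per-game frame rates (not our measured results) purely to illustrate how bringing the two outlier titles to parity shrinks the 9600 GT's average margin of victory.

```python
# Minimal sketch of the margin-of-victory averaging described above.
# The frame rates below are hypothetical placeholders, not measured results.

def avg_win_margin(results):
    """Average percentage lead of card A over card B, counting only the games A wins."""
    margins = [(a / b - 1) * 100 for a, b in results if a > b]
    return sum(margins) / len(margins) if margins else 0.0

# (GeForce 9600 GT fps, Radeon HD 3870 fps) per benchmark -- placeholder values
wins_9600gt = [(62, 41), (75, 52), (48, 45), (55, 51), (60, 56)]

print(f"Average margin, all five wins:     {avg_win_margin(wins_9600gt):.1f}%")
# Treat the first two entries as the outlier titles reaching parity,
# and the average margin across the remaining wins shrinks considerably.
print(f"Average margin, outliers at parity: {avg_win_margin(wins_9600gt[2:]):.1f}%")
```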
44 Comments
dingetje - Saturday, February 23, 2008
I would be very interested to see how the 9600GT 512MB stacks up against the recently released 8800GS 384MB, especially regarding overclocking.
Zak - Friday, February 22, 2008
I wish I had put my 8800GTX on eBay a couple of weeks ago. Now I'm going away on vacation for 3 weeks. Perhaps when I come back the 9800GTX or 9900GTX will be out :) I wonder if it'll allow Crysis on Very High at 1920x1200. Maybe I'll give Vista a try again if SP1 comes out about the same time! Z.
Imnotrichey - Friday, February 22, 2008
In yesterday's article, it said the Crysis benchmarks were run at medium quality and got 41.5 FPS, and in this article it says it's at the high quality settings. Is it the medium or high quality settings?
I'm guessing they are both supposed to be medium, considering my 8800 GTS gets pretty choppy at high settings.
Thorsson - Friday, February 22, 2008
One of the most important pieces of info is totally missing. Which card overclocks better?
Aberforth - Friday, February 22, 2008
Too bad... in a few months we are going to see the most demanding games ever made, and then you can decide whether this is mid-range or low end. Future geometry shaders will prolly kill this card one day :D
BigLan - Friday, February 22, 2008
Any word on how stock 9600s overclock? Is it realistic to think you'll be able to match the speeds of the overclocked cards, and if so, is the maximum overclock on stock and factory-overclocked cards the same?
I really don't see the point in paying a premium for factory-overclocked cards; you can do it yourself for free and with very little risk (just overclock in small increments and you won't fry your card). Paying for a better cooler makes much more sense imo.
OrSin - Friday, February 22, 2008
Not sure what WoC is, but we get a few RTS games. We got 10+ FPS, and Oblivion and Bioshock might as well be FPS to me.
When is the industry going to try to do something else? It seems we have to wait for Blizzard to come up with SC for people to jump on some other bandwagon.
kilkennycat - Friday, February 22, 2008
Seems as if the forced price cuts on the 3850 and 3870 are likely to have reduced AMD's profit on the associated GPUs suddenly to zero. I cannot see their board partners being at all willing to swallow any part of that price cut. I suspect that AMD might have had to rebate a large part of the GPU costs to their board partners for boards already in the pipeline to have them go along with the drastic price cut.
nVidia has $1.8 billion in the bank and AMD is ~$3 billion in the red. nVidia can sell their GPUs to their board partners with a very low profit margin for a very long time. The forced price cuts by AMD to match the nVidia partner prices bleed away AMD/ATi's ability to recover the huge development costs on their silicon. No development-cost recovery - can't self-finance future development. With Intel on one side and nVidia on the other, both successfully squeezing AMD's profit margins on all of AMD's new products, time is not on AMD's side. I suspect that AMD's prospects for borrowing more money for product development are becoming very limited indeed with the current worldwide banking turmoil. (For a glimpse at the costs of modern GPU silicon development, the nVidia families of GPUs that power the 8xxx series took about $400 million to design and bring to production...)
DerekWilson - Friday, February 22, 2008
The 55nm process offers a size advantage over the 65nm process. We haven't confirmed die sizes... so... I might be wrong (I'm not in town and I can't check), but I think the RV670 is smaller than the G94.
Maybe Anand can do a quick check?
*poke
Wirmish - Friday, February 22, 2008
8800 (G80) -> 484 mm²
8800 (G92) -> 324 mm²
8600 (G84) -> 173 mm²
9600 (G92) -> 240 mm²
2900 (R600) -> 450 mm²
38x0 (RV670) -> 190 mm²
Do the maths.
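Doing the maths on the figures listed above (a throwaway snippet that uses only the commenter's numbers, which we haven't independently verified here), the RV670 works out to roughly a fifth smaller than the 240 mm² die listed for the 9600 GT, which is the comparison Derek asked about:

```python
# Die sizes (mm^2) as listed by the commenter above -- not independently verified.
die_mm2 = {
    "8800 (G80)": 484,
    "8800 (G92)": 324,
    "8600 (G84)": 173,
    "9600": 240,
    "2900 (R600)": 450,
    "38x0 (RV670)": 190,
}

rv670 = die_mm2["38x0 (RV670)"]
geforce_9600 = die_mm2["9600"]  # the 9600 GT die, 240 mm^2 in the list above
print(f"RV670 / 9600 GT die area: {rv670 / geforce_9600:.2f} "
      f"({(1 - rv670 / geforce_9600) * 100:.0f}% smaller)")
```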