NVIDIA's 3-way SLI: Can we finally play Crysis?
by Anand Lal Shimpi on December 17, 2007 3:00 PM EST - Posted in
- GPUs
Final Words
NVIDIA always does this. We got Quad SLI with the 7950 GX2, only to see it replaced shortly thereafter by G80, and now we're getting 3-way SLI with the 8800 GTX/Ultra, which we all know is on the way to being replaced by G92. Investing in a 3-way SLI setup today would be a terrible idea: you're buying into old technology, and you're buying it after it's already been made obsolete by a new GPU. It's only a matter of time before G92 makes its way up the food chain, and three of those bad boys with even more shader power should give us a cooler-running, faster 3-way SLI setup than what we've tested here today.
The setup works, we didn't run into any software issues, and we can't deny that there are definite performance improvements in certain games. The problem is that 3-way SLI simply doesn't scale well enough, in enough titles, to justify the price.
We'd love to say that 3-way SLI is exactly what you need to play Crysis, because at least that way there'd be a configuration in existence that would run that game well, but we just can't. The game currently doesn't scale well at all from two to three cards.
And that's the fundamental problem with 3-way SLI: it's a better effort than Quad SLI was, but it's doomed from the start because it's built on old technology. We'd much rather have a couple of faster G92-based GPUs than three 1.5-year-old GPUs strung together in SLI.
Then there's the bigger issue with SLI and CrossFire in general: scaling is a little too dependent on software. Going from two cards to three increases the execution resources of a standard 2-card SLI setup by 50%, but the performance impact is nowhere near that. If you instead added 50% more SPs to those two 8800 Ultras, you'd see a much more tangible outcome. It's an extreme version of the way Intel builds quad-core CPUs, but instead of sticking two die on a single package, you have multiple die spread across multiple cards - that's hardly efficient. GPU architectures have changed dramatically over the past few years, yet we're still left with the same old multi-GPU technology. It's time for a change.
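The scaling argument above can be sketched with a few lines of arithmetic. The frame rates below are purely hypothetical placeholders, not our benchmark numbers; the point is only to show how "efficiency" shrinks as cards are added:

```python
# Illustrative sketch of why a third GPU disappoints: each extra card adds
# a full card's worth of raw hardware, but the observed frame rate
# (hypothetical numbers here, not measured data) gains far less.

def scaling_efficiency(fps_one_card: float, fps_n_cards: float, n: int) -> float:
    """Fraction of the theoretical n-times speedup actually realized."""
    return (fps_n_cards / fps_one_card) / n

# Hypothetical frame rates for a demanding game on 1, 2, and 3 cards:
fps_1, fps_2, fps_3 = 30.0, 50.0, 58.0

print(f"2-way efficiency: {scaling_efficiency(fps_1, fps_2, 2):.0%}")
print(f"3-way efficiency: {scaling_efficiency(fps_1, fps_3, 3):.0%}")

# Note: the step from 2-way to 3-way adds 50% more execution resources,
# yet in this sketch it lifts the frame rate by only 16% (58 vs. 50 fps).
```

With these made-up numbers, two cards realize about 83% of their theoretical speedup while three realize only about 64% - the pattern, if not the exact figures, matches what we saw in testing.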
48 Comments
BigMoosey74 - Tuesday, December 18, 2007 - link
Thanks for calling it how it is. The final comments are so true it isn't even funny. All of the fanboys need to come to their senses... this is a really inefficient technology, both CrossFire and SLI. The theoretical gains vs. the actual gains highlight a serious problem with this design. Think of the physics behind it: no matter what process you are embarking upon, whenever you split something into more pieces you lose efficiency. Yeah, so having two cards boosts performance a little... but nothing groundbreaking. Having two GPUs running should give 2x performance gains, hands down, no exceptions. The 3rd card is dead weight for some games... how the heck could a company stand behind that as a successful solution? Don't p*ss down my back and tell me it is raining.
I 100% agree with the author... we need something new that is actually worth it. This "let's add more cards" solution is a junk marketing scheme. ATI/NVIDIA need to work on something huge rather than waste time with CrossFire and SLI. GPU technology needs the kind of change the CPUs saw with quad core... more performance, higher efficiency... not a slap in the face with 30% more performance at 3x the power consumption and 4x the $$$.
solgae1784 - Tuesday, December 18, 2007 - link
Hothardware is saying Crysis has a bug with multi-GPU SLI, so they're expecting a patch to enable the feature. That could explain why there's no gain going 3-way SLI.
Zak - Tuesday, December 18, 2007 - link
No way I'm spending >$1,500 on three video cards just to play a game. All I want is one good $500-600 card that can play Crysis and other newer games at 1920x1200.
A.
Zak - Tuesday, December 18, 2007 - link
And I feel like Pirks above. I already have a Mac for everyday use; I switched from Windows after trying out the Vista trainwreck (I used to be a Mac user years ago). I built a $1,500 PC two months ago just to play games, and I find that despite having the second-fastest video card (8800 GTX) I can't play a lot of the latest games well at 1920x1200. And what's NVIDIA's solution? $1,500+ 3-way SLI. You know what they can do with that?! I'm seriously considering dumping the PC altogether and getting an Xbox too. PC gaming is going downhill quickly; it's getting way too expensive and too frustrating. At this rate it's just a matter of time before all the good games come out for consoles only. I walked into a GameStop the other day: "I'm sorry, but we don't carry PC games any more at this location." Uh? It's like being a Mac gamer all over again!
cmdrdredd - Tuesday, December 18, 2007 - link
I feel you on the PC gaming thing. I myself have an Xbox 360 and play it much more than I do my PC. I find it pathetic that the best this super-expensive top-end system can do with Crysis maxed out is 43 fps.
Zefram0911 - Tuesday, December 18, 2007 - link
Is my RAID array from my EVGA 680i going to be messed up if I upgrade to the EVGA 780i board? I'm doing the "step-up" that EVGA offered for the 680i's.
madgonad - Tuesday, December 18, 2007 - link
Just leaving that thing plugged in will cost over $400 in electricity alone. And that is only factoring in the computer at idle.
Pirks - Tuesday, December 18, 2007 - link
Okay, so you guys probably read already that in North America Crysis only sold about 33,000 copies, which is a total sales flop. Does it feel like high-end video cards are finally moving into a very narrow niche, with people moving to the Xbox 360? First the Orange Box from Valve, then Gears of War from Epic, then Crysis, and The Darkness (no PC version and not even talk of making one), and Lost Planet, etc. etc... I had a very bad feeling about Crysis; too bad this feeling was not unfounded. And, well, right now you can get an Xbox 360 for $250 (yeah, with coupons and if you're lucky, but... still...), so I don't know, guys, I see mass consumers just shying away from high-end 3D cards more and more ($250 for an Xbox 360 and a cheap 720p HDTV, or $500 for a high-end NVIDIA card? hmmm... now even _I_ start to think about it). I've heard numerous complaints from my gaming buddies that a lot of PC ports of console games are not... er... very high quality (for instance in Gears of War the Hammer of Dawn is a joke compared to the Xbox version, no Collector's Edition for PC, etc. etc. - many things in console ports look like shit on PC, and don't even get me started on the Halo 2 PC port, puukeee :bleeeeaaaah: [vomiting violently])
I don't know about you guys, but I see the Mac and Xbox 360 coming, marching forward; they are just simpler and more for the dumb people, so we enthusiasts are going extinct slowly but surely. Newegg now sells electronics, cameras, kitchen stuff and bread machines (I'm gonna buy one for my wife there BTW, and maybe an air conditioner too). I'm not even sure about the Mac anymore - maybe I'll get myself a Mini and leave the PC for occasional gaming, IF a decent game on the PC comes out. Time to try that Mac/Xbox 360 combo my buddies keep drooling about.
No, I'm not trolling, just talking about my own personal observations. You're more than welcome to criticize and downmod me, guys, I'd love to be wrong on that, actually
andrew007 - Tuesday, December 18, 2007 - link
You're not the only one. I'm also thinking of getting that rumored ultraportable Mac early next year, and I already have the consoles - and I am very happy with console FPSes on the Xbox 360. When it comes to games, recently the only good exclusive games for PC are RTSes (and MMORPGs), which aren't really my cup of tea. So no, you are not the only one, and I for one am glad about the low sales of Crysis. Personally, to me it was the most disappointing game of the year. Anyway, while I'm sad the era of plentiful, deep games is gone (where are the space sims? deep RPGs? great adventures like Gabriel Knight? ANY game with good length AND depth?) - it is what it is. It's probably no coincidence that there were many independent studios at that time, now all gobbled up by large conglomerates. When I turn on a console, I am certainly not missing the game freezes and crashes - such as ones due to the factory-overclocked (!) memory I'm getting on my 8800 GT (I've had a similar issue with 3 consecutive video cards in 2 years now) - copy protection schemes (had to change my DVD drive for one game!), occasional mandatory beta video drivers, and just general fuss and instability. Sure, my overclocked Q6600 is insanely fast if I run photo editing or video encoding or even web browsing, and the games do look great in full resolution - but that's only when everything fully works, which is not as often as I'd like. I'm getting too old for this. The only bad thing is that console makers are now adopting PC ways - patches, game freezes, controller dropouts, overheating, noise...
Pirks - Tuesday, December 18, 2007 - link
Yeah, I think I'm getting older too - I just want the damn computer to work, and it looks like I'm gonna get a Mac, 'cause I know a self-made PC is waaay cheaper, but somehow I'd pay for peace of mind, service etc. - just buy a Mac with a 3-year warranty; you pay a lot, but I heard they generally work okay, so... and I'm still gonna use the PC as a second machine - it would be fun to compare them and see what each is good for. And as for games, you're right, big EA-like publishers are killing the inventive original games of the past (American McGee's Alice, Medal of Honor, MDK, Dune 2, UFO, Descent, and many many others), but if publishers are going to pour money into the console market - I'd better get a console. Games are not going to be great, I agree, but at least they will be CHEAP, compared with the NVIDIA $500-a-year "tax". Anyway, I'll keep the PC around and maybe even upgrade it if a decent PC game comes up. Actually I'm waiting for Fallout 3; that may be a good reason to upgrade, who knows.