NVIDIA's 3-way SLI: Can we finally play Crysis?
by Anand Lal Shimpi on December 17, 2007 3:00 PM EST, posted in GPUs
Power Consumption
This was probably the most fun we had in testing for this review: measuring the power consumption of the 3-way SLI setup. We plugged the entire system into our Extech power meter and bet on how much power it'd use. Let's just say that NVIDIA isn't too far off with its minimum PSU requirements; see for yourself.
At idle, our 3-way SLI testbed drew around 400W of power. To put that into perspective, this is more power than any of our normal CPU or GPU testbeds draws under full load...just sitting at the Windows desktop. The third 8800 Ultra manages to pull nearly 90W extra without trying. Now let's see how much power this thing needs when playing a game.
We started with Crysis, which is normally our most stressful game test. Normally doesn't apply here, though: Crysis didn't scale well from two to three GPUs, with the third graphics card improving performance by only a few percentage points at our most playable settings, so power consumption won't hit its peak.
Averaging 660W, the 3-way SLI system is now using more than twice the power of a normal gaming system outfitted with an 8800 GT. But it gets better.
Bioshock gave us the best scaling we saw out of the lot, so the third GPU is working its hardest in this test, meaning we should be able to get our Extech to produce some stunningly high numbers.
And stunningly high we got. The 3-way SLI system averaged 730W in Bioshock, and get this, we even saw the machine pull over 800W from the wall outlet.
| Configuration | Idle Power | Bioshock Load | Crysis Load |
|---------------|------------|---------------|-------------|
| 8800 Ultra x1 | 217W       | 329W          | 337W        |
| 8800 Ultra x2 | 300W       | 475W          | 520W        |
| 8800 Ultra x3 | 388W       | 730W          | 660W        |
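As a quick back-of-the-envelope check on the table above, the measurements can be reduced to the extra wall power each additional card costs in each workload. A minimal sketch (the variable and function names are illustrative, not from the article):

```python
# Whole-system wall-outlet power (watts) from the Extech meter, as measured
# above, keyed by the number of 8800 Ultras installed.
measurements = {
    1: {"idle": 217, "bioshock": 329, "crysis": 337},
    2: {"idle": 300, "bioshock": 475, "crysis": 520},
    3: {"idle": 388, "bioshock": 730, "crysis": 660},
}

def deltas(workload):
    """Watts added by the second and third card for a given workload."""
    return [measurements[n][workload] - measurements[n - 1][workload]
            for n in (2, 3)]

print(deltas("idle"))      # -> [83, 88]
print(deltas("bioshock"))  # -> [146, 255]
print(deltas("crysis"))    # -> [183, 140]
```

The Bioshock numbers show why it produces the scariest readings: the third card adds 255W there, versus only 140W in Crysis, where it sits mostly idle.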
I was talking to Matthew Witheiler, our first dedicated graphics editor here at AnandTech, and I told him how much power the system used under load and at idle. His response? "JESUS". "No," I said, "not even Jesus needs this much power."
48 Comments
kilkennycat - Tuesday, December 18, 2007 - link
...it's far more likely to be used by a (nV) video card functioning as a GPGPU for either gaming --- or in the short-term --- professional desktop applications. nV is making great strides in the professional scientific number-crunching and signal-processing communities with their CUDA toolset running on their current GPU offerings. They currently own ~ 86% of the "workstation graphics" market, but in a rapidly-increasing number of cases, graphics is not the sole function of the current nV workstation hardware. Wait for nVidia's next generation silicon and driver software which will be far more focussed on seamlessly merging GPU and GPGPU functionality. Also, wait for their true next-gen motherboard chip-set and not the cobbled-together "780i" which will implement symmetrical PCIe2.0 on all 3 PCIe x16 slots. Arriving about the same time as their next gen GPU family. Mid-2008 would be my guess.

aguilpa1 - Tuesday, December 18, 2007 - link
Funny how your review doesn't address this blatant issue. Yes, it will run tri-SLI, but don't expect it to do so with the same Yorkfield used on the test board they used. Engineering samples of the QX9650 ran fine on the 680i SLIs but were changed with the retail versions. Whether it was Intel's pissy way of getting back at NVIDIA for not licensing SLI to them, or NVIDIA's way of making a buck off of selling an almost-already-obsolete board (Nehalems are coming next year). At this stage...who cares.

ilovemaja - Tuesday, December 18, 2007 - link
That quote ("His response? 'JESUS'. 'No', I said, 'not even Jesus needs this much power'") is one of the funniest things I've heard in my life.
Thanks for another good article, you are the best.
acejj26 - Tuesday, December 18, 2007 - link
In Crysis, you say that the third card offers a 7% performance boost over the 2-card configuration; however, it is only offering 1 fps more, which is just about 2%. Those numbers should be changed.

Sunrise089 - Tuesday, December 18, 2007 - link
Not complaining, but I've noticed the last several GPU articles have been written by Anand, which isn't his normal gig. On top of that, we get a reference to another GPU editor from back in the day. What's up?

compy386 - Tuesday, December 18, 2007 - link
It'd be interesting to do a comparison between SLI and Crossfire once AMD gets some drivers out that actually support quad SLI. I saw a board on Newegg that looks like it'd fit 3 3870s as well.

AcydRaine - Tuesday, December 18, 2007 - link
AMD doesn't support "Quad-SLI" at all. There are a few boards on Newegg that will fit 4 3870s, not just 3.

compy386 - Tuesday, December 18, 2007 - link
The 3870s take up 2 slots, so I only see boards that fit 3. Most of the boards will take 4 3850s, though. Again, I'd like to see the performance number comparisons for scaling purposes.

SoBizarre - Tuesday, December 18, 2007 - link
Well, I'm glad to see this evaluation of 3-way SLI. It just gave me an idea about overcoming performance issues in games like Crysis. There is no need for building ridiculously expensive machines that draw insane amounts of power. I have a better solution (although it won't work for all of you): I'm just not going to buy a game which I can't play in its full glory on a decent system, at a mainstream resolution (1680x1050). I don't expect the latest and greatest, "show off" kind of game to be playable at 2560x1600 with highest settings, full AA and AF. Not on a system with a Q6600 and a single 8800GT. But if you can't do it on a system like the one used by Anand here? Well, then it's becoming ridiculous.
I'm trying to imagine a proud owner of a machine with a QX9650 @ 3.33GHz, 3 (that's THREE) 8800 Ultras, and a shiny 30-inch monitor, not being able to play a game he just bought. What would be his thoughts about the developer of that game? I guess not pretty ones…