NVIDIA GeForce 7800 GT: Rounding Out The High End
by Derek Wilson & Josh Venning on August 11, 2005 12:15 PM EST - Posted in GPUs
Power Consumption
With fewer pipelines and lower clock speeds, the 7800 GT should require less power to operate. While that may not matter much for people with high-end desktop systems, there are many areas that would benefit from a cooler-running GPU. Small form factor cases and laptops are two such areas, so let's see how the 7800 GT compares to the other cards.

Both the Idle and Load power graphs clearly show that the 7800 GT is a superior performer. This card is much less power hungry than the 6800 Ultra and performs better across the board. These characteristics lend themselves very well to applications like mobile computing: lower power consumption means longer battery life, less heat, and more performance for portable computing.
The current trend in notebooks is to give as big a slice of the TDP as possible to the graphics card. A low-power CPU is used in the system while terrific gaming performance is still available; the 2GHz and up Pentium M CPUs are quite capable of providing good gaming performance. If NVIDIA lowers the power needs on its mobile parts significantly, OEMs can afford to spend more power on the CPU, or use cheaper (and lighter) cooling solutions. This is good news for anyone in the market for any type of mobile computer. Here's hoping the GeForce Go 7 Series makes its debut sooner rather than later.
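The TDP trade-off described above can be sketched numerically. All of the figures below are hypothetical, chosen only to illustrate the mechanism; none come from the article:

```python
# Hypothetical numbers only: a sketch of the notebook TDP trade-off
# described above. A fixed thermal budget is split between the GPU and
# the CPU; every watt the GPU gives up is a watt the OEM can spend on a
# faster CPU or on a cheaper, lighter cooling solution.

def cpu_headroom(system_tdp_w: float, gpu_power_w: float) -> float:
    """Watts left for the CPU and cooling margin after the GPU's share."""
    return system_tdp_w - gpu_power_w

SYSTEM_TDP = 70.0  # hypothetical whole-notebook thermal budget, in watts

for gpu_w in (40.0, 30.0, 20.0):
    print(f"GPU at {gpu_w:.0f} W leaves "
          f"{cpu_headroom(SYSTEM_TDP, gpu_w):.0f} W for the CPU")
```

Under this toy budget, trimming 20 W from the GPU turns a 30 W CPU allowance into a 50 W one, which is exactly the headroom the article argues OEMs would reinvest.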
77 Comments
dwalton - Friday, August 12, 2005 - link
"I would like G70 technology on 90nm ASAP, I have a feeling Nvidia didn't do a shift to 90nm for NV40 for a reason, as that core is still based on AGP technology, and Nvidia currently doesn't have a native PCI-E part for 6800 Line, they are all using HSI on the GPU substrate from the NV45 design."

I believe Nvidia didn't want another 5800 fiasco. They probably determined a long time ago that 110nm was a safer bet and used the 6600 as a guinea pig. Having a successful launch of the 6600 gave them confidence that manufacturing a 110nm G70 would be a painless process.
Furthermore, the 7600 will be a midrange card and will target a market segment that is more than likely dominated by AGP boards. So an NV40-based 7600 would make perfect sense, since the majority of 7600s sold wouldn't require an HSI chip.
"Let's face it, for the time being we're not going to be getting fully functional high-end cores at the $199 US price point with a 256-bit memory interface; so far we have gotten things like the Radeon X800, GeForce 6800, 6800 LE, X800 SE, X800 GT, etc. It just doesn't seem profitable to do so."
The X800 GT is a 256-bit memory interface card and targets the 6600 GT segment.
coldpower27 - Friday, August 12, 2005 - link
"The X800 GT is a 256-bit memory interface card and targets the 6600 GT segment."

I guess you missed reading the "fully functional" part, as the X800 GT does not comply with this statement.
I guess I didn't get my meaning across: when I said G70 technology, I was talking about the mainstream cards going to 90nm, not the 7800 GTX/GT.
For a mid-range part, the risk of going to 90nm would be reduced, as the core is not quite as complex. Nvidia did make the safe bet in going to 110nm for their high-end cards; I am asking for a G7x-technology-based performance ($199 US) card on 90nm technology. Not on the high end.
Targeting PCI-E now would be a good idea, as there have been boards on both sides with PCI-E support for a decent amount of time, and it's the more forward-thinking architecture, not to mention the possibility of power consumption on the 7600 GT being reduced enough to put it solely on the PCI-E bus if the bridge chip didn't exist. There isn't much point in designing a native AGP chip now, unless you're talking about the value segment, where margins per card are extremely thin.
For the AGP users, I believe they can continue to use the 110nm NV48, but I would like PCI-E users to benefit from a 90nm PCI-E-native 7600 GT, with possible bridging to AGP if demand calls for it. There isn't much point in calling the mainstream card a 7600 GT if it's not based on G7x technology. We don't want Nvidia to follow ATI's lead on that kind of front. :)
neogodless - Thursday, August 11, 2005 - link
I mainly agree with you, and who knows, such things could be in the works. But "simple process shrink"? I get the feeling that's a contradiction!

Let us not forget the mistakes of the past... like the FX5800 and its "simple process shrink".
dwalton - Thursday, August 11, 2005 - link
The FX5800 was Nvidia's attempt to introduce a new high-end architecture on a new process (130nm) it had never used before, just like ATI is doing now. The 6xxx line (130nm) is not new tech, so producing the mature NV40 architecture on 90nm or 110nm should go a lot smoother. Even at 130nm, the NV40 is smaller than the G70 at 110nm (287mm² vs. 334mm²). Moving the NV40 to 90nm would reduce its die size to ~200mm². Look at the 6600GT: 110nm, 150mm², 8 pipes(?) vs. a 90nm NV40 at ~200mm² with 16 pipes.

JarredWalton - Friday, August 12, 2005 - link
Unless they can make a 90nm "7600GT" part backwards compatible (via SLI) with the 6800GT, NVIDIA is in a position of "damned if you do, damned if you don't." As a 6800GT owner, I'd be rather sad to suddenly have the promise of upgrading to SLI yanked away.

dwalton - Thursday, August 11, 2005 - link
Also, mature driver support at introduction.

Sunbird - Thursday, August 11, 2005 - link
I don't know. At least they aren't doing those ugly case reviews anymore, but they sure are still making me feel alienated.

That first page smacks of elitism. Why can't we average people with a 5900XT (or even a 5200) upgrade to, say, a 7600 that uses less power and is thus less noisy and easier to cool than a 6600 or 6800?
I wonder which of the 2 authors wrote that paragraph?
I guess this could be a symptom of Anand and his Apple usage, because Apple people are often very elitist. Or it could be that they want to be the upmarket tech website for people with lots of money, and think Tom's Hardware is better suited to us unwashed (FX 5200-wielding) masses.
Actually, this whole new colour scheme smacks of cold, suave elitism! Not the warm, yellowish, homey feel of old....
;(
Shinei - Thursday, August 11, 2005 - link
As has been pointed out in this very comments section, a 7600 release would be redundant because there already is a 16-pipe, 350MHz part with 6 vertex pipelines: the 6800GT. There is no elitism; it's the raw fact that a 7600GT would be identical to a 6800GT in specifications and (most likely) performance, rendering it pointless to spend time fabricating when the 6800GT serves just as well.

As for the article, I noticed that the 7800GT was outperformed by the 6800U in some SLI applications (like UT2004). Is that related to memory bandwidth, or is it a driver issue because of the 77.77 beta drivers you tested with?
Sunbird - Friday, August 12, 2005 - link
As has been pointed out in my very own comment, I want to "upgrade to, say, a 7600 that uses less power and is thus less noisy and easier to cool than a 6600 or 6800."

And anyway, it's about the price point: will the 6800GT cost as much as the 6600GT then?
I want 6800GT (aka 7600) performance at the $200 to $250 price point, and I want it now!
I'd settle for a 7200 that performs like a 6600 too.
DerekWilson - Friday, August 12, 2005 - link
Yes, the 6800 GT will come down to $250 and likely even lower over the next few months. You can already buy a 6800 GT for $270 (from our realtime price engine).

The 6800 GT is not a noisy part, and the HSF solution on the 7800 GT is strikingly similar. A lower-performance G70 part might run cooler and draw less power, but, again, the 6800 GT is not a power hog.
There really is no reason for us to want a lower-performing G70 part; prices on 6 Series cards are falling, and that is all we need. Even if NVIDIA came out with something like a "7200 that performs like a 6600", the 6600 would probably be cheaper, because people would think the 7 means more performance, making the 6600 the better buy.