GeForce 3D Vision: Stereoscopic 3D From NVIDIA
by Derek Wilson on January 8, 2009 2:30 PM EST - Posted in GPUs
As we've seen over the past few years, NVIDIA isn't content with simply doing what has already been done well. Certainly their graphics cards are good at what they do, and competition in the market today is strong, delivering real value to consumers. But they've forged ahead with initiatives like SLI for multi-GPU rendering and CUDA for general purpose programming on GPUs. Now they're taking it a step further and getting into stereoscopic 3D.
To be fair, NVIDIA has supported stereoscopic 3D for a long time, but this is more of a push to get pervasive stereoscopic graphics into the consumer space. Not only will NVIDIA graphics cards support stereoscopic rendering, they will also be enhancing their driver to extract depth information and create left and right eye images for applications that do not natively produce or support stereo rendering. And did we mention they'll also be selling active wireless shutter glasses?
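NVIDIA hasn't published the internals of this driver-level stereo path, but the basic idea behind generating a per-eye view is simple: offset the camera horizontally by half the eye separation in each direction and render the scene twice. The following is a minimal sketch of that idea; the types and function names are our own illustration, not NVIDIA's API.

```cpp
// Hypothetical sketch of per-eye camera setup for stereoscopic rendering.
// None of these names come from NVIDIA's driver; they only illustrate the idea.
#include <cstdio>

struct Camera {
    float position[3];   // world-space camera position
    float right[3];      // normalized world-space "right" axis of the camera
};

// Shift the camera along its right axis by +/- half the eye separation.
// separation is the interocular distance in world units (e.g. ~0.065 m).
Camera eyeCamera(const Camera& mono, float separation, bool leftEye) {
    Camera eye = mono;
    float offset = (leftEye ? -0.5f : 0.5f) * separation;
    for (int i = 0; i < 3; ++i)
        eye.position[i] += offset * mono.right[i];
    return eye;
}

int main() {
    Camera mono = { {0.0f, 1.7f, 5.0f}, {1.0f, 0.0f, 0.0f} };
    Camera left  = eyeCamera(mono, 0.065f, true);
    Camera right = eyeCamera(mono, 0.065f, false);
    // A frame-sequential driver would render the scene once with each camera
    // and present the two images on alternating refreshes of a 120Hz display.
    std::printf("left eye x = %.4f, right eye x = %.4f\n",
                left.position[0], right.position[0]);
    return 0;
}
```

In practice the driver also has to adjust the projection per eye and pick a sensible separation and convergence for content that was never authored with stereo in mind, which is where most of the hard work (and most of the visual artifacts) comes from.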
Packaged as GeForce 3D Vision, NVIDIA's shutter glasses and transmitter pair will run consumers a respectable $200. This is more expensive than some glasses and cheaper than others. We actually don't have any other glasses in house to compare them to, but the quality, freedom of movement, and battery life are quite good. If it becomes necessary, we will do a comparison with other products, but the real advantage isn't really in the hardware; it's in the driver. The package also comes with a soft bag and cloth for the glasses, alternate nose pieces, cables and converters, and a couple of discs with drivers, a stereoscopic photo viewer, and a video player.
Stereoscopic 3D shutter glasses have been around since the late 90s, but the push away from CRTs to LCDs with a fixed 60Hz refresh rate meant that high quality stereoscopic viewing on the desktop had to be put on hold (along with hopes for smaller pixel sizes, but that's a whole other rant). With Hollywood getting seriously interested in 3D movies and some display manufacturers getting on board with 120Hz monitors, TVs, and projectors, it makes sense that someone would try to push this back to the forefront.
Before we get into just how NVIDIA wants to make stereoscopic 3D on the desktop a reality, let's take a look at exactly what we're talking about.
Comments
Matt Campbell - Thursday, January 8, 2009 - link
One of my roommates in college had a VR helmet he used to play Descent, and was interning at a company designing (then) state-of-the-art updates to it. It was pretty wild to try, and hysterical to watch the person in the chair dodging and moving as things flew at them. It was really dodgy on support though, and gave most of us a headache after about 10 minutes. Now it's over 10 years later, and it doesn't sound like much has changed.
crimson117 - Thursday, January 8, 2009 - link
VR helmets were more about making your real head's position guide your avatar's head's position than about providing stereoscopic 3D.
Holly - Thursday, January 8, 2009 - link
They did both. It had a tiny screen for each eye... reminds me of the lovely days of System Shock :'(
Dfere - Thursday, January 8, 2009 - link
So. Mediocre equipment with mediocre drivers. Gee, why would anyone want us to buy it? Am I the only one getting the feeling this is the start of something designed to suck up more GPU power and/or sell SLI as a mainstream requirement? After all, resolution and FPS increases alone can't fuel the growth Nvidia and ATI would like.
PrinceGaz - Thursday, January 8, 2009 - link
I think you are being hopelessly negative about why nVidia would be doing this. What advantage do they gain by a move towards stereoscopic 3D glasses? Okay, increased 3D rendering power is needed since each frame has to be rendered twice to maintain the same framerate, but GPU power is increasing so quickly that it's almost a non-issue, so SLI is irrelevant... NOT.
The main problem with stereoscopic rendering is that each consecutive frame has to be rendered from a different perspective, and only every second frame is directly related to the one before it. That seems so nicely connected to what SLI AFR mode provides that it's too good to be true. One card does the left eye in SLI AFR, the other the right eye, and with suitably designed drivers you get all the normal effects that rely on access to the previous frame (motion blur etc.), but in a "3D graphics system" that sells twice as many cards, since one card handles each eye. They're not daft -- stereoscopic display is going to make dual GPU cards not just a luxury for the high-end gamer, but a necessity for normal gamers who want a satisfactory 3D experience.
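To make the mapping the comment describes concrete, here is a minimal sketch of frame-sequential stereo paired with alternate-frame rendering; the names and output are illustrative only, not NVIDIA's driver behavior.

```cpp
// Hypothetical sketch: even frames carry the left-eye image, odd frames the
// right-eye image, and in an alternate-frame SLI setup each GPU therefore
// ends up rendering one eye.
#include <cstdio>
#include <cstdint>

int main() {
    const int gpuCount = 2;                  // SLI pair in AFR mode
    for (uint64_t frame = 0; frame < 8; ++frame) {
        bool leftEye = (frame % 2 == 0);     // eyes alternate every refresh
        int gpu = static_cast<int>(frame % gpuCount); // AFR assigns frames round-robin
        std::printf("frame %llu -> %s eye on GPU %d\n",
                    (unsigned long long)frame, leftEye ? "left" : "right", gpu);
    }
    // On a 120Hz panel this alternation gives each eye 60 images per second,
    // with the shutter glasses blanking the opposite eye on every refresh.
    return 0;
}
```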
Gannon - Thursday, January 8, 2009 - link
... for nvidia to co-operate with monitor manufacturers and implement 3D in the display itself instead of these half-baked attempts at depth. Nobody really wants to wear special glasses just to have 3D depth perception on their computer. The only way you are going to standardize something like this (because people are lazy and ignorant, let's face it) is to do it at the point where everybody gets it - everyone needs a monitor with their computer, so it would make sense to work towards displays that either:
1) Are natively 3D or
2) Build the costly stereoscopy into the monitor itself, thereby reducing costs through economies of scale.
I really think current shutter-based stereoscopic 3D is a hold-over until we start to get real 3D displays. If I were nvidia I'd want to do it on the monitor end, not as an after-market part targeted towards gamers at a $200 price point.
nubie - Friday, January 9, 2009 - link
Try passive glasses: the weight is next to nothing, no moving parts, no batteries. Just polarization that works off of native LCD tech:
http://picasaweb.google.com/nubie07/StereoMonitorS...
nVidia dropped support for this, boo/hiss.
rcr - Thursday, January 8, 2009 - link
Is there the possibility to just use an SLI system to get rid of these problems with visual quality? Would it be possible to let each graphics card do the calculations for one eye, so you could get the same quality as on a single card?
wh3resmycar - Thursday, January 8, 2009 - link
what do you guys think? how about ViiBii?
JonnyDough - Thursday, January 8, 2009 - link
No, actually the picture says "AT" just in case anyone couldn't see it. :-)