GeForce 3D Vision: Stereoscopic 3D From NVIDIA
by Derek Wilson on January 8, 2009 2:30 PM EST - Posted in GPUs
Final Words
All in all, this is a very polished version of what we've had since the turn of the century: no flicker, fewer headaches (though there may still be issues for people prone to motion sickness; I just don't have a large enough sample size to say definitively), and broad game support with less of a performance hit than other solutions. NVIDIA has a very good active shutter stereoscopic solution in GeForce 3D Vision. But the problem is that its value still depends heavily on the applications the end user wants it for. It works absolutely perfectly for viewing stereo images and 3D movies (which may matter more once those start coming to Blu-ray) and for applications built with stereo support. But for games, though it works with 350 titles, it's a little hit or miss.
We really hate to say that, because we love seeing anyone push through the chicken-and-egg problem. NVIDIA getting this technology out there, getting developers excited about it, and getting publishers excited about a new market will ultimately be what makes this a reality. But until most developers program in ways that are friendly to NVIDIA's version of stereo rendering, the gaming experience will be either good or not so good, and there's just no way of knowing how much each individual title's problems will bother you until you try it. And at $200, that's a bit of a plunge for the risk, especially if you don't have a 120Hz display device (which will cost several hundred more).
If you absolutely love a few of the games that work great with it, then it will be worth it. The problem is that NVIDIA's ratings make it so that you can't rely on "excellent" actually being excellent. Most of the people who played with Left 4 Dead loved it, but one person was really bothered by the floating names being 2D sprites at screen depth. That is annoying, but the rest of it looked good enough for me not to care (and I'm pretty picky). If NVIDIA wants to play fast and loose with its ratings, that's fine, but we don't have time to test all their games and confirm their ratings or come up with our own. They really should at least have another class of rating called "perfect", where there are absolutely no issues, all settings work great, and we get exactly what we expect.
Shutter glasses have been around for a long time. Perhaps now the time is right for them to start pushing into the mainstream. But NVIDIA isn't doing the technology any favors if they put something out there and let it fail. This technology needs to be developed and needs to become pervasive, because it is just that cool. But until it works perfectly in a multitude of games, or until 3D movies start hitting PCs near you, we have the potential for a setback. If GeForce 3D Vision is successful, however, it will open the door for us to really move forward with stereoscopic effects.
What we really need, rather than a proprietary solution, is something like stereoscopic support built into DirectX and OpenGL that developers can tap into very easily. Relying on NVIDIA to discern the proper information and then render images for both eyes from one scene is fine as a stopgap, just as CUDA was a good interim solution before we had OpenCL. We need the API to be able to detect whether stereo hardware is present and to make it easy to generate images for both eyes while duplicating as little work as possible. Giving developers simple tools to make stereo effects cooler and more convincing, or to embed hints about convergence and separation, would be great as well.
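To make the idea concrete, here is a minimal sketch (in Python, with purely hypothetical names; no real API works exactly this way) of the kind of work such stereo support could do for developers: deriving two per-eye cameras from a single scene camera, driven by the same separation and convergence parameters discussed above.

```python
# Hypothetical sketch: an API-level helper deriving per-eye cameras from one
# scene camera, so the work is done once instead of guessed at per game.
# All names here are illustrative, not part of any real graphics API.

def cross(a, b):
    """Cross product of two 3-vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def stereo_eyes(position, forward, up, separation, convergence):
    """Return (left_eye, right_eye, focus) for an off-axis stereo pair.

    position, forward, up -- camera basis (forward and up unit length)
    separation            -- interocular distance in world units
    convergence           -- distance to the zero-parallax plane
    """
    right = cross(forward, up)  # camera-space x axis
    half = separation / 2.0
    left_eye = tuple(p - half * r for p, r in zip(position, right))
    right_eye = tuple(p + half * r for p, r in zip(position, right))
    # Both eyes converge on the same point, so objects at that depth land
    # at screen depth with zero parallax.
    focus = tuple(p + convergence * f for p, f in zip(position, forward))
    return left_eye, right_eye, focus

left, right, focus = stereo_eyes((0, 0, 0), (0, 0, -1), (0, 1, 0), 0.065, 3.0)
```

With something like this in the API, the renderer could submit one scene and get both eye images, duplicating only the final per-eye passes.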
And hopefully GeForce 3D Vision is a real step toward that future, one that can become viable right now. I could see some World of Warcraft devotees being really excited about it. Those out there like me who love 3D technology in every form will be excited by it. People who want to create their own stereo images or videos (there are lenses available for this, and techniques you can improvise to make it work) will like it, but people waiting for 3D movies will need some content available at home first. The people we would most love to see drive adoption of the technology, however, might not be as into it: hardcore gamers looking to upgrade will probably be better served at this point by a high end graphics card and a 30" display rather than a 120Hz monitor and shutter glasses.
54 Comments
jkostans - Friday, January 9, 2009 - link
So how is this different from my ELSA 3D shutter glasses from 1999? The glasses I paid $50 for back then are just as good as this $200 setup in 2009? Great job re-inventing the wheel and charging more for it, NVIDIA. There is a reason shutter glasses didn't catch on: ghosting being the worst problem, along with compatibility, loss of brightness/color accuracy, performance hits, the need for high refresh rates, etc.
If you are thinking of buying these, don't. You will use them for a few weeks, then just toss them in a drawer due to lack of game support and super annoying ghosting.
nubie - Friday, January 9, 2009 - link
It is different because these are likely ~$400 - $500 quality glasses. Check out my setup with high resolution, no ghosting, high compatibility, minimal performance hit:
http://picasaweb.google.com/nubie07/StereoMonitorS...
http://picasaweb.google.com/nubie07/3DMonitor
Running on iZ3D of course, no need for nVidia at all; buy any card you like, and keep running XP until Microsoft releases another OS worth spending money on.
jkostans - Friday, January 9, 2009 - link
No ghosting?
http://picasaweb.google.com/nubie07/3DMonitor#5060...
I can see it there, and that's not even a high contrast situation.
Shutter glasses are shutter glasses, they all suck regardless of price.
nubie - Saturday, January 10, 2009 - link
OK, have a closed mind; technology never advances. PS: that picture was taken through a linear polarized lens, and I was holding the camera and the glasses, so they may not have been lined up.
Also, the contrast is automatically set by the camera; in person there isn't any ghosting.
Shadowdancer2009 - Friday, January 9, 2009 - link
Can they PLEASE kill this tech soon? It was 100% crap the first time, and it won't get better no matter how awesome the drivers are.
The glasses eat 50% of the brightness when "open" and don't block 100% when "closed".
They never did, and your review says the same thing.
This was crap ten years ago, and it's crap now.
Give us dual-screen high-res VR goggles instead.
nubie - Friday, January 9, 2009 - link
Maybe you don't understand the technology; these are ~$400 - $500 glasses, wireless with about a week of li-ion battery power. Don't compare them to the $10 ones you can get anywhere; at least try them for yourself.
There are much better reasons to bash nVidia, like dropping support for 90% of the displays they used to support, and making support Vista-only.
gehav - Friday, January 9, 2009 - link
I'm perfectly satisfied with the current refresh rate of LCD panels (60Hz). However, what you forgot is the following: if the 3D glasses open and shut 60 times per second per eye (for a 120Hz panel), the old flicker of CRTs is effectively back. Raising the refresh rate of the monitor to 240Hz would reduce the per-eye flicker to an acceptable 120Hz. The monitor itself is not the culprit here; the 3D glasses reintroduce flickering like in the old days of CRTs (and they are directly dependent on the refresh rate of the monitor).
Georg
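The arithmetic in this comment can be written out as a trivial sketch (the function name is just illustrative): with active shutter glasses, each eye sees only every other frame, so the per-eye rate is half the panel's refresh rate.

```python
# With alternating shutter glasses, each eye sees every other frame,
# so the effective per-eye refresh rate is half the panel's rate.

def per_eye_hz(panel_hz):
    """Per-eye refresh rate for frame-sequential stereo on a given panel."""
    return panel_hz / 2.0

assert per_eye_hz(120) == 60.0   # CRT-era flicker territory per eye
assert per_eye_hz(240) == 120.0  # the comfortable target the comment asks for
```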
gehav - Friday, January 9, 2009 - link
btw: 200Hz displays are already on the way, it seems:
http://www.engadget.com/2008/09/02/sony-samsung-bo...
gehav - Friday, January 9, 2009 - link
Just a thought I had while reading the article: wouldn't a ray-traced image work far better for stereoscopic viewing? From what I understand, the rasterizing technique used by today's graphics cards uses all kinds of tricks and effects to create the perception of a "real 3D world". That's why the drivers have to be customized for every game.
Ray tracing uses a far simpler algorithm to get good results. Every light ray is calculated separately and every game that uses ray tracing should therefore - in principle - easily be customizable for stereoscopic viewing.
I'm thinking of the announced Intel Larrabee, which may offer ray tracing acceleration for games and could therefore be much better suited for stereoscopic viewing.
Not sure if I'm right with these thoughts but it would be interesting to see if games that are already available in a ray tracing version (like Quake 4) could be easily adapted to support stereoscopic viewing and what the result would look like.
Apart from that I also think we would need faster LCD-panels (240Hz) to get non-flickering pictures for each eye.
Georg
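The intuition in this comment can be sketched concretely: in a ray tracer, the second eye is just a second ray origin, with the same per-pixel logic for both eyes. This is an illustrative toy (the `trace` function is a stand-in, not any real engine's API), not a claim about how Larrabee or any shipping tracer works.

```python
# Toy sketch of why stereo falls out of ray tracing almost for free:
# the same scene is traced from two origins offset by the eye distance.

def trace(origin, direction):
    # Placeholder: a real tracer would intersect the scene and shade here.
    return (origin, direction)

def render_stereo_pixel(eye_sep, pixel_dir):
    """Trace one pixel for both eyes; identical logic, different origins."""
    left_origin = (-eye_sep / 2.0, 0.0, 0.0)
    right_origin = (eye_sep / 2.0, 0.0, 0.0)
    # No per-game driver heuristics needed: stereo is just two camera
    # positions fed through the one algorithm.
    return trace(left_origin, pixel_dir), trace(right_origin, pixel_dir)

left_sample, right_sample = render_stereo_pixel(0.065, (0.0, 0.0, -1.0))
```

Rasterization, by contrast, bakes many screen-space effects (billboards, post-processing, pre-rendered shadows) into a single viewpoint, which is exactly what forces the per-title driver profiles the article describes.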
nubie - Friday, January 9, 2009 - link
Check out some of the other initiatives, notably iZ3D, who have offered a free driver for all AMD products and XP support (double-check the nVidia support for XP: non-existent much?). nVidia's idea is too little, too expensive, too late. I have built my own dual-polarized passive rig that works great with $3 glasses; unfortunately nVidia has dropped all support (the last supported card is from the 7 series, so "gaming" isn't really an option).
Thankfully iZ3D has stepped up to provide drivers, but thanks to nVidia's lack of support I have lost so much money on unsupported 8 series hardware that I haven't looked at a game in a couple of years.
nVidia has killed my will to game. Dropping support for 3D is not the way to sell 3D (do some research: nVidia has dropped XP, supports only Vista, and not even any of the cool displays you can cobble together yourself for less than the $200 this stupid package costs).
My proof of concept, before nvidia pulled the plug:
http://picasaweb.google.com/nubie07/3DMonitor#
My gaming rig, before nvidia dropped support for ~3 years:
http://picasaweb.google.com/nubie07/StereoMonitorS...
nVidia needs to do better than this, and they should know better.