3D Vision Surround: NVIDIA’s Eyefinity
During our meeting with NVIDIA, they were also showing off 3D Vision Surround, which was announced at their press conference at the start of CES. 3D Vision Surround is not inherently a GF100 technology, but since it’s being timed for release alongside GF100 cards, we’re going to take a moment to discuss it.
If you’ve seen Matrox’s TripleHead2Go or AMD’s Eyefinity in action, then you know what 3D Vision Surround is. It’s NVIDIA’s implementation of the single large surface concept so that games (and anything else for that matter) can span multiple monitors. With it, gamers can get a more immersive view by being able to surround themselves with monitors so that the game world is projected from more than just a single point in front of them.
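As a rough illustration of why presenting one large surface matters to a game (a minimal sketch, not NVIDIA’s driver API; the 60° vertical FOV and the 1080p panel resolutions below are our own assumptions), the standard projection math a game applies to whatever surface it’s handed yields a far wider horizontal field of view once that surface is 5760x1080 instead of 1920x1080:

```cpp
#include <cmath>
#include <cstdio>

// Rough sketch (not NVIDIA driver code): the usual projection math a game
// applies to whatever render surface it is handed. When the driver exposes
// three 1920x1080 monitors as one 5760x1080 surface, the wider aspect ratio
// alone produces a much wider horizontal field of view.
int main() {
    const double kPi = 3.14159265358979323846;
    const double verticalFovDeg = 60.0;  // assumed fixed vertical FOV
    const double vFov = verticalFovDeg * kPi / 180.0;

    // Single 16:9 monitor vs. a 48:9 spanned surface (3 x 1920x1080).
    const double aspects[] = { 1920.0 / 1080.0, 5760.0 / 1080.0 };

    for (double aspect : aspects) {
        // hFOV = 2 * atan(tan(vFOV / 2) * aspect)
        const double hFov = 2.0 * std::atan(std::tan(vFov / 2.0) * aspect);
        std::printf("aspect %.2f -> horizontal FOV %.1f degrees\n",
                    aspect, hFov * 180.0 / kPi);
    }
    return 0;
}
```

Run as-is, the sketch works out to roughly 92 degrees of horizontal FOV on a single 16:9 monitor versus roughly 144 degrees on the triple-wide surface, which is where the surround effect comes from.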
NVIDIA tells us that they’ve been sitting on this technology for quite some time but never saw a market for it. With the release of TripleHead2Go and Eyefinity it became apparent to them that this was no longer the case, so they dusted the technology off. Whether this is true or merely a sudden reaction to Eyefinity is immaterial at the moment, as it’s coming regardless.
This triple-display technology will have two names. When it’s used on its own, NVIDIA is calling it NVIDIA Surround. When it’s used in conjunction with 3D Vision, it’s called 3D Vision Surround. Obviously NVIDIA would like you to use it with 3D Vision to get the full effect (and to require a more powerful GPU) but 3D Vision is by no means required to use it. It is however the key differentiator from AMD, at least until AMD’s own 3D efforts get off the ground.
Regardless of to what degree this is a sudden reaction from NVIDIA to Eyefinity, ultimately this is something that was added late in the design process. Unlike AMD, who designed the Evergreen family around it from the start, NVIDIA did not, and as a result GF100 cannot drive more than 2 displays at once. Shipping GF100 cards will have the traditional 2-monitor limit, meaning that gamers will need 2 GF100 cards in SLI to drive 3+ monitors, with the second card providing the 3rd and 4th display outputs. We expect that the next NVIDIA design will include the ability to drive 3+ monitors from a single GPU; for the moment this limitation precludes doing Surround on the cheap.
GTX 280 with 2 display outputs: GF100 won't be any different
The good news, as we stated earlier, is that this is not a technology inherent to GF100. NVIDIA can do it entirely in software, and as a result will be backporting it to the GT200 (GTX 200 series). The drivers released for GF100 will allow GTX 200 cards to do Surround in the same manner: with 2 cards, you can run a single large surface across 3+ displays. We’ve seen this in action and it works, as NVIDIA was demoing a pair of GTX 285s running in NVIDIA Surround mode at their CES booth.
The big question, of course, is what this does for performance on both GF100 and GT200, along with compatibility. That’s something we’re going to have to wait on actual hardware to answer.
115 Comments
DanNeely - Monday, January 18, 2010 - link
For the benefit of myself and everyone else who doesn't follow gaming politics closely, what is "the infamous Batman: Arkham Asylum anti-aliasing situation"?

sc3252 - Monday, January 18, 2010 - link
Nvidia helped get AA working in Batman, which also works on ATI cards. If the game detects anything besides an Nvidia card, it disables AA. The reason some people are angry is that when ATI helps out with games it doesn't limit who can use the feature, at least that's what they (AMD) claim.

san1s - Monday, January 18, 2010 - link
The problem was that Nvidia did not do QA testing on ATI hardware.

Meghan54 - Monday, January 18, 2010 - link
And Nvidia shouldn't have, since Nvidia didn't develop the game. On the other hand, you can be quite certain that the devs did run the game on ATI hardware but only locked out the "preferred" AA design because of the money Nvidia invested in the game.
And that can be plainly seen when the game is "hacked" into detecting an Nvidia card even though an ATI card is installed: AA works flawlessly, and the ATI cards end up faster than current Nvidia cards. The game is exposed for what it is: purposely crippled to favor one brand of video card over another.
But the nvidiots seem not to mind this at all. Yet this is akin to Intel writing their compiler to make AMD CPUs run slower or worse on programs compiled with the Intel compiler.
Read about the debacle Intel is now suffering over that; the outrage is fairly universal. Now, you'd think Nvidia would suffer the same nearly universal outrage for intentionally crippling a game's function to favor one brand of card over another, yet nvidiots make apologies and say "ATI cards weren't tested." I'd like to see that as a fact instead of conjecture.
So, one company cripples the function of another company's product and the world's up in arms, screaming "Monopolistic tactics!!!" and "Fine them to hell and back!"; another company does essentially the same thing and it gets a pass.
Talk about bias.
Stas - Tuesday, January 19, 2010 - link
If nV continues like this, it will turn around on them. It took MANY years for the market guards to finally say, "Intel, quit your sh*t!" and actually do something about it. Don't expect immediate retaliation in a multibillion-dollar, worldwide industry.

san1s - Monday, January 18, 2010 - link
"yet nvidiots make apologies and say "Ati cards weren't tested." I'd like to see that as a fact instead of conjecture. "here you go
http://www.legitreviews.com/news/6570/
"On the other hand, you can be quite certain that the devs. did run the game on Ati hardware but only lock out the "preferred" AA design because of nvidia's money nvidia invested in the game. "
Proof? That looks like conjecture to me. Nvidia says otherwise.
AMD doesn't deny it either:
http://www.bit-tech.net/bits/interviews/2010/01/06...
They just don't like it.
And please refrain from calling people names such as "nvidiot"; it doesn't help portray your image as unbiased.
MadMan007 - Monday, January 18, 2010 - link
Oh for gosh sakes, this is the 'launch' and we can't even have a paper launch where at least reviewers get hardware? This is just more details for the same crap that was 'announced' when the 5800s came out. Poor show NV, poor show.

bigboxes - Monday, January 18, 2010 - link
This is as close to a paper launch as I've seen in a while, except that there is not even an unattainable card. Gawd, they are gonna drag this out a lonnnnngg time. Better start saving up for that 1500W psu!

Adul - Monday, January 18, 2010 - link
I suppose this is a vaporlaunch then.