NVIDIA GeForce GTX 295: Leading the Pack
by Derek Wilson on January 12, 2009 5:15 PM EST - Posted in GPUs
Now that we have some hardware in our hands and NVIDIA has formally launched the GeForce GTX 295, we are very interested in putting it to the test. NVIDIA's bid to reclaim the halo is quite an interesting one. If you'll remember from our earlier article on the hardware, the GTX 295 is a dual-GPU card whose two chips combine aspects of the GTX 280 and the GTX 260. The expectation should be that this card will fall between GTX 280 SLI and GTX 260 Core 216 SLI.
As for the GTX 295, the GPUs have the TPCs (shader hardware) of the GTX 280 with the memory and pixel power of the GTX 260. This hybrid design gives it lots of shader horsepower but less RAM and raw pixel-pushing capability than GTX 280 SLI. This baby should perform faster than GTX 260 SLI but slower than GTX 280 SLI. The full specs are in the table below.
Our card looks the same as the one in the images provided by NVIDIA that we posted in December. It's notable that the GPUs are built at 55nm and are clocked at the speed of a GTX 260 despite having the shader power of the GTX 280 (x2).
We've also got another part coming down the pipe from NVIDIA. The GeForce GTX 285 is a 55nm part that amounts to an overclocked GTX 280. Although we don't have any in house yet, this new card was announced on the 8th and will be available for purchase on the 15th of January 2009.
There isn't much to say on the GeForce GTX 285: it is an overclocked 55nm GTX 280. The clock speeds compare as follows:
| | Core Clock Speed (MHz) | Shader Clock Speed (MHz) | Memory Data Rate (MHz) |
|---|---|---|---|
| GTX 280 | 602 | 1296 | 2214 |
| GTX 285 | 648 | 1476 | 2484 |
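To put "overclocked GTX 280" in concrete terms, here is a quick back-of-the-envelope sketch (using only the clocks from the table above; this is our own arithmetic, not an NVIDIA-published comparison) of the percentage gains the GTX 285 brings:

```python
# Percentage clock gains of the GTX 285 over the GTX 280, computed from the
# table above. Back-of-the-envelope arithmetic only.
gtx_280 = {"core": 602, "shader": 1296, "memory data rate": 2214}  # MHz
gtx_285 = {"core": 648, "shader": 1476, "memory data rate": 2484}  # MHz

for clock, old in gtx_280.items():
    new = gtx_285[clock]
    gain = (new / old - 1) * 100
    print(f"{clock}: {old} -> {new} MHz (+{gain:.1f}%)")
```

Run as-is, that works out to roughly a 7.6% core, 13.9% shader, and 12.2% memory clock bump over the GTX 280.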
We don't have performance data for the GTX 285 yet, but expect it (like the GTX 280 and GTX 295) to be necessary only with very large displays.
| | GTX 295 | GTX 285 | GTX 280 | GTX 260 Core 216 | GTX 260 | 9800 GTX+ |
|---|---|---|---|---|---|---|
| Stream Processors | 2 x 240 | 240 | 240 | 216 | 192 | 128 |
| Texture Address / Filtering | 2 x 80 / 80 | 80 / 80 | 80 / 80 | 72 / 72 | 64 / 64 | 64 / 64 |
| ROPs | 2 x 28 | 32 | 32 | 28 | 28 | 16 |
| Core Clock | 576MHz | 648MHz | 602MHz | 576MHz | 576MHz | 738MHz |
| Shader Clock | 1242MHz | 1476MHz | 1296MHz | 1242MHz | 1242MHz | 1836MHz |
| Memory Clock | 999MHz | 1242MHz | 1107MHz | 999MHz | 999MHz | 1100MHz |
| Memory Bus Width | 2 x 448-bit | 512-bit | 512-bit | 448-bit | 448-bit | 256-bit |
| Frame Buffer | 2 x 896MB | 1GB | 1GB | 896MB | 896MB | 512MB |
| Transistor Count | 2 x 1.4B | 1.4B | 1.4B | 1.4B | 1.4B | 754M |
| Manufacturing Process | TSMC 55nm | TSMC 55nm | TSMC 65nm | TSMC 65nm | TSMC 65nm | TSMC 55nm |
| Price Point | $500 | $??? | $350 - $400 | $250 - $300 | $250 - $300 | $150 - $200 |
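To see where the hybrid design lands on paper, the numbers in the table can be turned into theoretical peaks. The sketch below is a rough reader-style estimate rather than NVIDIA's official math: it assumes GDDR3's double data rate and the commonly quoted 3 FLOPS per SP per clock (dual-issue MAD + MUL) for GT200-class hardware.

```python
# Theoretical peaks derived from the spec table above. Assumptions (ours, not
# stated in the article): GDDR3 transfers data twice per memory clock, and each
# GT200-class SP can dual-issue a MAD + MUL for 3 FLOPS per shader clock.
def peak_gflops(sps, shader_mhz, flops_per_clock=3):
    """Peak programmable shader throughput in GFLOPS."""
    return sps * shader_mhz * flops_per_clock / 1000.0

def peak_bandwidth_gbs(mem_mhz, bus_bits):
    """Peak memory bandwidth in GB/s for double-data-rate memory."""
    return mem_mhz * 2 * bus_bits / 8 / 1000.0

cards = {
    # name: (SPs per GPU, shader MHz, memory MHz, bus width in bits, GPU count)
    "GTX 295":          (240, 1242,  999, 448, 2),
    "GTX 280":          (240, 1296, 1107, 512, 1),
    "GTX 260 Core 216": (216, 1242,  999, 448, 1),
}

for name, (sps, shader, mem, bus, gpus) in cards.items():
    print(f"{name}: {gpus * peak_gflops(sps, shader):.0f} GFLOPS, "
          f"{gpus * peak_bandwidth_gbs(mem, bus):.1f} GB/s")
```

Doubling the single-card figures for SLI puts the GTX 295 above GTX 260 Core 216 SLI on shader throughput but below GTX 280 SLI on both shader throughput and bandwidth, which is exactly where we expect real-world performance to fall.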
This article will focus heavily on the performance of the GeForce GTX 295, as we've already covered the basic architecture and specifications. We will recap them and cover the card itself on the next page, but for more detail see our initial article on the subject.
The Test
| Test Setup | |
|---|---|
| CPU | Intel Core i7-965 3.2GHz |
| Motherboard | ASUS Rampage II Extreme X58 |
| Video Cards | ATI Radeon HD 4870 X2, ATI Radeon HD 4870 1GB, NVIDIA GeForce GTX 295, NVIDIA GeForce GTX 280 SLI, NVIDIA GeForce GTX 260 SLI, NVIDIA GeForce GTX 280, NVIDIA GeForce GTX 260 |
| Video Drivers | Catalyst 8.12 hotfix, ForceWare 181.20 |
| Hard Drive | Intel X25-M 80GB SSD |
| RAM | 6 x 1GB DDR3-1066 7-7-7-20 |
| Operating System | Windows Vista Ultimate 64-bit SP1 |
| PSU | PC Power & Cooling Turbo Cool 1200W |
100 Comments
strikeback03 - Thursday, January 15, 2009 - link
Just check the first page of comments. From CyberHawk:
Been waiting for this one...
... but I find the response a bit cold.
It's the fastest card, for God's sake!
From formulav8:
Yeps, this is one of the worst reviews Anand himself has ever done. He continues to praise nVidia, who just a month or two ago was charging $600 for their cards.
Give credit where credit is due. He even harps on a sideport feature that doesn't mean much now, and AMD says it didn't provide any real benefit even when it was enabled.
I've been a member of this site since 2000 and am disappointed at how bad the reviews here are getting, especially when they have a biased tone to them.
Those are just two examples from the first page of comments on an article you yourself pointed out. Just for kicks, here is one from another article (4830 launch):
From Butterbean:
I jumped to "Final Words" and boom - no info on 4830 but right into steering people to an Nvidia card. That's so Anandtech.
I still state that no matter whose hardware they review or what they say, someone will accuse them of bias.
SiliconDoc - Tuesday, January 13, 2009 - link
AMEN. Good post.
jmg873 - Tuesday, January 13, 2009 - link
You stated in the article that the GTX 260 SLI beating out the 4870 X2 showed that SLI was superior to CrossFire. I don't have an opinion at this point on which is better, but saying that SLI beats CrossFire based on that isn't accurate. The 4870 X2 isn't two 4870s in CrossFire; it's two of them basically "welded" together on one card. If you have two 4870s in CrossFire that will probably yield a different result, maybe worse, maybe better.
Jovec - Tuesday, January 13, 2009 - link
My understanding is that it is still CrossFire/SLI for dual-GPU, single-slot cards like the 295 and 4870 X2. The advantage of such cards is that you don't need a CF/SLI mobo while also being a bit cheaper (than purchasing two of said cards). You could also go quad with these cards on a CF/SLI mobo.
JarredWalton - Tuesday, January 13, 2009 - link
Just because ATI disables the "CrossFire" tab with the 4870X2 doesn't mean it isn't CrossFire. Trust me: I bought one and I'm disappointed with the lack of performance in new games on a regular basis. I'm thinking I need to start playing games around four months after they launch, just so I can be relatively sure of getting working CF support - and that's only when a game is an A-list title. There are still plenty of games where CrossFire can negatively impact performance or cause other quirkiness (Penny Arcade Adventures comes to mind immediately).
af530 - Wednesday, January 14, 2009 - link
Die of aids you shithead
thevisitor - Tuesday, January 13, 2009 - link
PLEASE LEARN! The first graph in this review should be dollars per frame,
but you, Anand, cannot show it, right?
Because then everyone could see how NVIDIA sells total crap, AGAIN.
Price is the deciding factor for 99% of buyers, so you must consider it when you write something again!
Kroneborge - Tuesday, January 13, 2009 - link
Different people care about different things when choosing a product. For some, dollars per frame might be the most important thing; for others (especially at the high end) all they care about is having the most powerful product, and they gladly pay a high premium for that last little extra bit.
Neither is wrong; what's right for one person can be wrong for another.
A reviewer's job is to tell you about the performance; it's up to you to decide if you think that's worth the money. They can't make up your mind for you.
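For readers who do want to weigh price against raw frame rate, the dollars-per-frame figure being argued about in this thread is easy to compute from any review's benchmark results and current street prices. A minimal sketch follows; the card names, prices, and frame rates in it are placeholders for illustration, not measured results from this review:

```python
# Dollars per frame: street price divided by average frame rate in a given
# benchmark. The entries below are placeholders for illustration only -- plug
# in real prices and the FPS numbers from whichever test you care about.
def dollars_per_frame(price_usd, avg_fps):
    return price_usd / avg_fps

example_cards = {
    # name: (hypothetical street price in USD, hypothetical average FPS)
    "Card A": (500.0, 60.0),
    "Card B": (300.0, 45.0),
}

for name, (price, fps) in example_cards.items():
    print(f"{name}: ${dollars_per_frame(price, fps):.2f} per frame at {fps:.0f} FPS")
```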
elerick - Tuesday, January 13, 2009 - link
Graphics are in a transition phase right now. With the economy ditching the high end, vendors are forced to compete in the midrange. That is why the competition is much more severe in the 4850/GTX 260 camp.
It's sad that Anand's readers have to blog and flame everything that is written all the time. If you leave power consumption out of the *initial* launch review it is for good reason; perhaps they are forced to get the review out or it will become yesterday's news. The most important thing here is benchmarks. Power consumption and SLI will soon follow once they can get their hands on more video cards.
I'm tired of reading comments where everyone just bitches about everything. Grow up.
I look forward to reading your next review. Cheers!
araczynski - Tuesday, January 13, 2009 - link
nvidia still sucking in price as usual.