NVIDIA GeForce GTX 295 Coming in January
by Derek Wilson on December 18, 2008 9:00 AM EST - Posted in
- GPUs
While we've been holding our breath for a die-shrunk GT200, NVIDIA's first announcement of something really new since the introduction of GT200 is a promised single-card multi-GPU solution, based on a GPU that falls between a GTX 260 Core 216 and a GTX 280 in terms of performance. The fact that the first thing we hear after GT200 is another ultra high end launch is interesting. If the end result is pushing the GeForce GTX 260 under $200 and the GTX 280 under $300, then we can definitely get behind that: it would amount to a midrange re-introduction by pushing current GT200 parts down in price. While we'd love to see parts from NVIDIA designed for budget-minded consumers based on their new technology, the current direction does appear to be a viable alternative.
Image courtesy NVIDIA
To be fair, we don't know yet what is going to happen to GTX 260 and GTX 280 pricing. It is possible today, through combinations of instant and mail-in rebates, to find the GTX 260 for $200 and the GTX 280 for $300, but these are the exception rather than the rule. If pre-rebate pricing could fall to these levels and below, much of NVIDIA's failure to provide affordable pricing for their excellent technology would be fixed. Of course, this seems like a tough pill to swallow for NVIDIA, as the GT200 die is huge. Pricing these parts so low has to be really eating into their margins.
Image courtesy NVIDIA
And yes, this is a complete divergence from a hard launch. This announcement precedes retail availability by exactly three weeks: hardware will not be available until January 8th. While we are happy to talk about products whenever we are allowed, it is still our opinion that hard launches are better for everyone. Talking about something before it launches can (and has, in the past, with both ATI and NVIDIA) lead to changes before launch that reduce performance or completely eliminate products. This is especially true around the holidays, the most tempting and worst time to announce a product without availability.
But be that as it may, we have the information and there's no reason to deny it to our avid readers just because we wish NVIDIA were acting more responsibly.
69 Comments
AdamK47 - Thursday, December 18, 2008 - link
The clock speeds are way lower than what I had expected, especially since this is 55nm.
SuperGee - Saturday, December 20, 2008 - link
Then you're not taking a lot of factors into account.
1) The GTX 280 isn't a bigger chip only because of 65nm:
it also has about 1.5 times more transistors than RV770.
So it's still a bigger chip; 55nm just makes it less extreme.
2) With that, a 55nm GT200 at GTX 280 speeds still draws a lot more power to beat RV770.
3) While the audience blames the 65nm GT200 as a power-draw king, that doesn't make RV770 a green chip. It also dissipates far over 100 watts.
4) The 4870 X2, at 275 watts, is just as much a heater as the GTX 280, even more so, and ATI is pushing it too.
5) To avoid crowning a new king of power heating, 300 watts is the limit.
6) GT200 at 55nm is too power hungry to take the full potential out of GT200. They could do GTX 285 speeds, but that would land around 365 watts; GTX 280 speeds, around 320 watts.
7) So you get a low-voltage GT200 x2 at 289 watts: a tad more than its direct competition, and enough power to beat it by a small margin.
Mistakes I see:
A) 55nm is no miracle solution. GT200's size would only become RV770-ish at 40nm, so GT200 on 40nm makes sense, to fill the gap until GT300, a possible next-gen DX11 part. NV doesn't need a new 40nm DX10 chip; they have GT200.
B) RV770 isn't a candidate for environmental prizes either; it still draws a lot of power.
In my own history, going back to the 80386, power supplies have gone from 250 watts to 550 watts.
So to me it's just as expected. I speculated about the name GTX 265 X2, but they dropped the X2 suffix for the new number, GTX 295.
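The power-budget reasoning in the comment above boils down to simple arithmetic against the 300-watt board ceiling (75W from the PCIe slot plus 75W from a 6-pin and 150W from an 8-pin connector). A minimal sketch, using the commenter's own wattage estimates (forum speculation, not official TDPs):

```python
# Rough power-budget check for a dual-GPU card. The 365/320/289 W figures
# are the commenter's estimates, not measured or official numbers.
PCIE_SLOT_W = 75    # power delivered through the PCIe slot
SIX_PIN_W = 75      # one 6-pin auxiliary connector
EIGHT_PIN_W = 150   # one 8-pin auxiliary connector

board_limit = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # 300 W ceiling

dual_at_gtx285_clocks = 365  # estimated dual GT200 at GTX 285 speeds
dual_at_gtx280_clocks = 320  # estimated dual GT200 at GTX 280 speeds
dual_downclocked = 289       # the claimed GTX 295 figure

for label, watts in [("GTX 285 clocks", dual_at_gtx285_clocks),
                     ("GTX 280 clocks", dual_at_gtx280_clocks),
                     ("downclocked", dual_downclocked)]:
    verdict = "fits" if watts <= board_limit else "exceeds"
    print(f"{label}: {watts} W {verdict} the {board_limit} W limit")
```

Only the downclocked configuration fits under the limit, which is the commenter's case for why the GTX 295 runs slower than a pair of GTX 280s.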
SiliconDoc - Sunday, December 21, 2008 - link
But the 4870 should be compared to either 260, because that's where its stats rest. What has already been documented endless times is that the 4870 is 1-3 watts less than the 260 in full 3D use, while the 260 is 30 watts less at idle.
So the 4870 is the WORSE card when it comes to power consumption.
Now if you want to compare it to the 280, why, then you're comparing it to the card that beats it soundly, not just a little bit like the 260.
I saw all the charts with the 4870 supposedly beating the 260 in power consumption, because the 3D consumption was 1-3 watts less, and the 30-watt idle advantage for the 260 was "secondary".
No, that doesn't make sense to me unless you're gaming 10x-30x more than you spend in 2D or on your desktop, and even then it would be a tie, but people DON'T have their cards in 3D gaming mode at that percentage of time compared to 2D or "idle".
So there was plenty of skew about there.
I don't understand how "judgment" can be so far off, except by thinking the charts I referred to are auto-generated and use the 3D-mode score ONLY for the ordering. It must be too difficult or too much of a hassle to manually change the order, so instead of apologizing for the chart generation method, the reviewer just goes along and makes the twisted explanation.
Then the "fans" just blabber on repeating it.
That is exactly what I have seen.
The 260 uses less power than the 4870, and it beats it slightly overall in game benches.
Now there's the truth. That's actually exactly what all the data says. Oh well.
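The idle-versus-load argument above is really a time-weighted average. A minimal sketch, using hypothetical absolute wattages chosen only to match the deltas quoted in the comment (roughly 30 W lower at idle for the 260, 2 W lower under load for the 4870; illustrative figures, not measured data):

```python
def avg_power(idle_w, load_w, load_fraction):
    """Time-weighted average draw, given the fraction of time spent under 3D load."""
    return load_w * load_fraction + idle_w * (1 - load_fraction)

# Hypothetical absolute numbers matching the comment's quoted deltas.
gtx260 = {"idle": 130, "load": 182}
hd4870 = {"idle": 160, "load": 180}

for load_share in (0.1, 0.5, 0.9):
    a = avg_power(gtx260["idle"], gtx260["load"], load_share)
    b = avg_power(hd4870["idle"], hd4870["load"], load_share)
    print(f"load {load_share:.0%}: GTX 260 avg {a:.0f} W, HD 4870 avg {b:.0f} W")
```

With these numbers the two cards only tie when the machine spends over 90% of its time gaming; at any realistic mix the 30 W idle gap dominates, which is the commenter's point.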
s d - Thursday, December 18, 2008 - link
4 simple words ...n o t ... direct ... x ... 11
:)
SuperGee - Saturday, December 20, 2008 - link
It could take a while before the first DX11 cards come, and they'll first have to prove themselves with DX10 for a long time. The DX11 SDK may come out close after the cards, but the games take a lot longer.
It could be six months after the DX11 runtime and hardware release before the first DX11 game drops.
Don't expect it in 2009; more like mid-2010, if DX11 is released at the end of 2009.
Till then I would enjoy games with a GTX 285.
michaelheath - Thursday, December 18, 2008 - link
I know the engineering process for a new card release starts a few years in advance, but what is Nvidia thinking by producing nothing but high-end cards? Sure, a dual 200-series GPU card is a prize pony you can trot out to say, "Yeah, we got the fastest, prettiest one out there." In this day and age, though, the 'halo effect' goes to the company who can produce a high-performance product without an associated high cost-to-own or cost-to-run.
Nvidia needs to fill the void left in the 200-series' wake. Price-conscious shoppers might go for the 9x00 cards, but tech-savvy price-conscious buyers know well enough that the 9x00 cards are nothing but renames or, at best, die-shrinks of two-year-old technology. ATI, on the other hand, has a full range of cards in the 4xx0 series that is new(er) technology, covers a fuller spectrum of prices ($50-300 before spiking to $500 for the X2 cards, and $100-200 buys you very respectable performance), and consistently outpaces the last-gen products.
Now if ATI's driver team would spend more time on Q/A and fix that shadow bug in Left 4 Dead, I'd dump my 8800 GT 512 in a heart beat for a Radeon 4870 1GB. The only item that would keep me with the green team is if they die-shrink the GTX260, bump the clock speeds considerably, and put a MSRP of $200 on it.
SiliconDoc - Sunday, December 28, 2008 - link
ps - Maybe you should have just been HONEST and come out with it: "My 8800 GT is hanging really tough, it's still plenty good enough to live with, and going with the same card company won't be a different experience, so the 4870 1GB looks real good, but it's $300.00 bucks and that's a lot of money when it isn't better than the 260.
Why can't NV make a 4870 1GB killer FOR ME, one that makes it worth replacing my HIGH-VALUE 8800 GT that's STILL hanging tough, for like $200?"
_________________________________________________________________
Yeah, good luck. With all the cards out there, one could still say the market is "moving rather slowly", since years-older tech (the 8800 series) still runs most monitors (pushing it at 1600x****) and plays games pretty well.
You're sitting in a tough spot; the fact is your card has had a lot of lasting value.
SiliconDoc - Sunday, December 28, 2008 - link
Oh wait a minute, I guess I misinterpreted. ATI took their single 4000-series chip and went wackado on it and made a whole range of cards, and NVidia took their single G80-series core and went wackado on it and made a whole range of cards. Oh, but NV even went a step further and reworked the dies, and came up with not just G80 but G82, G84, G90, G92... Umm, yeah, did ATI do that? YES?
_________________________________________
I REST MY CASE !
SiliconDoc - Sunday, December 28, 2008 - link
So the cards NV makes, all the way down to the 8400GS for 20 bucks at the egg, just don't exist according to you. I guess NV should take another corporate board/CEO wannabe's net advice and remake all their chips that already match the knocked-down versions of ATI's latest great single chip, which can't even match the NV one.
For instance, the 4850 is compared to the 9800GT, the 9800GTX and the 9800GTX+. Why can't ATI make a chip better than NVidia's several-generations-older chips? Maybe that's what you should ask yourself. It would be a lot more accurate than what you said.
Let me know about these cards "filling the lower tiers".
GeForce 9400 GT
GeForce 9500 GT
GeForce 9600 GT
GeForce 9600 GSO
GeForce 9800 GT
GeForce 8400 GS
GeForce 8500 GT
GeForce 8600 GT
GeForce 8800 GT
_________________________________
Those don't exist, right? The reason NV doesn't make a whole new chip lineup just to please you, you know, one that is "brand new" like the six-month- or year-old 4000 series is now, is BECAUSE THEY ALREADY HAVE A WHOLE SET OF THEM.
_________________________________
Whatever, another deranged red bloviator on wackoids.
sandman74 - Thursday, December 18, 2008 - link
There is hard performance data on this card at www.bit-tech.net
Basically it performs like a 280 in SLI in most cases (or thereabouts), which is pretty good, and does indeed beat the 4870 X2.
I doubt this card is for me, though. Too expensive, too hot to run, too much power. I'm hoping the GTX 285 will be more appropriate for me.