NVIDIA’s GeForce GTX 480 and GTX 470: 6 Months Late, Was It Worth the Wait?
by Ryan Smith on March 26, 2010 7:00 PM EST - Posted in GPUs
Temperature, Power, & Noise: Hot and Loud, but Not in the Good Way
For all of the gaming and compute performance data we have seen so far, we’ve only seen half of the story. With a 500mm2+ die and a TDP over 200W, there’s a second story to be told about the power, temperature, and noise characteristics of the GTX 400 series.
Starting with idle temperatures, we can quickly see some distinct trends among our cards. The top of the chart is occupied solely by AMD’s Radeon 5000 series, whose small dies and low idle power usage let these cards idle at very cool temperatures. It’s not until half-way down the chart that we find our first GTX 400 card, with the 470 at 46C. Truth be told we were expecting something a bit better out of it, given that its 33W idle draw is only a few watts over the 5870’s and it has a fairly large cooler to work with. Farther down the chart is the GTX 480, which is in the over-50 club at 51C idle. This is where NVIDIA has to pay the piper on their die size – even the amazingly low idle clockspeeds of 50MHz core, 101MHz shader, and 67.5MHz RAM aren't enough to drop it any further.
For our load temperatures, we have gone ahead and added Crysis to our temperature testing so that we can see both the worst-case temperatures of FurMark and a more normal gameplay temperature.
At this point the GTX 400 series is in a pretty exclusive club of hot cards – under Crysis the only other single-GPU card above 90C is the 3870, and the GTX 480 SLI is the hottest of any configuration we have tested. Even the dual-GPU cards don’t get quite this hot. In fact it’s quite interesting that unlike FurMark there’s a much larger spread among card temperatures here, which only makes the GTX 400 series stand out more.
While we’re on the subject of temperatures, we should note that NVIDIA has changed the fan ramp-up behavior from the GTX 200 series. Rather than reacting immediately, the GTX 400 series fans have a ramp-up delay of a few seconds when responding to high temperatures, meaning you’ll actually see those cards get hotter than our sustained temperatures. This won’t have any significant impact on the card, but if you’re like us your eyes will pop out of your head at least once when you see a GTX 480 hitting 98C on FurMark.
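To illustrate the behavior described above, here is a minimal sketch of a temperature-triggered fan controller with a ramp-up delay. This is purely illustrative – NVIDIA has not published its fan control logic, and the delay and threshold values below are hypothetical stand-ins:

```python
# Illustrative sketch only: models a fan that does not react to a temperature
# spike immediately, but only after the reading has stayed above a threshold
# for a delay window. All constants are hypothetical, not NVIDIA's values.
RAMP_DELAY_S = 3.0       # assumed delay before the fan ramps up
TEMP_THRESHOLD_C = 90.0  # assumed ramp trigger temperature

class DelayedFanController:
    def __init__(self, delay_s=RAMP_DELAY_S, threshold_c=TEMP_THRESHOLD_C):
        self.delay_s = delay_s
        self.threshold_c = threshold_c
        self._hot_since = None  # time when the temp first exceeded the threshold

    def update(self, temp_c, now):
        """Return a fan duty cycle (0.0-1.0) for the current temperature reading."""
        if temp_c < self.threshold_c:
            self._hot_since = None   # temperature dropped: reset the timer
            return 0.4               # baseline fan speed
        if self._hot_since is None:
            self._hot_since = now    # start of a hot spell
        if now - self._hot_since < self.delay_s:
            return 0.4               # still inside the delay window: the card keeps heating
        return 1.0                   # delay elapsed: ramp the fan to full
```

The delay window is exactly why a GTX 480 can briefly overshoot the sustained temperature before the fan catches up.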
Up next is power consumption. As we’ve already discussed, the GTX 480 and GTX 470 have an idle power consumption of 47W and 33W respectively, putting them out of the running for the least power hungry of the high-end cards. Furthermore the 1200W PSU we switched to for this review has driven up our idle power load a bit, which serves to suppress some of the differences in idle power draw between cards.
With that said, the GTX 400 series either does decently or poorly at idle, depending on your point of view. The GTX 480 is below our poorly-idling Radeon 4000 series cards, but well above the 5000 series. Meanwhile the GTX 470 is in the middle of the pack, sharing space with most of the GTX 200 series. The lone outlier here is the GTX 480 SLI. Because AMD’s CrossFire cards can drop the slave GPU into a low-power mode at idle, the GTX 480 SLI stands all alone at a total power draw of 260W when idle.
For load power we have Crysis and FurMark, the results of which are quite interesting. Under Crysis not only is the GTX 480 SLI the most demanding card setup, as we would expect, but the GTX 480 itself isn’t too far behind. As a single-GPU card it pulls more power than either the GTX 295 or the Radeon 5970, both of which are dual-GPU cards. Farther up the chart is the GTX 470, which is the second most power-hungry of our single-GPU cards.
Under FurMark our results change ever so slightly. The GTX 480 manages to get under the GTX 295, while the GTX 470 falls in the middle of the GTX 200 series pack. A special mention goes out to the GTX 480 SLI here, which at 851W under load is the greatest power draw we have ever seen for a pair of GPUs.
Idle noise doesn’t contain any particular surprises since virtually every card can reduce its fan speed to near-silent levels and still stay cool enough. The GTX 400 series is within a few dB of our noise floor here.
Hot, power hungry things are often loud things, and there are no disappointments here. At 70dB the GTX 480 SLI is the loudest card configuration we have ever tested, while at 64.1dB the GTX 480 is the loudest single-GPU card, beating out even our unreasonably loud 4890. Meanwhile the GTX 470 is in the middle of the pack at 61.5dB, coming in amidst some of our louder single-GPU cards and our dual-GPU cards.
Finally, with this data in hand we went to NVIDIA to ask about the longevity of their cards at these temperatures, as seeing the GTX 480 hitting 94C sustained in a game left us worried. In response NVIDIA told us that they have done significant testing of the cards at high temperatures to validate their longevity, and their models predict a lifetime of years even at temperatures approaching 105C (the throttle point for GF100). Furthermore as they note they have shipped other cards that run roughly this hot such as the GTX 295, and those cards have held up just fine.
At this point we don’t have any reason to doubt NVIDIA’s word on this matter, but with that said, we would still take the appropriate precautions. Heat does impact longevity to some degree – we would strongly consider getting a lifetime warranty for the GTX 480 to hedge our bets.
196 Comments
Headfoot - Monday, March 29, 2010 - link
Great review, great depth but not too long. Concise but still enough information. THANK YOU SO MUCH FOR INCLUDING MINIMUM FRAME RATES!!! IMO they contribute the most to a game feeling "smooth"
niceboy60 - Friday, August 20, 2010 - link
This review is not accurate. Badaboom is not compatible with GTX 400 series cards yet, however they already posted the test results. I have a GTX 480 and it does not work with Badaboom; Badaboom's official site confirms that.
slickr - Sunday, March 28, 2010 - link
I thought that after the line-up of games thread, you would really start testing games from all genres, so we can actually see how each graphics card performs in different scenarios. Now you have 80% first person shooters, 10% racing/action-adventure and 10% RPG and RTS.
Where are the RTS games, isometric RPG's, simulation games, etc?
I would really like Battleforge thrown out and replaced by Starcraft 2, DOW 2: Chaos Rising, and Napoleon: Total War. All these RTS games play differently and will give different results, and thus better knowledge of how graphics cards perform.
How about also testing The Sims 3, NFS:Shift, Dragon Age Origins.
Ryan Smith - Monday, March 29, 2010 - link
Actually DAO was in the original test suite I wanted to use. At the high end it's not GPU limited, not in the slightest. Just about everything was getting over 100fps, at which point it isn't telling us anything useful. The Sims 3 and Starcraft are much the same way.
Hsuku - Sunday, March 28, 2010 - link
On page 9 (Crysis), your final sentence indicates that SLI scales better than CF at lower resolutions, which is incorrect from your own graphs. CF clearly scales better at lower resolutions when video RAM is not filled:
@ 1680x1050
480 SLI -- 60.2:40.7 --> 1.48
5870 CF -- 53.8:30.5 --> 1.76 (higher is better)
@ 1920x1200
480 SLI -- 54.5:33.4 --> 1.63
5870 CF -- 46.8:25.0 --> 1.87 (higher is better)
This indicates the CF technology scales better than SLI, even if the brute performance of the nVidia solution comes out on top. This is diametrically opposed to your conclusion on page 9 ("Even at lower resolutions SLI seems to be scaling better than CF").
(Scaling ability is a comparison of ratios, not a comparison of FPS)
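The ratio comparison above can be written out explicitly. A small Python illustration, using the minimum frame rates quoted in the comment (multi-GPU FPS divided by single-GPU FPS gives the scaling factor):

```python
# Multi-GPU "scaling" is the ratio of the multi-GPU frame rate to the
# single-GPU frame rate, not a raw FPS comparison. The figures below are
# the minimum frame rates quoted in the comment above.
def scaling(multi_gpu_fps, single_gpu_fps):
    return multi_gpu_fps / single_gpu_fps

# 1680x1050 minimums
assert round(scaling(60.2, 40.7), 2) == 1.48   # GTX 480 SLI
assert round(scaling(53.8, 30.5), 2) == 1.76   # 5870 CF
# 1920x1200 minimums
assert round(scaling(54.5, 33.4), 2) == 1.63   # GTX 480 SLI
assert round(scaling(46.8, 25.0), 2) == 1.87   # 5870 CF
```

By this metric a setup can scale better (higher ratio) while still delivering fewer absolute frames per second, which is exactly the distinction being argued here.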
Ryan Smith - Monday, March 29, 2010 - link
You're looking at the minimums, not the averages.
Hsuku - Tuesday, March 30, 2010 - link
My apologies, I was looking at the wrong graphs. However, even so, your assertion is still incorrect: at the lowest listed resolution, CF and SLI scaling are tied.
Ryan Smith - Wednesday, March 31, 2010 - link
Correct. The only thing I really have to say about that is that while we include 1680 for reference's sake, for any review of a high-end video card I'm looking nearly exclusively at 1920 and 2560.
Hrel - Thursday, September 2, 2010 - link
I get that, to test the card. But if you don't have a monitor that goes that high, it really doesn't matter. I'd really like to see 1080p thrown in there. 1920x1080; as that's the only resolution that matters to me and most everyone else in the US.
Vinas - Sunday, March 28, 2010 - link
It's pretty obvious that AnandTech was spanked by NVIDIA the last time they did a review. No mention of the 5970 being superior to the 480 is a little disturbing. I guess the days of "trusting AnandTech" are over. Come on guys, not even a mention of how easily the 5870 overclocks? The choice is still clear, dual 5870's with full cover blocks FTW!