Overclocking Extravaganza: GTX 275's Complex Characteristics
by Derek Wilson on June 4, 2009 5:35 AM EST - Posted in GPUs
Core Overclocking
After G80 hit (the first NVIDIA GPU to employ a separate clock domain for shaders), any core clock increase silently raised the shader clock along with it. At first this made sense: NVIDIA only exposed the ability to adjust core and memory clocks, and the shader clock was not directly adjustable by the end user. Of course, we went to some trouble back then trying our hand at BIOS flashing for shader overclocking. Even after NVIDIA finally exposed a separate shader adjustment, core and shader clocks remained tied to some degree.
Since the middle of last year, NVIDIA's driver-based clock speed adjustments have been "unlinked," meaning that the shader clock is no longer affected by the core clock as it used to be. This certainly makes things a lot easier for us, and we'll start by testing out core clock speed adjustment.
The maximum core clock we could hit on our reference GTX 275 was 702. Try as we might, we just could not get it stable beyond that speed. But it's still a good enough overclock for us to get a good handle on scaling. We know some people have GTX 275 parts that will get up toward 750 MHz, so it is possible to get more speed out of this. Still, we have an 11% increase in core clock speed which should net us some decent results.
[Performance graphs at 1680x1050, 1920x1200, and 2560x1600]
Call of Duty edges up toward the theoretical maximum but drops off at 2560x1600, which is much more resource intensive. Interestingly, most of the other games see more benefit at the highest resolution we test, hitting over 5% there, but generally gain between 2 and 5 percent at lower resolutions. Far Cry 2 and Fallout 3 seem not to benefit as much from core overclocking as our other tests.
It could be that we aren't seeing numbers closer to the theoretical maximum because there is a bottleneck either in memory or in the shader hardware. This makes analysis a little more complex than with the AMD part, as there are more interrelated factors. Some aspects of a game could be accelerated, but if a significant amount of work is going on elsewhere, we'll still be waiting on one of the other subsystems.
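The analysis above boils down to comparing each game's observed frame-rate gain against the theoretical maximum implied by the clock increase. A minimal sketch of that arithmetic, using the GTX 275's 633MHz stock and 702MHz overclocked core speeds from this test (the fps figures are hypothetical, for illustration only):

```python
# Sketch: compare observed fps scaling to the theoretical maximum
# implied by a core overclock. The fps numbers are hypothetical.

def percent_gain(stock, overclocked):
    """Percent improvement of the overclocked value over stock."""
    return (overclocked - stock) / stock * 100.0

core_stock, core_oc = 633.0, 702.0                 # GTX 275 core clocks (MHz)
theoretical = percent_gain(core_stock, core_oc)    # ~10.9%

# Hypothetical benchmark results at one resolution (fps)
fps_stock, fps_oc = 60.0, 62.5
observed = percent_gain(fps_stock, fps_oc)         # ~4.2%

# An observed gain well below the theoretical one suggests the game is
# bottlenecked elsewhere (memory bandwidth or shader throughput).
efficiency = observed / theoretical
print(f"theoretical {theoretical:.1f}%, observed {observed:.1f}%, "
      f"scaling efficiency {efficiency:.2f}")
```

A scaling efficiency near 1.0 would mean performance is almost entirely core-limited; the 2-5% gains seen in most of these games put them well below that.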
Let's move on to the last independent aspect of the chip and then bring it all together.
43 Comments
balancedthinking - Thursday, June 4, 2009 - link
Actually, you can save as much as 40W(!) at idle when you underclock a 4890/4870. The important part is to underclock the GDDR5, which automatically adjusts voltage too. http://www.computerbase.de/news/hardware/grafikkar...
7Enigma - Friday, June 5, 2009 - link
Any way you could translate the 4870 portion? Babelfish is not working for me for some reason....
jeramhyde - Thursday, June 4, 2009 - link
great article thanks :-) good work with the graphs and setting it out in an easy to follow way :-)
my housemate just got a gtx275 today, so we'll be trying those overclocks tonight :-)
nvalhalla - Thursday, June 4, 2009 - link
Alright, I just spent the last 10 minutes looking for a 900 shader 4980 before I realized you meant a 900MHz 4890. It's wrong in every graph.
DerekWilson - Thursday, June 4, 2009 - link
It's not wrong, it's just noted in a different way. Like I say (702 core) for the GTX 275 with a 702MHz core clock, I say (900 core) for a 4890 with a 900MHz core clock.
I'm sorry for the confusion, but the graphs were already so crowded that I wanted to save as much space as I could.
nvalhalla - Thursday, June 4, 2009 - link
no, you're not getting me. It's listed as a 4980, not a 4890. I thought it was a new card, the "NEXT TIER" if you will. The 900 I thought might be a reference to the number of SPs. Once I realized you just transposed the numbers, I got that the 900 was MHz.
DerekWilson - Thursday, June 4, 2009 - link
oooooooooooohhhhhhhhh ... okay ... that's a fun typo. I can't really do search and replace on these graphs either. I'll try and get it cleaned up as soon as I can. Thanks for clarifying.
walp - Thursday, June 4, 2009 - link
Very nice article as always! :)
The GTX275 and 4890 are really evenly matched in every way (price, performance, overclocking performance, etc.) except for the fact that the 4890 can be used with the $19 Accelero S1:
http://translate.google.com/translate?prev=hp&...
, which makes it totally quiet and cool. Just watch those VRM-temperatures and you will be just fine!
This is the main reason why I chose the 4890 over the GTX275, along with the fact that I had a CF-compatible motherboard.
By the way, why didn't you include the 4890 (1/1.2GHz) idle power draw? Or is it just a little typo? :)
Request: GTX275 SLI vs 4890 CF (And overclocking bonanza! :)))))
\walp
SiliconDoc - Monday, June 22, 2009 - link
I can't help but use google and check the very first site that comes up: http://www.techspot.com/review/164-radeon-4890-vs-...
--
and the GTX275 beats the TAR out of the 4890 !!!
--
I guess derek has once again used some biased bench or the special from the manufacturer 4890, and then downclocked the GTX275 to boot.
WHAT A CROCK !
SiliconDoc - Saturday, June 6, 2009 - link
It's hilarious - the extravaganza overclock they do here for Nvidia can't even match the stock timings of a common EVGA. http://www.newegg.com/Product/Product.aspx?Item=N8...
-
core 713 derek weakness red rooster nvidia hate 703
So what AT has done here is put the 4890 max overclocked in their other article against a low end stock gtx275, then in their extravaganza overclock gtx275 they put up a pathetic overclock that is BEATEN BY An EVGA gtx275 as it arrives !
Worse yet, they jammed their ati 4890 maxxxxx overclocks in for comparison !
------------
How to be a completely biased load of crap by Derek and AT :
max overclock your ati 4890 and put stock gtx275's against it
weak overclock your gtx275 and put maxx overclock 4890's against it
___
ROFLMAO - I HAVE TO LAUGH IT'S SO BLATANT AND PATHETIC.
---
cocka doodle doooo ! cocka doodle dooo ! red rooster central.