Overclocking Comparison
With overclocking, every GPU is different, so a side effect of this testing is that we get a good general idea of how the 6600GT overclocks. We can't really say that all Albatron cards will overclock by 90MHz; believe us, if every one of them could run at that speed, they would all ship at that speed and out-sell the competition. Too many factors go into it. That's why we base most of our recommendation and ranking decisions on cooling and noise levels rather than overclocking, though it remains a factor. Those with a calculator handy will notice that the mean, median, and standard deviation are:
Mean: 568.10MHz
Median: 571MHz
Std. Dev.: 17.1840MHz
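For those curious how those three figures come together, here's a minimal sketch; the clock speeds in the list are hypothetical placeholders, not our per-card results:

import statistics

# Hypothetical maximum stable core clocks (MHz), one per card -- placeholders,
# not the actual results from this roundup.
max_core_clocks = [535, 550, 560, 570, 575, 585, 590]

print(f"Mean:      {statistics.mean(max_core_clocks):.2f}")
print(f"Median:    {statistics.median(max_core_clocks)}")
print(f"Std. Dev.: {statistics.stdev(max_core_clocks):.4f}")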
Knowing NVIDIA, QA will ensure that chips leaving the fab run a little higher than stock clock speeds, which translates into a bit of breathing room. What we take away from this testing is that we expect GeForce 6600GTs to achieve at least a 7% overclock. A 9% to 12% overclock should be possible for most people who own this card; anything beyond that is icing on the cake. Of course, we are working with a very small sample size, and we don't know much about the population as a whole either. We would have been more comfortable making predictions had this data looked more like a bell curve, but what we see is a little too flat for us to say anything with real statistical confidence.
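To put those percentages in concrete terms, here's a quick back-of-the-envelope sketch, assuming the 6600GT's 500MHz reference core clock:

STOCK_CORE_MHZ = 500  # GeForce 6600GT reference core clock

def overclock_percent(achieved_mhz, stock_mhz=STOCK_CORE_MHZ):
    # Overclock expressed as a percentage above the stock clock.
    return (achieved_mhz / stock_mhz - 1) * 100

for achieved in (535, 545, 560, 568.10):
    print(f"{achieved} MHz -> {overclock_percent(achieved):.1f}% over stock")
# 535MHz is a 7% overclock, 545MHz is 9%, 560MHz is 12%,
# and the 568.10MHz mean above works out to roughly 13.6%.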
Our memory clock speed graph shows Sparkle on top, but that's 2ns RAM on a 110MHz overclock; the XFX card's 1.6ns RAM is running only a 10MHz overclock. Sparkle may simply have gotten lucky here, and this isn't likely to happen on most boards. A 22% memory overclock, even with the added features of GDDR3, is still tough to pull off, especially when the 1.6ns memory only matched its performance. Inno3D also uses 1.6ns memory, but our final overclock ended up lower than the 600MHz that should have been possible with this part.
All the other cards use 2ns memory, which overclocked between 50 and 100MHz. Every memory chip we looked at on these 6600GT boards was a Samsung GDDR3 part.
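As a rough sketch of the arithmetic behind those memory numbers, assuming the 6600GT's 500MHz (1000MHz effective) reference memory clock and the usual rule of thumb that a chip rated at t ns is specified for roughly 1000/t MHz:

STOCK_MEM_MHZ = 500  # 6600GT reference memory clock (1000MHz effective)

def rated_mhz(ns_rating):
    # A DRAM chip rated at t ns is specified for roughly 1000 / t MHz.
    return 1000 / ns_rating

print(f"2.0ns parts: rated for ~{rated_mhz(2.0):.0f}MHz")   # ~500MHz
print(f"1.6ns parts: rated for ~{rated_mhz(1.6):.0f}MHz")   # ~625MHz

sparkle_mhz = STOCK_MEM_MHZ + 110  # Sparkle's 110MHz memory overclock
print(f"Sparkle: {sparkle_mhz}MHz, "
      f"{(sparkle_mhz / STOCK_MEM_MHZ - 1) * 100:.0f}% over stock")  # 22%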
84 Comments
Pete - Friday, December 10, 2004 - link
Obviously Derek OCed himself to get this article out, and he's beginning to show error. Better bump your (alarm) clocks down 10MHz (an hour) or so, Derek.
pio!pio! - Friday, December 10, 2004 - link
Noticed a typo. At one point you wrote 'clock stock speed' instead of 'stock clock speed'; easy mistake.
Pete - Friday, December 10, 2004 - link
Another reason to narrow the distance between the mic and the noise source is that some of these cards may go into SFFs, or cases that sit on the desk. 12" may well be more indicative of the noise level those users would experience.
Pete - Friday, December 10, 2004 - link
Great article, Derek!
As usual, I keep my praise concise and my constructive criticism elaborate (although I could argue that the fact that I keep coming back is rather elaborate praise :)). I think you made the same mistake I made when discussing dB and perceived noise, confusing power with loudness. From the following two sources, I see that a 3dB increase equates to 2x more power, but is only 1.23x as loud. A 10dB increase corresponds to 10x more power and a doubling of loudness. So apparently the loudest HSFs in this roundup are "merely" twice as loud as the quietest.
http://www.gcaudio.com/resources/howtos/voltagelou...
http://www.silentpcreview.com/article121-page1.htm...
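Those rules of thumb work out as in the sketch below, assuming the usual approximations that power scales as 10^(dB/10) and that perceived loudness roughly doubles for every 10dB:

# Power ratio and approximate perceived-loudness ratio for a given dB increase,
# using power = 10^(dB/10) and loudness ~= 2^(dB/10) (a doubling per 10dB).
for delta_db in (3, 10):
    power_ratio = 10 ** (delta_db / 10)
    loudness_ratio = 2 ** (delta_db / 10)
    print(f"+{delta_db}dB: {power_ratio:.2f}x the power, ~{loudness_ratio:.2f}x as loud")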
Speaking of measurements, do you think 1m is a bit too far away, perhaps affording less precision than, say, 12"?
You might also consider changing the test system to a fanless PSU (Antec and others make them), with a Zalman Reserator cooling the CPU and placed at as great a distance from the mic as possible. I'd also suggest simply laying the test system out on a piece of (sound-dampening) foam, rather than fitting it in a case (with potential heat trapping and resonance). The HD should also be as quiet as possible (2.5"?).
I still think you should buy these cards yourselves, a la Consumer Reports, if you want true samples (and independence). Surely AT can afford it, and you could always resell them in FS/FT for not much of a loss.
Anyway, again, cheers for an interesting article.
redavnI - Thursday, December 9, 2004 - link
Very nice article, but any chance we could get a part 2 with any replacement cards the manufacturers send? I'd also like to see the Pine card reviewed. It's being advertised as the AnandTech Deal at the top of this article and has dual DVI like the XFX card. Kind of odd that one of the only cards not reviewed gets a big fat "buy me" link.
To me, it seems that with the 6600GT/6800 series, NVIDIA has its best offering since the GeForce4 Ti's... I'm sure I'm not the only one still hanging on to my Ti4600.
Filibuster - Thursday, December 9, 2004 - link
Something I've just realized: the Gigabyte NX66T256D is not a GT, yet it supports SLI. Are they using a GT that can't run at the faster speeds and selling it as a standard 6600? It has 256MB.
We ordered two from a vendor who said it definitely does SLI.
http://www.giga-byte.com/VGA/Products/Products_GV-...
Can you guys find out for sure?
TrogdorJW - Thursday, December 9, 2004 - link
Derek, the "enlarged images" all seem to be missing, or else the links are somehow broken. I tested with Firefox and IE6 and neither one would resolve the image links.Other than that, *wow* - who knew HSFs could be such an issue? I'm quite surprised that they are only secured at two corners. Would it really have been that difficult to use four mount points? The long-term prospects for these cards are not looking too good.
CrystalBay - Thursday, December 9, 2004 - link
Great job on the quality control inspections of these cards, D.W. Hopefully IHVs take notice and resolve these potentially damaging problems.
LoneWolf15 - Thursday, December 9, 2004 - link
I didn't see a single card in this review that didn't have a really cheesy-looking fan... the type that might last a couple of years if you're really lucky, but might last six months on some cards if you're not. The GeForce 6600GT is a decent card; for $175-250 (depending on PCIe or AGP), you'd think vendors would use a fan deserving of the price. My PNY 6800NU came with a squirrel-cage fan and a super-heavy heatsink that I know will last. Hopefully, Arctic Cooling will come out with an NV Silencer for the 6600 family soon; I wouldn't trust any of the fans I saw here to last.
Filibuster - Thursday, December 9, 2004 - link
What quality settings were used in the games? I am assuming that Doom 3 is on medium since these are 128MB cards.
I've read that there are some 6600GT 256MB cards coming out (Gigabyte GV-NX66T256D and MSI 6600GT-256E, maybe more). Please show us some tests with the 256MB models once they hit the streets (or if you know that they definitely are not coming, please tell us that too).
Even though the cards only have a 128-bit bus, wouldn't the extra RAM help out in places like Doom 3, where texture quality is a matter of RAM quantity? The local video RAM still has to be faster than fetching from system RAM.