Vendor Cards: EVGA e-GeForce 7800 GTX
by Derek Wilson & Josh Venning on July 16, 2005 12:05 AM EST - Posted in GPUs
Heat, Power and Noise
Heat
NVIDIA provides a handy tool that measures the temperature of the card in real time, and we used it to determine the heat generated in each of the different states. Again, we used the looped Battlefield 2 demos to test the power and temperature loads. As you can see in the graph, there's no difference in the temperature of the reference card and the EVGA e-GeForce 7800 GTX in normal mode, and it only went up by one degree when we overclocked it. For reference, the idle temperature we measured was 46 degrees Celsius.
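For readers who want to log temperatures themselves, a minimal sketch of the idea is below. It assumes a single-GPU system with NVIDIA's nvidia-smi command-line utility on the PATH (a modern stand-in, not the driver monitor used for this article) and a benchmark looping in the background.

```python
import subprocess
import time

def log_gpu_temp(samples=60, interval=1.0):
    """Poll the GPU core temperature once per interval and return the readings.

    Assumes a single-GPU system with nvidia-smi available; this is a modern
    stand-in for the driver's built-in temperature monitor, not the same tool.
    """
    temps = []
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=temperature.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True)
        temps.append(int(out.stdout.strip().splitlines()[0]))
        time.sleep(interval)
    return temps

if __name__ == "__main__":
    # Run this while the benchmark loops to capture the load temperature,
    # and again with the system idle to get the baseline.
    readings = log_gpu_temp(samples=10, interval=1.0)
    print(f"min {min(readings)} C, max {max(readings)} C")
```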
Power
To test the power usage of the card, we measured the total wattage of the computer at the wall outlet. We documented the power usage of our system in four different states. The first state was with the computer idle and only Windows running (i.e. no other programs). For the second, we plugged in our reference card and ran a performance test. The last two states used the EVGA e-GeForce 7800 GTX, running the test in normal and then overclocked modes. This way, we were able to get a general idea of how power usage varies with the cards, and how significant an impact it might have.

The power load while the system was idle was 141 watts, so you can see that both of the cards have a significant power draw. However, you can't determine the power load of a card precisely by simply subtracting the idle wattage from the load wattage: a faster GPU makes the CPU work harder to feed it data, which means more memory accesses and an overall increase in power consumption for all components, not just the GPU.
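As a rough illustration of how the wall-outlet numbers get interpreted, here is a sketch; only the 141 W idle figure comes from the text above, and the load numbers are placeholders standing in for the values in the graph.

```python
# Rough bookkeeping for the four wall-outlet measurements. Only the 141 W
# idle figure comes from the article text; the load numbers are placeholders.
readings_watts = {
    "idle (Windows only)":             141,
    "reference 7800 GTX, BF2 loop":    295,  # placeholder
    "e-GeForce 7800 GTX, BF2 loop":    298,  # placeholder
    "e-GeForce 7800 GTX, overclocked": 310,  # placeholder
}

idle = readings_watts["idle (Windows only)"]
for state, watts in readings_watts.items():
    # The delta over idle is the extra draw of the whole system, not the
    # GPU alone: a faster GPU also makes the CPU and memory work harder.
    print(f"{state}: {watts} W (+{watts - idle} W over idle)")
```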
Noise
We didn't notice any issues with the sound level of the card. It wasn't quiet, but it wasn't as loud as some of the dust-buster solutions that we've seen in the past. The CPU fan was the dominant noise in the room, so we calculated the noise level of the card using a formula that we've employed in the past, along with measurements of ambient SPL (everything off), ambient+CPU SPL (computer on with no video), and total system SPL (everything on). We had to do this because we were unable to run the graphics card's fan without turning on the rest of the system.

The end result gave us an SPL of 38.4 dB, which we will compare to other cards in future articles. This number should be on par with other 7800 cards that use the reference-style heat sink; if nothing else, we will be able to get an idea of how much noise varies for this type of heat sink.
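The formula itself isn't spelled out above, but the usual way to isolate one source from SPL readings is to subtract the background in the power domain rather than directly in decibels; a sketch, assuming that is the method in question (the readings below are made up, not our measurements):

```python
import math

def isolate_source_spl(total_db, background_db):
    """Estimate one source's SPL by removing the background contribution.

    Incoherent sources add in the power domain, so the subtraction is done
    on 10**(dB/10) terms rather than on the decibel values directly.
    """
    if total_db <= background_db:
        raise ValueError("total SPL must exceed the background SPL")
    return 10 * math.log10(10 ** (total_db / 10) - 10 ** (background_db / 10))

# Made-up example: total system SPL vs. the ambient+CPU measurement.
print(f"{isolate_source_spl(total_db=41.0, background_db=38.0):.1f} dB")
```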
Those conscious of their computer's power usage may be reluctant to overclock the card because of the jump in power load, and if your case has poor airflow or you live somewhere really hot, overclocking may not be advisable. But for most people, the extra heat and wattage shouldn't have much of an impact given the performance boost.
26 Comments
Fluppeteer - Friday, July 22, 2005 - link
Isn't that an old ATi card?

I don't know about a 7800Ultra, but it looks like the Quadro 4500 (based on the 7800) might be on for a SIGGRAPH launch. Since the 4400's a 512MB card, I doubt the 4500 will be a 256MB one. And hopefully *that* will bode well for a 512MB consumer card.
Fingers crossed.
Mind you, if the 4500 is a 7800GTX-based card (as the 3400 is a 6800GT-based card), perhaps there'll be a 5500 (7800Ultra-based) in the manner of the 4400. By which point, presumably, people will have stopped selling GeForce 5500 cards, or it's going to get confusing (other than a factor of a hundred in the price).
araczynski - Thursday, July 21, 2005 - link
i think i'll wait for the 8800xyz

Fluppeteer - Wednesday, July 20, 2005 - link
I understand the eVGA 7800GTX card (unusually) has a dual-link DVI connection. Since this was a feature which seemed to cause a lot of confusion among 6800-series card manufacturers, I just wondered if the reviewers (or anyone else) had the chance to test it? If it *is* dual link, is an external TMDS transmitter used? What's the quality of nVidia's TMDS transmitter implementation this time round (reports on the 6800 series were critical)?

The 7800GTX is probably the best card out there for trying to render at the resolutions supported by an IBM T221 or Apple's 30" cinema display - although I'm inclined to wait for a 512MB version for my T221; it would be good to know whether it's capable of driving one.
Cheers
smn198 - Tuesday, July 19, 2005 - link
A suggestion regarding measuring the card's noise output and the way you measured the sound:
"We had to do this because we were unable to turn on the graphics card's fan without turning on the system."
Would it be possible to try to measure the voltages going to the fan when the card is idle and under full load? Then supply the fan with these voltages when the system is off, using a different power supply such as a battery (which is silent) and a variable resistor.

It would also be interesting to see a graph of how the noise increases when going from idle to full load over 10 minutes (or however long it takes to reach the maximum speed) on cards which have variable-speed fans. Instead of trying to measure the noise with the system on, again measure the voltage over time, then use your battery, variable resistor and voltmeter to recreate those voltages, and combine the resulting noise readings with the voltage/time data to produce noise/time data.
Thanks
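A rough sketch of the bookkeeping that suggestion implies: pair a logged voltage-vs-time trace with a separately measured voltage-to-SPL calibration, then interpolate. All numbers below are invented for illustration.

```python
# Illustrative only: a fan-voltage trace logged with the system running,
# and a voltage-to-SPL calibration measured with the fan driven from a
# battery and variable resistor while the system is off.
voltage_over_time = [(0, 7.0), (120, 8.5), (300, 10.5), (600, 12.0)]  # (seconds, volts)
spl_by_voltage = [(7.0, 32.0), (9.0, 36.0), (12.0, 41.0)]             # (volts, dB)

def interpolate(x, points):
    """Linear interpolation over sorted (x, y) pairs, clamped at the ends."""
    if x <= points[0][0]:
        return points[0][1]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        if x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return points[-1][1]

# Combine the two data sets into noise-over-time.
for t, volts in voltage_over_time:
    print(f"{t:4d} s: {interpolate(volts, spl_by_voltage):.1f} dB")
```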
PrinceGaz - Sunday, July 17, 2005 - link
Just look at the original review to see how the 7800GTX compares with older cards; they looked at a lot more games and with a wider range of settings.

This series of articles is a comparison of 7800GTX cards and is meant to focus on the differences between them. We all know a 7800GTX is faster than a 6800 Ultra, so there is no point including that on the graphs.
Zak - Sunday, July 17, 2005 - link
I agree, without any comparison to older cards this is pretty useless. Z.

PrinceGaz - Sunday, July 17, 2005 - link
Any word from Derek or Josh(?) as to why AA was not enabled in the tests? It would certainly be a lot more meaningful than the current set of results at resolutions as high as 2048x1536 without AA, where the lowest average framerate in any game is over 70fps. The argument about 4fps more being worthwhile because it is an extra 240 frames per minute is one of the daftest things I've read in a gfx card review.

Unless you include minimum framerates and ideally a framerate graph like [H] do, and comment on playability at different resolutions and AA settings, remarks like an overclocked card getting 76fps being worthwhile over the non-overclocked one only managing 72fps are ludicrous. I bet you couldn't even tell the two apart in a test where you weren't told which was which. Turn on 4x AA and let's see how they stand up. It may come down more to memory bandwidth, but that's okay. I'm sure some manufacturer will use Samsung's 1.4ns (1400MHz) chips, or at least their 1.5ns (1333MHz) chips sooner or later, assuming the core and circuit board are up to handling those speeds.
stephenbrooks - Sunday, July 17, 2005 - link
On page 5 (Heat, Power and Noise) it says under the first graph:

-----
As you can see in the graph, there's no difference in the temperature of the reference card and the EVGA e-GeForce 7800 GTX in normal mode, and it only went up by one degree when we overclocked it.
-----
...that's not quite right, as in fact both the e-GeForces were at 81C (overclocked or not) whereas the reference card was at 80C.
overclockingoodness - Sunday, July 17, 2005 - link
#9 and #17: If you want to see the numbers, maybe you should go read the original 7800GTX review. These are just the vendor-card articles and they are comparing the vendors' performance, which is always only going to differ by a couple of frames here and there. It's useless to include 6800 and ATI cards in there.

z0mb1e - Sunday, July 17, 2005 - link
I agree with #9, it would be nice if it had some numbers from the 6800 and an ATI card