AMD 780G: Preview of the Best Current IGP Solution
by Gary Key on March 10, 2008 12:00 PM EST - Posted in CPUs
Testing Notes
Our image quality tests today consist of still screenshots from several movies in our collection; the primary selection criteria were movies that offered decent bitrates and were screen-capture friendly under DRM-infested Vista. While we tried to capture the exact moment within a frame to compare the 780G to the G35/GeForce 8200, this was not always possible. We did our best to capture the screenshots on the same frame, but at times the capture process grabbed the beginning of the frame or the trailing end.
In the end, we feel the process is close enough to give you an indication of any differences between the images. Obviously, we cannot show the reference image, but we will comment on which screenshot best represented the reference image on our home theater system. Measuring video output quality in this manner is a subjective process to some degree; at times we preferred an image that did not match the reference screenshot over the one that did. With that in mind, we brought in eight people to give us their opinions during playback sequences and then still-shot reviews.
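For readers who want a rough objective cross-check to go along with the subjective comparisons, the sketch below computes PSNR between two captured frames. This is not part of our test methodology; it is a minimal example that assumes Pillow and NumPy are installed, and the file names are placeholders for your own captures.

```python
# Minimal sketch (not part of our methodology): compute PSNR between two
# captured frames to get a rough objective similarity number.
# Assumes Pillow and NumPy; the file names below are placeholders.
import numpy as np
from PIL import Image

def psnr(path_a: str, path_b: str) -> float:
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
    if a.shape != b.shape:
        raise ValueError("screenshots must have identical dimensions")
    mse = np.mean((a - b) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10 * np.log10((255.0 ** 2) / mse)

print(f"PSNR: {psnr('780g_frame.png', 'g35_frame.png'):.2f} dB")
```

Keep in mind that PSNR only measures similarity to whichever frame you treat as the reference; it says nothing about which image actually looks better.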
We utilized PowerDVD 7.3 (build 3610 utilized, with build 3730 now in testing) for playback with all settings on auto, except we enabled hardware acceleration within the application. We measured CPU utilization with the Vista Task Manager and bitrates with the PowerDVD Info application. We took readings every five minutes and averaged the results at the end of the movie. Movies were played back in full-screen mode at their native resolutions with the desktop set to 1920x1200. We will show discrete card results in our next article; however, the HD 3450 provided results that were just a few percent better than the HD 3200 with the released 8.3 driver set.
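For those curious how the averaging works, here is a minimal sketch of the sampling approach: take a utilization reading at a fixed interval for the length of the movie and average the samples at the end. We took our readings from the Vista Task Manager by hand; the script below is an illustrative automation that assumes the psutil package, and the interval and duration values are examples rather than our exact settings.

```python
# Illustrative sketch of sampling CPU utilization at a fixed interval and
# averaging the readings over a playback run. Assumes the psutil package.
import psutil

def average_cpu_utilization(duration_s: int, interval_s: int) -> float:
    samples = []
    elapsed = 0
    while elapsed < duration_s:
        # cpu_percent() blocks for interval_s seconds and returns the
        # system-wide utilization over that window.
        samples.append(psutil.cpu_percent(interval=interval_s))
        elapsed += interval_s
    return sum(samples) / len(samples)

if __name__ == "__main__":
    # Example: a two-hour movie sampled every five minutes.
    print(f"Average CPU utilization: {average_cpu_utilization(7200, 300):.1f}%")
```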
Audio settings selected were Dolby Digital 5.1, DTS 5.1, or two-channel LPCM where applicable. Unlike the 780G, the G35 and GeForce 8200 fully support multi-channel LPCM output, and we will comment on results with either LPCM 5.1 or Dolby TrueHD decoded streams (via PowerDVD) where applicable.
We calibrated both systems on our Gateway XHD3000 monitor to ensure the color palettes were set equally. Fine-tuning the control panel settings for each of the test setups can and will improve picture quality, but we will leave those preferences up to the user. We played back the movie sequences on the Gateway monitor for initial viewing and judging, then output the same sequences to our calibrated Samsung 61" DLP set (HL-T6187S) using the same control panel and PowerDVD settings to view the images in a home theater setting.
We then utilized our test setups to pass the image and audio through via HDMI to our Pioneer Elite VSX-94THX A/V receiver. Using the receiver as a "repeater", we output the native signal (our preferred method with PowerDVD) to our home theater setup (7.1 audio, Samsung 61" DLP) for our test subjects, and we will list their preferences in our comments. We will also warn you in advance that the original images are generally 2MB to 3MB in size if you decide to download them.
MPEG-2 Video Quality - SWORDFISH
We are utilizing the movie SWORDFISH from Warner Brothers; what else can we say, we still like seeing Halle Berry in the lounge chair. Bitrate readings for this movie averaged between 5.4 Mb/s and 8.1 Mb/s. In our particular test scene, John Travolta is drinking a cup of coffee in a close-up shot that highlights skin tones along with reflections off the cup and background objects.
[Screenshots: 780G | G35 | GeForce 8200 – click to enlarge]
The differences in the images are minor, but the G35 appears to have slightly deeper colors along with a slight edge in sharpness, while the GeForce 8200 offers additional contrast and a better background image, though it is darker overall. However, the 780G image was the most faithful to the reference image during playback tests. Our test audience cast 4 votes for the 780G, 2 for the G35, and 2 for the GeForce 8200.
CPU utilization during playback favors the 780G and GeForce 8200 by an average of 4%. This surprised us, as the G35 does not offer full hardware decode capabilities for MPEG-2 playback and we expected a larger gap. However, we never had a problem with playback in this title or in others such as Cars and Spiderman 3.
VC-1 Video Quality - SWORDFISH
SWORDFISH is not a very demanding movie in 1080p playback, but the screenshots should give a good indication of the improvement when going from MPEG-2 to VC-1. Bitrate readings for this movie averaged between 15.7 Mb/s and 25.2 Mb/s. Once again, we utilize the coffee shop test scene.
[Screenshots: 780G | G35 | GeForce 8200 – click to enlarge]
The differences in the images are once again minor, but this time the 780G appears to have better skin tones, while sharpness and the overall color palette seem to favor the G35. We thought the GeForce 8200 image was slightly flat when comparing facial details, though it had the best background detail. The 780G image was more faithful to the reference image during playback tests. The 780G garnered 3 votes, the G35 3, and the GeForce 8200 2.
CPU utilization during playback favors the GeForce 8200 and 780G by several percent, even though the G35 offers hardware accelerated decoding of the VC-1 format. There were no judder or stuttering problems on any of the platforms during playback.
49 Comments
- Monday, March 10, 2008 - link
Where is the discussion of this chipset as an HTPC? Just a tidbit here and there? I thought that was a major selling point here. With a single-core Sempron 1.8GHz being enough for an HTPC which NEVER hits 100% CPU usage (see tomshardware.com), you don't need a dual core and can probably hit 60W in your HTPC! Maybe less. Why was this not a major topic in this article? With you claiming the E8300/E8200 in your last article being an HTPC dreamer's chip, shouldn't you be talking about how low you could go with a Sempron 1.8GHz? Isn't that the best HTPC combo out there now? No heat, super low cost running it all year long, etc. (NOISELESS with a proper heatsink).

Are we still supposed to believe your article about the E8500? While I freely admit chomping at the bit to buy an E8500 to overclock the crap out of it (I'm pretty happy now with my E4300 @ 3.0 and can't wait for 3.6GHz with the E8500, though it will probably go further; who needs more than 3.6 today for gaming), it's a piece of junk for an HTPC. Overly expensive ($220? for the E8300 that was recommended) compared to a lowly Sempron 1.8 which I can pick up for $34 at Newegg. With that kind of savings I can throw an 8800GT in my main PC as a bonus for avoiding Intel.

What's the point in having an HTPC where the CPU utilization is only 25%? That's complete OVERKILL. I want that as close to 100% as possible to save me money on the chip and then on savings all year long with low watts. With $200 savings on a CPU I can throw in an Audigy if needed for special audio applications (since you whined about the 780G's audio). A 7.1-channel Audigy with HD can be had for $33 at Newegg. For an article totally about "MULTIMEDIA OUTPUT QUALITIES", where's the major HTPC slant?
sprockkets - Thursday, March 13, 2008 - link
Dude, buy a 2.2GHz Athlon X2 chip for like $55. You save what, around $20 or less with a Sempron nowadays?

QuickComment - Tuesday, March 11, 2008 - link
It's not 'whining' about the audio. Sticking in a sound card from Creative still won't give 7.1 sound over HDMI. That's important for those that have an HDMI amp in a home theatre setup.

TheJian - Tuesday, March 11, 2008 - link
That amp doesn't also support digital audio/optical? Are we just talking about trying to do the job within 1 cable here instead of 2? Isn't that kind of nitpicky? Giving up video quality to keep it on 1 cable is unacceptable to me (hence I'd never "lean" towards the G35 as suggested in the article). I can't even watch if the video sucks.

QuickComment2 - Tuesday, March 11, 2008 - link
No, it's not about 1 cable instead of 2. S/PDIF is fine for Dolby Digital and the like, i.e. compressed audio, but not for 7.1 uncompressed audio. For that, you need HDMI. So this is a real deal-breaker for those serious about audio.

JarredWalton - Monday, March 10, 2008 - link
I don't know about others, but I find video encoding is something I do on a regular basis with my HTPC. There's no sense storing a full-quality 1080i HDTV broadcast using 16GB of storage for two hours when a high quality DivX or H.264 encode can reduce disk usage down to 4GB, not to mention ripping out all the commercials. Or you can take the 3.5GB per hour Windows Media Center encoding and turn that into 700MB per hour (see the bitrate sketch after this comment).

I've done exactly that type of video encoding on a 1.8GHz Sempron; it's PAINFUL! If you're willing to just spend a lot of money on HDD storage, sure, it can be done. Long-term, I'm happier making a permanent "copy" of any shows I want to keep.
The reality is that I don't think many people are buying HTPCs when they can't afford more than a $40 CPU. HTPCs are something most people build as an extra PC to play around with. $50 (only $10 more) gets you twice the CPU performance, just in case you need it. If you can afford a reasonable HTPC case and power supply, I dare say spending $100-$200 on the CPU is a trivial concern.
Single-core older systems are still fine if you have one, but if you're building a new PC you should grab a dual-core CPU, regardless of how you plan to use the system. That's my two cents.
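[Editor's note: the storage figures quoted above work out to the following approximate average bitrates. This quick sketch only does the arithmetic; sizes are treated as decimal gigabytes and the labels mirror the examples in the comment.]

```python
# Back-of-the-envelope bitrate math for the storage figures quoted above.
# Sizes are treated as decimal gigabytes; results are approximate.
def avg_bitrate_mbps(size_gb: float, hours: float) -> float:
    # gigabytes -> megabits, divided by the duration in seconds
    return (size_gb * 8 * 1000) / (hours * 3600)

recordings = {
    "1080i HDTV broadcast (16 GB / 2 h)": (16.0, 2.0),
    "DivX/H.264 re-encode (4 GB / 2 h)": (4.0, 2.0),
    "Media Center recording (3.5 GB / 1 h)": (3.5, 1.0),
    "700 MB/hour re-encode (0.7 GB / 1 h)": (0.7, 1.0),
}

for label, (size_gb, hours) in recordings.items():
    print(f"{label}: ~{avg_bitrate_mbps(size_gb, hours):.1f} Mb/s")
```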
TheJian - Tuesday, March 11, 2008 - link
I guess you guys don't have a big TV. With a 65in set, DivX has been out of the question for me. It just turns to crap. I'd do anything regarding editing on my main PC, with the HTPC merely being a cheap player for Blu-ray etc. A network makes it easy to send them to the HTPC. Just set the affinity of the encode to one of your cores and I can still play a game on the other. Taking 3.5GB to 700MB looks like crap on a big TV. I've noticed it's watchable on my 46in, but awful on the 65. They look great on my PC, but I've never understood anyone watching anything on their PC. Perhaps a college kid with no room for a TV. Other than that...

JarredWalton - Tuesday, March 11, 2008 - link
SD resolutions at 46" (what I have) or 65" are always going to look lousy. Keeping it in the original format doesn't fix that; it merely makes it use more space.

My point is that a DivX, x264, or similar encoding of a Blu-ray, HDTV, or similar HD show loses very little in overall quality. I'm not saying take the recording and make it into a 640x360 SD resolution. I'm talking about converting a full-bitrate 1080p source into a 1920x1080 DivX HD, x264, etc. file. Sure, there's some loss in quality, but it's still a world better than DVD quality.
It's like comparing a JPEG at 4-6 quality to the same image at 12 quality. If you do a diff, you will find lots of little changes on the lower quality image. If you want to print up a photo, the higher quality is desirable. If you're watching these images go by at 30FPS, though, you won't see much of a loss in overall quality. You'll just use about 1/3 the space and bandwidth.
Obviously, MPEG4 algorithms are *much* more complex than what I just described - which is closer to MPEG2. It's an analogy of how a high quality HD encode compares to original source material. Then again, in the case of HDTV, the original source material is MPEG2 encoded and will often have many artifacts already.
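[Editor's note: to make the JPEG analogy above concrete, here is a small sketch that saves the same frame at a low and a high quality setting and compares file sizes. It assumes Pillow, whose quality scale runs roughly 1-95 rather than Photoshop's 0-12, and "frame.png" is a placeholder for any captured screenshot.]

```python
# Small sketch of the JPEG analogy: save one frame at a low and a high
# quality setting and compare the resulting file sizes.
# Assumes Pillow; "frame.png" is a placeholder input image.
import os
from PIL import Image

frame = Image.open("frame.png").convert("RGB")
for quality in (40, 95):
    out_path = f"frame_q{quality}.jpg"
    frame.save(out_path, "JPEG", quality=quality)
    size_kib = os.path.getsize(out_path) / 1024
    print(f"quality={quality}: {size_kib:.0f} KiB")
```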
yehuda - Monday, March 10, 2008 - link
Great article. Thanks to Gary and everyone involved! The last paragraph is hilarious.

One thing that bothers me about this launch is the fact that board vendors do not support the dual independent display feature to its full extent.
If I understand the article correctly, the onboard GPU lets you run two displays off any combination of ports of your choice (VGA, DVI, HDMI or DisplayPort).
However, board vendors do not let you do that with two digital ports. They let you use VGA+DVI or VGA+HDMI, but not DVI+HDMI. At least, this is what I have gathered reading the Gigabyte GA-MA78GM-S2H and Asus M3A78-EMH-HDMI manuals. Please correct me if I'm wrong.
How come tier-1 vendors overlook such a worthy feature? How come AMD lets them get away with it?
Ajax9000 - Tuesday, March 11, 2008 - link
They are appearing. At CeBIT, Intel showed off two mini-ITX boards with dual digital outputs:

DQ45EK: DVI + DVI
DG45FC: DVI + HDMI

http://www.mini-itx.com/2008/03/06/intels-eaglelak...