HD Video Decode Quality and Performance Summer '07
by Derek Wilson on July 23, 2007 5:30 AM EST - Posted in GPUs
Transporter 2 Trailer (High Bitrate H.264) Performance
This is our heaviest-hitting benchmark of the bunch. Nestled into the recesses of the Blu-ray version of The League of Extraordinary Gentlemen (a horrible movie if ever there was one) is a very aggressively encoded trailer for Transporter 2. This roughly two-minute trailer is encoded at an average bitrate of 40 Mbps, and by our observation the bitrate peaks at nearly 54 Mbps. This pushes up against the limit of H.264 bitrates allowed on Blu-ray movies and serves as an excellent test of a decoder's ability to handle the full range of H.264 encoded content we could see on Blu-ray discs.
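To put those figures in perspective, here is a quick back-of-the-envelope calculation (a Python sketch using the numbers above; the two-minute duration is an approximation) showing how much data the decoder has to chew through:

```python
# Back-of-the-envelope numbers for the Transporter 2 trailer described above.
# The ~2 minute duration is approximate; the bitrates come from the text.
avg_bitrate_mbps = 40.0    # average bitrate, megabits per second
peak_bitrate_mbps = 54.0   # observed peak bitrate
duration_s = 120           # roughly two minutes

total_megabytes = avg_bitrate_mbps * duration_s / 8   # megabits -> megabytes
peak_mb_per_sec = peak_bitrate_mbps / 8               # worst-case parse rate

print(f"~{total_megabytes:.0f} MB over the clip, "
      f"peaking at ~{peak_mb_per_sec:.1f} MB/s of H.264 bitstream")
# -> ~600 MB over the clip, peaking at ~6.8 MB/s of H.264 bitstream
```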
First up is our high performance CPU test (X6800):
Neither the HD 2900 XT nor the 8800 GTX features bitstream decoding on any level. They are fairly representative of older generation cards from AMD and NVIDIA (respectively), as we've seen in past articles. Clearly, a lack of bitstream decoding is not a problem for such a high-end processor, and because end users generally pair high-end processors with high-end graphics cards, we shouldn't see any problems here.
Lower CPU usage is always better. By using an AMD card with UVD or an NVIDIA card featuring VP2 hardware (such as the 8600 GTS), we see a significant drop in CPU overhead. While AMD does a better job of offloading the CPU (indicating less driver overhead on AMD's part), both of these solutions enable users to easily run CPU-intensive background tasks while watching HD movies.
Next up is our look at an affordable current generation CPU (E4300):
While CPU usage goes up across the board, we still have plenty of power to handle HD decode even without H.264 bitstream decoding on our high end GPUs. The story is a little different when we look at older hardware, specifically our Pentium 4 560 (with Hyper-Threading) processor:
Remember that these are average CPU utilization figures. Neither the AMD nor the NVIDIA high-end part is able to handle decoding in conjunction with the old P4. Our NetBurst-based hardware just does not have what it takes, even with heavy assistance from the graphics subsystem; without one of the GPUs that support bitstream decoding, we often hit 100% CPU utilization.
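As a side note on methodology, average figures like these can hide brief spikes to 100%, which is exactly when dropped frames show up. Below is a minimal sketch of how such numbers can be gathered (Python with the psutil library; this is an illustrative approach, not necessarily the harness used for the charts here):

```python
# Minimal sketch: sample system-wide CPU utilization once per second during
# playback and report the average and the peak. Illustrative only.
import psutil

def sample_cpu(duration_s: int = 120, interval_s: float = 1.0):
    samples = []
    elapsed = 0.0
    while elapsed < duration_s:
        # cpu_percent() blocks for interval_s and returns utilization over it
        samples.append(psutil.cpu_percent(interval=interval_s))
        elapsed += interval_s
    return sum(samples) / len(samples), max(samples)

if __name__ == "__main__":
    avg, peak = sample_cpu()
    print(f"average CPU: {avg:.1f}%  peak CPU: {peak:.1f}%")
```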
Of course, bitstream decoding delivers in a huge way here, not only making HD H.264 movies watchable on older CPUs, but even giving us quite a bit of headroom to play with. We wouldn't expect people to pair high-end graphics hardware with these low-end CPUs, so the lack of bitstream decoding on those cards isn't much of a problem.
Clearly, offloading CABAC and CAVLC bitstream processing for H.264 was the right move, as the hardware has a significant impact on the capabilities of the system as a whole. NVIDIA is counting on bitstream processing for VC-1 not really making a difference, and we'll take a look at that in a few pages. First up is another H.264 test case.
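Before moving on, here is a small illustrative sketch (Python; not any vendor's real driver API) of the capability split this section keeps coming back to: UVD- and VP2-equipped cards take over CABAC/CAVLC entropy decode, while the HD 2900 XT and 8800 GTX leave that stage on the CPU.

```python
# Illustrative only: where H.264 entropy (CABAC/CAVLC) decode ends up for the
# cards discussed in this article. Not a real driver or DXVA interface.
BITSTREAM_OFFLOAD = {
    "Radeon HD 2900 XT": False,   # no UVD on the high-end R600 part
    "GeForce 8800 GTX":  False,   # no VP2 on G80
    "Radeon HD 2600 XT": True,    # UVD
    "GeForce 8600 GTS":  True,    # VP2
}

def entropy_decode_location(gpu_name: str) -> str:
    """Return 'GPU' if the card offloads bitstream decode, else 'CPU'."""
    return "GPU" if BITSTREAM_OFFLOAD.get(gpu_name, False) else "CPU"

for card in BITSTREAM_OFFLOAD:
    print(f"{card}: CABAC/CAVLC handled by the {entropy_decode_location(card)}")
```

With the entropy stage on the GPU, even the Pentium 4 560 has headroom left; without it, that same CPU pegs at 100%.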
63 Comments
bpt8056 - Monday, July 23, 2007 - link
Does it have HDMI 1.3?
phusg - Monday, July 23, 2007 - link
Indeed, which makes it strange that he gave the NVIDIA cards 100% scores! Sure, manual control over the noise filter is nice, but 100% is 100%, Derek. The filter working badly when set above 75% makes for a less-than-perfect HQV score, IMHO. Personally, I would have knocked off 5 points from the NVIDIA cards' noise scores for this.
Scrogneugneu - Monday, July 23, 2007 - link
I would have cut points back too, but not because the image quality goes down at 100%. There's no sense in providing a slider if every position on the slider gives the same perfect image, is there?

Giving a slider, however, isn't very user-friendly from an average Joe's perspective. I want to drop my movie in the player and watch it, and I want it to look great. I do not want to move a slider around for every movie to get good picture quality. Makes me think of the tracking adjustment on old VHS. Quite annoying.
From a technological POV, yes, NVIDIA's implementation enables players to be great. From a consumer's POV, it doesn't. I want to watch a movie, not fine-tune my player.
Chunga29 - Monday, July 23, 2007 - link
It's all about the drivers, people! TechReport did their review with older drivers (at least on the NVIDIA side). So in the past two weeks, NVIDIA apparently addressed some problems and AT took a look at the current results. Probably delayed the article a couple times to rerun tests as well, I bet!

As for the above comment about the slider, what you're failing to realize is that noise reduction impacts the final output. I believe Sin City used a lot of noise intentionally, so if you watch that on ATI hardware the result will NOT be what the director wanted. A slider is a bit of a pain, but then being a videophile is also a pain at times. With an imperfect format and imperfect content, we will always have to deal with imperfect solutions. I'd take NVIDIA here as well, unless/until ATI offers the ability to shut off NR.
phusg - Monday, July 23, 2007 - link
Hi Derek,

Nice article, although I've just noticed a major omission: you didn't bench any AGP cards! There are AGP versions of the 2600 and 2400 cards, and I think these are very attractive upgrades for AGP HTPC owners who are probably lacking the CPU power for full HD. The big question is whether the unidirectional AGP bus is up to the HD decode task. The previous generation ATi X1900 AGP cards reportedly had problems with HD playback.
Hopefully you'll be able to look into this, as AFAIK no-one else has yet.
Regards, Pete
ericeash - Monday, July 23, 2007 - link
I would really like to see these tests done on an AMD X2 proc. The Core 2 Duos don't need as much offloading as we do.
Orville - Monday, July 23, 2007 - link
Derek,

Thanks so much for the insightful article. I’ve been waiting on it for about a month now, I guess. You or some reader could help me out with a couple of embellishments, if you would.
1. How much power do the ATI Radeon HD 2600 XT, Radeon HD 2600 Pro, NVIDIA GeForce 8600 GTS, and GeForce 8600 GT graphics cards burn?
2. Do all four of the above-mentioned graphics cards provide HDCP for their DVI output? Do they provide simultaneous HDCP for dual DVI outputs?
3. Do you recommend CyberLink’s PowerDVD video playback software exclusively?
Regards,
Orville
DerekWilson - Monday, July 23, 2007 - link
we'll add power numbers tonight ... sorry for the omission

all had hdcp support, not all had hdcp over dual-link dvi support
powerdvd and windvd are good solutions, but powerdvd is currently further along. we don't recommend it exclusively, but it is a good solution.
phusg - Wednesday, July 25, 2007 - link
I still can't see them, have they been added? Thanks.
GlassHouse69 - Monday, July 23, 2007 - link
I agree here, good points.

15% CPU utilization looks great until... you find that an E4300 takes so little power that using 50% of it to decode costs only about 25 watts. It is nice seeing things offloaded from the CPU... IF the video card isn't cranking up a lot of heat and power.