HD Video Decode Quality and Performance Summer '07
by Derek Wilson on July 23, 2007 5:30 AM EST
Posted in: GPUs
Serenity (VC-1) Performance
We haven't yet found a VC-1 title that matches either of the H.264 titles we tested in complexity or bitrate, so we decided to stick with our tried and true Serenity test. The main event here is determining the real advantage of including VC-1 bitstream decoding on the GPU. NVIDIA's claim is that VC-1 bitstream decoding is not as demanding as H.264 decoding, so GPU offload isn't necessary, while AMD is pushing its solution as more complete. Does it really matter? Let's take a look.
Our HD 2900 XT has the highest CPU utilization, while the 8600 GTS and 8800 GTS share roughly the same performance. The HD 2600 XT leads the pack with an incredibly low CPU overhead of just 5 percent. This is probably approaching the minimum overhead of AACS handling and disk access through PowerDVD, which is very impressive. At the same time, the savings from GPU bitstream decoding are not as impressive under VC-1 as under H.264 at the high end.
Dropping down in processor power doesn't heavily impact CPU overhead in the case of VC-1.
Moving all the way down to a Pentium 4 based processor, we do see higher CPU utilization across the board. The difference isn't as great as under H.264, and VC-1 movies appear to remain very playable on this hardware even without bitstream decoding on the GPU, which is not the case for our H.264 movies. While we wouldn't recommend it with the HD 2900 XT, we could even consider pairing a (fairly fast) single core CPU with the other hardware, with or without full decode acceleration.
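For reference, the CPU utilization figures in these charts are simply averages of utilization samples taken while each clip plays back. The sketch below shows one way to log that kind of number; it is not the tooling behind our results, just an illustration that assumes the third-party psutil package and arbitrary choices for sampling interval and duration.

import psutil

def average_cpu_during_playback(duration_s: float = 60.0, interval_s: float = 1.0) -> float:
    """Sample system-wide CPU utilization while a playback app runs and return the average.

    Illustrative only: start your player (e.g. PowerDVD) first, then run this script.
    The duration and interval are arbitrary values chosen for the example.
    """
    samples = []
    elapsed = 0.0
    while elapsed < duration_s:
        # cpu_percent(interval=...) blocks for the interval and returns the
        # utilization percentage measured over that window.
        samples.append(psutil.cpu_percent(interval=interval_s))
        elapsed += interval_s
    return sum(samples) / len(samples)

if __name__ == "__main__":
    avg = average_cpu_during_playback()
    print(f"Average CPU utilization over the sampled window: {avg:.1f}%")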
63 Comments
smitty3268 - Monday, July 23, 2007 - link
No. The 2400 and 2600 have support for the Avivo HD feature set even with VC-1 decoding, while the G84 and G86 don't, so their quote is correct, if a little confusing, since Avivo is ATI terminology. Nevertheless, it is basically equivalent to the NVIDIA hardware.
scosta - Monday, July 23, 2007 - link
I think this sentence on page 1 is wrong:
"While the R600 based Radeon HD 2900 XT only supports the features listed as "Avivo", G84 and G86 based hardware comprise the Avivo HD feature set (100% GPU offload) for all but VC-1 decoding ..."
Don't you mean:
"... the features listed as "Avivo", HD 2400 and HD 2600 based hardware comprise the Avivo HD feature set (100% GPU offload) for all but VC-1 decoding ..."
Regards
iwodo - Monday, July 23, 2007 - link
Maybe I am the only one who doesn't understand why they would not recommend a GeForce 8500 for a low-end machine?
Chunga29 - Monday, July 23, 2007 - link
The NVIDIA 8500 drivers are not currently working with PureVideo HD, which I believe was mentioned.
ssiu - Monday, July 23, 2007 - link
NVIDIA PureVideo HD still doesn't support Windows XP, correct? That would be the deciding factor for many people (instead of a noise reduction score of 15% versus 25%, etc.).
legoman666 - Monday, July 23, 2007 - link
This man hit the nail on the head. A couple of months ago I was on the verge of buying a new video card with H.264 acceleration for my HTPC, but upon learning that those features were only enabled for Vista (bleh) I decided not to upgrade at all.
DigitalFreak - Monday, July 23, 2007 - link
Any ideas as to why the HQV scores are almost totally opposite of what The Techreport (http://techreport.com/reviews/2007q3/radeon-hd-240...) came up with? I'd trust AT's review more, but it seems strange that the scores are so different.
phusg - Monday, July 23, 2007 - link
Yes, very interesting! FTA:
DigitalFreak - Monday, July 23, 2007 - link
I'm wondering if they ran with the noise filter at over 75% in their test. As Derek mentioned, higher than 75% produced banding. I also noticed that Derek used 163.x drivers, while TR used 162.x.
Honestly, I wish there was an 8600 GT/GTS with HDMI out. I would really love to avoid running two cables to my receiver.
Gary Key - Monday, July 23, 2007 - link
There will be in about 60 days, hardware is sampling now. ;)