HD Video Playback: H.264 Blu-ray on the PC
by Derek Wilson on December 11, 2006 9:50 AM EST - Posted in GPUs
The Test
As we previously indicated, we need to use at least a Core 2 Duo E6400 in order to avoid dropping frames while testing graphics card decode acceleration under X-Men: The Last Stand. As we also wanted an accurate picture of how much GPU decode acceleration really helps, we needed to use a CPU powerful enough to avoid dropping frames even under the most stressful load without GPU assistance. Thus we chose the Core 2 Duo X6800 for our tests. Using this processor, we can more accurately see how each graphics card compares to the others and how much each graphics card is able to assist the CPU.
We tested CPU utilization by using perfmon to record data while we viewed a section of X-Men: The Last Stand. The bookmark feature really helped out, allowing us to easily jump to the specific scene we wanted to test in Chapter 18. In this scene, the Golden Gate is being torn apart and people are running everywhere. This is one of the most stressful scenes in the movie, reaching a bitrate of over 41 Mbps at one point.
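To put that 41 Mbps peak in perspective, simple arithmetic shows how much data the decoder has to chew through for every frame. The 24 fps figure below is our assumption for film content, not something stated above:

```python
def bits_per_frame(bitrate_mbps, fps=24.0):
    """Average bits the decoder must process per frame at a given bitrate."""
    return bitrate_mbps * 1_000_000 / fps

# At the 41 Mbps peak, each frame carries over 1.7 million bits of
# entropy-coded data that must be decoded within its display window.
print(f"{bits_per_frame(41):,.0f} bits/frame at 41 Mbps")
```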
Unfortunately, we haven't found a feature in PowerDVD or another utility that will allow us to count dropped frames. This means we can't really compare what happens to the video quality when the CPU is running at 100%. In lieu of dropped frames, we will need to stick with CPU overhead as our performance metric.
For reference we recorded average and maximum CPU overhead while playing back our benchmark clip with no GPU acceleration enabled.
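The reduction step can be sketched in a few lines. This is a hypothetical stand-in for our perfmon setup, not the actual counter log we recorded: it condenses a series of "% Processor Time" samples into the average and maximum figures we report.

```python
import random  # used only to fake a counter log for this sketch

def summarize(samples):
    """Reduce CPU-utilization samples (percent) to (average, maximum)."""
    return sum(samples) / len(samples), max(samples)

# Stand-in for perfmon's "% Processor Time" log recorded during playback;
# a real run would parse the counter values perfmon wrote to its log file.
samples = [random.uniform(20, 90) for _ in range(100)]
avg, peak = summarize(samples)
print(f"average: {avg:.1f}%  maximum: {peak:.1f}%")
```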
Here is the rest of our test system:
Performance Test Configuration
CPU: Intel Core 2 Duo X6800
Motherboard: ASUS P5B Deluxe
Chipset: Intel P965
Chipset Drivers: Intel 7.2.2.1007
Hard Disk: Seagate 7200.7 160GB SATA
Memory: Corsair XMS2 DDR2-800 4-4-4-12 (1GB x 2)
Video Cards: Various
Video Drivers: ATI Catalyst 6.11; NVIDIA ForceWare 93.71; NVIDIA ForceWare 97.02
Desktop Resolution: 1920x1080 - 32-bit @ 60Hz
OS: Windows XP Professional SP2
86 Comments
Stereodude - Wednesday, December 13, 2006 - link
Also, there's this post on AVSForum: http://www.avsforum.com/avs-vb/showthread.php?p=91... The poster had no problems playing back X-Men 3 with a "P4 3.2Ghz HT system and a Radeon X1950Pro". Clearly a 3.2GHz HT P4 isn't nearly as powerful as any of those C2D processors, nor is the X1950Pro as powerful as the various nVidia cards.
Stereodude - Wednesday, December 13, 2006 - link
Perhaps, but nVidia intentionally sent them an H.264 torture-test disc that's not available in the US. That also doesn't explain why the 7600GT nearly cut the CPU usage in half in one review but only helped 20% in the other. Also, nVidia says an E6300 or X2 4200+ with a 7600GT is adequate for the most demanding H.264 titles. That sure doesn't agree with the conclusion of this AnandTech piece, which says you need an 8800GTX card to use an E6300.
balazs203 - Wednesday, December 13, 2006 - link
In the PC Perspective article they say: "In our testing the H.264 bit rates were higher than the VC-1 rates, in the high 18-19 Mbps up to 22 Mbps in some cases."
That is about half the maximum bitrate of the disc AnandTech tested.
Stereodude - Wednesday, December 13, 2006 - link
Since when does bitrate = difficulty to decode?
DerekWilson - Thursday, December 14, 2006 - link
Bitrate does equal difficulty to decode, because a higher bitrate means more data to process per frame.
frogge - Tuesday, December 12, 2006 - link
64-bit OS vs 32-bit...
puffpio - Tuesday, December 12, 2006 - link
Will you start using more updated/modern CPU tests for H.264 encoding? Currently you use QuickTime, right? That doesn't use many of H.264's advanced features. Have you considered using x264 (an open-source H.264 encoder that produces the best-quality encodes among publicly available H.264 encoders) with a standard set of encoding parameters?
Nothing taxes a CPU better than video encoding :)
rain128 - Tuesday, December 12, 2006 - link
I'm a little bit skeptical about those test results, because my home computer (in the subject line) played the Deja Vu clip (trailer 1, 1080p, downloaded from the Apple website) with CPU usage of 40-60% on the current version of the NVIDIA drivers. With older drivers (I don't know the exact version; I installed them over a year ago) average CPU usage was between 50-70%. For a decoder I used PowerDVD 7: I installed the trial, and even though Cyberlink's webpage says the H.264 codec doesn't work in the trial version, I had no problems with it. GSpot reported Cyberlink's H.264 codec as the default rendering path. For fullscreen capability I used BSPlayer; strangely, Windows Media Player didn't want to play that trailer even though all the other players had no problem finding the installed codecs.
TIP: with BSPlayer you can see the dropped-frame count.
Renoir - Tuesday, December 12, 2006 - link
The H.264 clips on the Apple website tend to have lower bitrates than those found on Blu-ray discs, so that explains your CPU usage.
DerekWilson - Tuesday, December 12, 2006 - link
This is what we have found as well, and it is also why looking at BD and HD DVD performance is more important than the downloaded clips we've examined in the past.