NVIDIA's PureVideo HD: HD-DVD Playback on the PC
by Derek Wilson on July 22, 2006 10:00 AM EST - Posted in GPUs
Final Words
So the burning question on everyone's mind is: what does all this mean for the user who wants to play HD content on their PC? A graphics card that can accelerate the playback of 1080p HDCP content will be something hardcore enthusiasts will want. While our tests show that the GPU only "relieves" about 20% of the total CPU load on a Pentium D 830 (two NetBurst cores running at 3.0GHz each), this early content isn't encoded at anywhere near the highest bit-rates supported on HD media, and NVIDIA expects to move more of the decode pipeline onto the GPU with future driver updates.
If the system will be used to play back imported Japanese films, the need for GPU decode acceleration is increased. As our tests show, the system pegged the CPU without GPU assistance and frames were dropping left and right. The fact that Japan is using H.264 for all their content does give decoders a harder time. Granted, we didn't use the fastest CPU around, but the number of dropped frames did render the movie unviewable.
Let us reiterate that while the videos we recorded do demonstrate the difference in the viewing experience with and without a GPU on the D 830, compressing footage captured at 30 fps with a DV cam of content being played back at 60 fps on a TV is inevitably going to smooth over some of the motion flaws in the original. The differences are much more dramatic in person.
Of course, there is some question of how other CPUs will handle the content, and we haven't had a chance to thoroughly investigate the matter yet. A Pentium D 830 is no slouch of a CPU, but neither is it extremely fast. Depending on the decoding algorithm (i.e. CPU optimizations) being used, many dual core processors out there may outperform the Pentium D 830, but we will have to investigate this further when we have the hardware. It almost goes without saying that we fully expect even the lowliest of Core 2 Duo processors to be able to handle 1080p content (with any encoding), though they will still likely run very close to 100% CPU usage. For those of you still running single core CPUs, things aren't looking too good right now as far as high definition support goes. It appears that NVIDIA, ATI, or someone else is going to need to do far more than offload 20% of the CPU requirements before any single core CPU is going to be able to manage 1080p decoding without dropping frames.
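For readers who want a rough sense of how much headroom their own processor has, a simple utilization logger run alongside the player will show how close the CPU gets to 100% during playback. The sketch below is purely illustrative and is not the tool we used for testing; it relies on the third-party psutil package, and the sampling window and interval are arbitrary choices.

    import time
    import psutil  # third-party package: pip install psutil

    def sample_cpu(duration_s=60.0, interval_s=0.5):
        """Log average and peak total CPU usage over the sampling window."""
        samples = []
        end = time.time() + duration_s
        while time.time() < end:
            # cpu_percent() with an interval blocks for that long and returns
            # utilization averaged across all cores (0-100%).
            samples.append(psutil.cpu_percent(interval=interval_s))
        print("avg: %.1f%%  peak: %.1f%%" % (sum(samples) / len(samples), max(samples)))

    if __name__ == "__main__":
        sample_cpu()

Sustained readings near 100% are where dropped frames start to show up, which matches what we saw on the Pentium D with H.264 content.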
Anything less heavy duty than H.264 (read: all current American content) is watchable without GPU accelerated decoding enabled on the system we tested. VC-1 seemed to run near the limits of the system, but didn't run into the same trouble we saw while watching the Japanese version of The Chronicles of Riddick. For the general American HD content consumer with a PC, a decent (dual core) midrange system will be able to play back video just fine.
Right now, doing anything else while watching HD content isn't a good idea. If NVIDIA moves more of the decode onto the GPU, that could free up resources for background tasks. The lack of power savings and the low bit-rate of current content diminish the need for GPU decode on most current CPUs right now, unless importing Japanese titles is important to you (larger regions make this easier).
It may still be possible to build a quieter system using PureVideo HD because, while power isn't saved over the whole system, all of the power isn't dissipated in the same spot. This could lead to relaxed cooling requirements. In fact, there are a good number of silent 7300GS cards that run at over 500 MHz. While they don't have enough pixel power to run the latest games at any decent quality or resolution, the clock speed makes them an excellent option for PureVideo HD (provided one of the vendors making HDCP cards opts to build one around the 7300GS). As for cards that are coming out soon, MSI and ASUS both have 7600 based products with HDCP planned for the near future. MSI even has an HDMI product coming down the pipe Real Soon Now.
PureVideo HD is a very good thing. We would love to see NVIDIA pull more of the decode pipeline onto the GPU, and CyberLink could still benefit from some time improving PowerDVD. Naturally, as this is all still beta, we can cut them a little bit of slack. However, once players are available in good quantities for decent prices with competition from ATI's AVIVO thrown in for good measure, we expect to see improvement.
We are very interested in seeing how ATI's AVIVO compares to PureVideo HD. As soon as we are able, we will have a comparison of the two, and we will also test with additional CPUs. Until then, HDCP support is a good thing, PureVideo HD is nice, and the near term HDMI cards will also be useful for the home theater crowd. However, for most of us, at this point these things are merely interesting features. It's a little bit early to make a recommendation on buying HDCP enabled hardware for the multimedia enthusiast, especially given the current cost of optical drives. If this is something you need, the best bet will be to wait until everything is available in retail and we've seen the cards ATI is holding.
45 Comments
yzkbug - Saturday, July 22, 2006 - link
I have a $5K question (the cost of a new TV) unanswered by this article. Do you absolutely need an HDCP-enabled TV to watch HD movies on PC? The slide on page 2 shows that a monitor can be connected either via Analog (VGA or Component) or via Digital (DVI or HDMI with HDCP). So, does it mean that it is possible to watch HD over Analog without any PQ degradation? Also, does it mean that DVI without HDCP is a no-go?
Bowsky - Saturday, July 22, 2006 - link
HDCP is only required to view HD content if an Image Constraint Token (ICT, I think that's its name) is present on the disc. If it's not there, the media can be played at full resolution over any connection such as non-HDCP DVI, VGA, component, etc. To answer your $5,000 question, the movie companies have decided to wait until 2010 before using the ICT on any media. After that, all media will be down-scaled if played over non-HDCP connections. So my answer to you is buy the HDCP television set. It won't be required immediately, but will unfortunately be required in the near future.
Also, most new HDTVs on sale these days have HDCP, so there shouldn't be too much to worry about when buying.
DerekWilson - Sunday, July 23, 2006 - link
Our understanding of the situation is that any DIGITAL playback requires HDCP or no image will be displayed (under current PC video player technology -- downscaling may be possible in the future). Currently, all titles will play at full resolution over analog (component, VGA), but in the future this will not be allowed either.
Non-HDCP DVI and non-HDCP HDMI will never play back full resolution HD content distributed on HD media with HDCP protection enabled. This is essentially all titles.
If you want digital playback of HD DVDs or BDs, you can't do it without an HDCP television. If you don't mind analog playback, you're fine for the next few years.
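To put that in more concrete terms, here is a minimal sketch of the rules as I understand them today; the function and parameter names are made up purely for illustration and don't come from any real player API.

    def allowed_output(connection, hdcp_capable, ict_flag):
        """Rough playback result for HD media on a given display connection.

        connection:   "dvi", "hdmi", "vga", or "component"
        hdcp_capable: whether the display and connection support HDCP
        ict_flag:     whether the disc sets the Image Constraint Token
        """
        if connection in ("dvi", "hdmi"):
            # Current PC players: digital output without HDCP shows no image at all.
            return "full resolution" if hdcp_capable else "no image"
        # Analog (VGA/component): full resolution today; once studios start
        # setting the ICT, analog output gets constrained/downscaled.
        return "downscaled" if ict_flag else "full resolution"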
Renoir - Sunday, July 23, 2006 - link
Just so I understand you, Derek: are you saying that full resolution playback over a digital connection will not be allowed regardless of whether the image constraint token is used or not, and that full res will only be allowed over analogue until the ICT is used? If so, that sucks big time. I would then have to hook up my monitor via both vga and dvi depending on whether I'm watching a hd disc or not (assuming, of course, I still have my current monitor by the time I watch hd discs).
DerekWilson - Sunday, July 23, 2006 - link
this is the way I understand it.
Renoir - Monday, July 24, 2006 - link
Bummer! Was hoping the lack of the ICT would allow me to use dvi at full res. I imagine it's because they're more worried about people getting a perfect digital copy rather than capturing the analogue at full res and then converting it to digital. However, I'm not aware of any dvi capturing devices, although there are plenty of component ones. Does anyone know of any hi res digital capturing devices as I'm curious now :-)
DerekWilson - Monday, July 24, 2006 - link
we wanted to build one to analyse video output of graphics cards without relying on screen capture utilities ... it shouldn't really be that difficult.
Renoir - Monday, July 24, 2006 - link
cool! Perhaps in terms of piracy they feel that it's less necessary to protect the full res analogue video than a bit-for-bit accurate dvi feed. If so, then they must be thinking that people would find pirated videos that were redigitised from component etc (albeit at full res) less compelling than one straight from dvi. What other reason/s do they have for allowing full res over analogue but not over digital?
Renoir - Monday, July 24, 2006 - link
Just realised I haven't taken any analogue copy protection such as macrovision into account. Any info on this aspect? If it's present then that would pretty much answer my last question.
vhx500 - Saturday, July 22, 2006 - link
On page 2, you mention playing Riddick and Swordfish, but you are displaying a screenshot from The Bourne Supremacy?