HD Video Decode Quality and Performance Summer '07
by Derek Wilson on July 23, 2007 5:30 AM EST - Posted in GPUs
Final Words
While noise reduction can be a good thing, noise should already be kept to a minimum when viewing well mastered, high quality compressed HD video. We've seen our fair share of early HD releases where noise is simply atrocious, however, and we expect it will take some studios time to adjust to the fact that higher resolution movies not only look better, but also reveal flaws more readily. For now (especially for movies like X-Men 3), noise reduction is highly appreciated, but down the line we hope studios will put more effort into delivering a polished product.
There are cases where blending effects require a bit of added noise to give scenes a more natural feel, and a director can even crank noise way up for an artistic or dated effect. In these cases (which we hope will be most cases where noise is evident in the future), we want to view HD material as it was delivered. When a studio ships poor post processing, it is nice to be able to decide for ourselves how we want to view the content. This makes it clear to us that the ability to enable or disable noise reduction is an essential feature for video processors. Fully adjustable noise reduction may be less critical, but it is absolutely appreciated and gives those who know what they are doing the highest potential image quality in every case.
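To illustrate why a noise reducer that cannot be switched off is a problem for intentional grain, here is a minimal sketch (in Python with NumPy, not any vendor's actual pipeline) of temporal noise reduction via exponential frame averaging: noise that is uncorrelated between frames averages out, but deliberately added film grain, which is also uncorrelated frame to frame, would be smoothed away just the same.

```python
import numpy as np

def temporal_denoise(frames, strength=0.7):
    """Blend each frame toward a running average of its predecessors.

    A crude stand-in for the temporal noise reduction a video
    processor applies. It cannot distinguish compression noise
    from intentional film grain: both vary frame to frame, so
    both get averaged away.
    """
    out = []
    avg = frames[0].astype(np.float64)
    for frame in frames:
        avg = (1 - strength) * frame + strength * avg
        out.append(avg.copy())
    return out

# A flat grey frame with additive per-frame "grain": after
# blending, the deviation from the clean frame shrinks markedly.
rng = np.random.default_rng(0)
clean = np.full((8, 8), 128.0)
noisy = [clean + rng.normal(0, 10, clean.shape) for _ in range(30)]
denoised = temporal_denoise(noisy)

print(np.std(noisy[-1] - clean))     # deviation before filtering
print(np.std(denoised[-1] - clean))  # noticeably smaller after
```

The `strength` parameter plays the role of a user-adjustable noise reduction slider: at 0 the source is passed through untouched, which is exactly the "off" setting we want every video processor to offer.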
Those who stick with very well produced 1080p content may not need post processing noise reduction or deinterlacing, but they might miss out on imported content or HD releases of some TV series (depending on what studios choose to do in that area). For now, we recommend that users building HTPC setups stick with the tools that get the job done best regardless of the source material. The only real options for HD video intensive systems today are the Radeon HD 2600 and GeForce 8600 series cards. For its better handling of noise reduction (and especially the fact that it can be turned off), we recommend the 8600 GT/GTS over the other options, even though the 2600 XT provided better CPU offloading.
We have to stress that, although NVIDIA and AMD expect the video decode hardware on their low end parts to provide significant value to end users, we absolutely cannot recommend current low end graphics cards for systems where video decode is important. Because they cannot provide a high quality HD experience in all cases, the HD 2400, GeForce 8500, and lesser hardware are in our eyes only suitable for business class or casual computing systems where neither games nor HD video play a part in the system's purpose.
AMD's UVD does beat NVIDIA's VP2 in both H.264 and VC-1 decode performance, but the lead isn't large enough to make a tangible difference when actually watching movies. Performance is important, and UVD's performance is certainly impressive, but we still have to favor the 8600 for its superior image quality.
VC-1 bitstream decoding doesn't have as large an impact as H.264 bitstream decoding. We would have to drop down to a significantly slower CPU for the difference to give AMD an advantage. In the scenarios we tested, NVIDIA didn't make a serious blunder by skipping hardware support for VC-1 bitstreams. At least, it wasn't as serious a blunder as AMD leaving UVD out of the HD 2900 XT.
In the future, we won't "need" H.264 or VC-1 decode on our GPUs either (just as we no longer "need" MPEG-2 acceleration given today's CPUs), but we don't see this as a valid excuse to withhold a full range of functionality from end users. Need is a relative term at best. We can do decent realtime 3D on CPUs these days, but we don't see graphics card companies saying "this card will be paired with a high end CPU, so we decided not to implement [insert key 3D feature] in hardware." We want to see AMD and NVIDIA include across-the-board support for video features in future product lineups. Saving CPU cycles isn't an exclusive desire of owners of low end hardware, and when we buy higher end hardware we expect higher performance.