Coming Soon to HD DVD: Silicon Optix HD HQV
by Derek Wilson on February 8, 2007 1:25 AM EST - Posted in
- GPUs
HD HQV Performance
For these tests, we will be using a beta version of PowerDVD HD 6.5 along with hardware acceleration to play back the HD HQV HD DVD. This works much the same as it did with the standard definition HQV test, which is a collection of clips played back from a DVD. The software DVD, HD DVD, or Blu-ray player, with hardware acceleration enabled, uses the graphics hardware to offload the decode process from the CPU. The byproduct is that, with hardware acceleration enabled, image quality is influenced more by the hardware than by the software.
For AMD we are using Catalyst 7.1, and for NVIDIA we look at 93.71 for the 7 series and 97.92 for the 8 series. Our results are the same on NVIDIA's 7 and 8 series cards, so we won't distinguish between the two when discussing NVIDIA results. At this point, this comparison is as much between AVIVO and PowerDVD HD as it is between the hardware. Both AMD and NVIDIA should have some headroom for improving performance through their drivers in the future.
The maximum score for HD HQV is 105 with 80 of those points having to do with proper deinterlacing of interlaced HD sources. All broadcast HD sources in the US today are interlaced, and there are many HD DVD movies provided in 1080i as well. Fewer Blu-ray titles are 1080i, but they aren't impossible to find. While our HD DVD version of HQV obviously tests HD DVD players, these features will be important for the playback of any interlaced HD source. Because of this, we really expected NVIDIA and AMD to perform well.
Unfortunately, reality did not live up to our expectations. We'll break it down by test. While we would love to provide screenshots, our version of PowerDVD doesn't support screenshots, and taking pictures of the TV just doesn't provide the detail we need. Descriptions of what's going on will have to do for now.
Noise Reduction
Both AMD and NVIDIA score a flat zero on this test. None of the AMD or NVIDIA cards we tested performed any noise reduction on either the flowers or the boat scene. There weren't any artifacts present, but it is very clear that neither camp performs any noise reduction on HD video at this point.
Video Resolution Loss
AMD averages fields and thus loses detail: the result is a gray color filling the corner blocks rather than alternating fine lines. NVIDIA doubles the scanlines in one field and discards the other half of the data, leaving the corner blocks as solid colors. Both solutions fail, just in different ways (the sketch below illustrates both failure modes). PowerDVD's software decoder performs similarly to AMD's hardware, which means that currently available computer hardware and software will not faithfully reproduce interlaced video.
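To make the two failure modes concrete, here is a minimal NumPy sketch (our own toy illustration, not code from either vendor or from PowerDVD) of the basic ways a frame can be rebuilt from two fields. On a test pattern of alternating one-pixel lines, like the corner blocks in this test, field averaging yields uniform gray and line doubling yields a solid color; only weaving the fields back together preserves the detail.

```python
import numpy as np

def split_fields(frame):
    """Split a progressive frame into its even- and odd-scanline fields."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Interleave the fields: full detail, correct for static images."""
    frame = np.empty((top.shape[0] * 2, top.shape[1]), top.dtype)
    frame[0::2], frame[1::2] = top, bottom
    return frame

def average_fields(top, bottom):
    """The AMD-style failure: averaging fields turns alternating
    black/white lines into uniform gray."""
    mixed = (top.astype(np.float32) + bottom.astype(np.float32)) / 2
    return np.repeat(mixed, 2, axis=0).astype(top.dtype)

def bob(field):
    """The NVIDIA-style failure: doubling one field's scanlines
    discards half the vertical detail."""
    return np.repeat(field, 2, axis=0)

# Alternating one-pixel lines -- the worst case for both methods:
pattern = np.zeros((8, 4), np.uint8)
pattern[0::2] = 255                            # white even lines, black odd lines
top, bottom = split_fields(pattern)
assert (weave(top, bottom) == pattern).all()   # weave recovers the pattern
print(average_fields(top, bottom)[0])          # [127 127 127 127]: gray, detail gone
print(bob(top)[1])                             # [255 255 255 255]: half the data lost
```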
Jaggies
Once again, AMD and NVIDIA both fail to eliminate diagonal aliasing. This is another example of the poor deinterlacing provided by computer hardware and current drivers. Eliminating jaggies is a major way to improve the visual experience of watching interlaced video on a progressive display like a 720p or 1080p HDTV or a computer monitor.
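A common remedy is edge-directed interpolation. The sketch below (our own simplified illustration of the general technique, not what either vendor's driver actually does) shows edge-based line averaging: each missing pixel is filled by averaging along whichever direction, vertical or diagonal, best matches the surrounding lines, so diagonal detail stays connected instead of stair-stepping.

```python
import numpy as np

def ela_row(above, below):
    """Edge-based line averaging (ELA), minimal form: reconstruct one
    missing scanline from the lines above and below it by interpolating
    along the best-matching direction."""
    out = np.empty_like(above)
    w = len(above)
    for x in range(w):
        best_diff, best_val = 1 << 30, int(above[x])
        for d in (-1, 0, 1):              # left diagonal, vertical, right diagonal
            xa, xb = x + d, x - d
            if 0 <= xa < w and 0 <= xb < w:
                diff = abs(int(above[xa]) - int(below[xb]))
                if diff < best_diff:      # smoothest direction wins
                    best_diff = diff
                    best_val = (int(above[xa]) + int(below[xb])) // 2
        out[x] = best_val
    return out

# A one-pixel diagonal line: plain vertical averaging breaks it into dots
# ([0 127 0 127 0]), while ELA keeps it connected ([0 0 255 0 0]).
above = np.array([0, 255, 0, 0, 0], np.uint8)
below = np.array([0, 0, 0, 255, 0], np.uint8)
print(ela_row(above, below))
```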
Film Resolution Loss
Like the video resolution loss test, both NVIDIA and AMD fail here, and for the same underlying reason: rather than performing inverse telecine, both treat 1080i created from a film source the same way they would treat video. For AMD this means averaging fields; for NVIDIA it means discarding half of them.
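What both are skipping is cadence detection, the first half of inverse telecine. As a rough sketch of the principle (a toy illustration of the general idea, not either vendor's algorithm): 3:2 pulldown repeats one field out of every five, so comparing same-parity fields two apart shows a near-zero difference once per five fields. A player that finds that rhythm can weave the original film frames back together at full resolution instead of deinterlacing.

```python
import numpy as np

def field_diff(a, b):
    """Mean absolute difference between two fields."""
    return float(np.mean(np.abs(a.astype(np.int16) - b.astype(np.int16))))

def looks_like_32_pulldown(fields, noise_floor=2.0):
    """Toy 3:2 cadence detector: telecined 24fps film repeats one field
    in every five, so differences between same-parity fields (i and i+2)
    dip to ~zero with period 5. Finding that period is the trigger for
    inverse telecine; a real detector also locks phase and re-checks
    continuously for cadence breaks (edits, mixed content)."""
    diffs = [field_diff(fields[i], fields[i + 2])
             for i in range(len(fields) - 2)]
    dips = [i for i, d in enumerate(diffs) if d < noise_floor]
    # Require at least two repeat-field dips, all spaced a multiple of 5:
    return len(dips) >= 2 and all((b - a) % 5 == 0
                                  for a, b in zip(dips, dips[1:]))
```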
Film Resolution Loss - Stadium Test
When playing on AMD hardware, flickering is apparent in the stadium. NVIDIA hardware doesn't flicker, but a moiré pattern is apparent in the stands. Both fail the test, and they demonstrate different issues that can appear when film is poorly deinterlaced.
The overall result?
AMD: 0
NVIDIA: 0
27 Comments
bigpow - Monday, February 12, 2007 - link
I'd like to see more results, maybe from the Xbox 360 HD DVD add-on & Toshiba HD DVD players, before I can be convinced that ATI & NVIDIA totally suck.

thestain - Sunday, February 11, 2007 - link
Suggest a redo.

ianken - Friday, February 9, 2007 - link
...I meant that in the context of post processing. FWIW.

ianken - Friday, February 9, 2007 - link
Since every HD DVD and BRD I've seen is authored at 1080p, I don't think 1080i film cadence support is that critical for either next-gen disc format. It is critical for HD broadcasts, where 1080i content is derived from telecined film or HD 24p content and not flagged, which is very, very common on cable and OTA feeds.
Noise reduction: just say no. It is NOT more important for HD. Noise reduction simply replaces random noise with deterministic noise and reduces true detail; I don't care how much magic is in there. With FUBAR analog cable it can make an unwatchable image moderately palatable, but keep it away from my HD DVD, BRD, or broadcast HD content.
On my 7800GTX I get film cadence detection and adaptive per-pixel vector deinterlacing on 1080i. The problem you're seeing may be the HD DVD/decoder app failing to properly talk to the GPU. On XP, apps need to support proprietary APIs to get anything beyond base VMR deinterlacing, particularly for HD. With CyberLink there is even a "PureVideo" option in the menus for this. If they do not support PureVideo HD then you will get none of those advanced features on NVIDIA hardware. Not sure what ATI does, but I do believe they only support film cadence and noise reduction on SD content.
peternelson - Friday, February 9, 2007 - link
"Noise can actually be more of a problem on HD video due to the clarity with which it is rendered. While much of the problem with noise could be fixed if movie studios included noise reduction as a post processing step, there isn't much content on which noise reduction is currently performed. This is likely a combination of the cost involved in noise reduction as well as the fact that it hasn't been as necessary in the past. In the meantime, we are left with a viewing experience that might not live up to the expectations of viewers, where a little noise reduction during decoding could have a huge impact on the image quality.There are down sides to noise reduction, as it can reduce detail. This is especially true if noise was specifically added to the video for effect. We don't run into this problem often, but it is worth noting. On the whole, noise reduction will improve the clarity of the content, especially with the current trend in Hollywood to ignore the noise issue. "
> Doing noise reduction at the player is less than ideal. You take noisy content and then waste much of your datarate describing noise. The NR should be done as a PRE-PROCESSING (as opposed to POST) step prior to feeding the encoder (not post processing as you suggest). Any movie studio making discs without NR is just lazy, and the customer deserves better. Obviously a generous bitrate and an efficient encoding standard like MPEG-4 are desirable, but you waste the benefit if you don't either noise-reduce the content or have substantively noise-free content like CGI animation sequences from Pixar.
Thus the workflow ought to be: telecine scan data or digital intermediate (eg 2K film res) into colour correction, into pan/scan cropping or aspect-ratio conversion scaling (eg CinemaScope into 16x9), then into noise reduction (spatial and temporal, etc.), and into the encoder.
Done professionally, different portions of the movie can be encoded with different processing parameters which kick in at the desired timecodes. These are often hand-optimised for sequences that can benefit from them. Such setups may be called ECLs (encoder control lists), rather like EDLs (edit decision lists).
Equipment to do excellent realtime noise reduction in high definition is readily available eg from Snell and Wilcox, and if you can't afford it you should either not be in the encoding business, or should be hiring it for the duration of the job from a broadcast hire supplier. Alternatively NR processing may be a feature of your telecine/datacine capture platform.
Ideally the encoded streams can be compared with the source material to identify any significant encoding artifacts like noticeable DCT macroblocking. This is basic QA and can be done in software and/or visually/manually.
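[Editor's note: a minimal form of that software QA is a per-frame distortion metric. As a toy sketch (our own illustration, not any particular vendor's toolchain), tracking PSNR between source and decoded output and flagging frames where it dips will surface the worst-encoded segments for visual inspection.]

```python
import numpy as np

def psnr(source, encoded):
    """Peak signal-to-noise ratio between a source frame and the decoded
    encoded frame; sustained dips flag segments worth re-encoding or
    inspecting visually for artifacts like DCT macroblocking."""
    mse = np.mean((source.astype(np.float64) - encoded.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")               # identical frames
    return 10 * np.log10(255.0 ** 2 / mse)
```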
If the NR is done by the studio prior to disk mastering, I see no reason to rely on the cheap and nasty NR in the player, and of course using a display capable of the proper bit depth and resolution will avoid quantisation banding and scaling degradation.
Poor attention to production values is diminishing the experience of what ought to be great content.
Contrary to your statement, noise reduction ought to have been used at standard definition too by anyone doing encoding professionally for DVDs etc. Even moderately expensive/affordable gear from FOR-A could do NR and colour correction using SDI digital ins and outs (that's if you can't afford the Snell gear LOL). The difference is certainly noticeable even before moving to HD content and bigger screens.
Not all noise reduction techniques reduce detail, particularly when done at the preprocessing stage. Taking noise out makes more bits available, so the denoised content can be described in MORE detail at an equivalent bitrate. Clever algorithms are able to take out hairs from frames of movie film and replace them with what ought to be there from adjacent frames (including using motion vector compensation). At this stage the maximum uncompressed source data is available on which to perform the processing, whereas NR in the player suffers from only having the bit-constrained compressed material to work from. Other pre-processing might include removing camera shake (eg Snell Shakeout) so that compression bits are not wasted on spurious motion vectors where these are undesired. Genuine pans, zooms, etc. can be distinguished and still get encoded.
You rightly point out that video using deliberately added noise to simulate film grain can be troublesome to encode, but there are several other techniques for making video appear film-like, eg Magic Bullet hardware or software as pioneered by The Orphanage, which can do things like alter the gamma curve and replicate various film lab processes like bleach bypass (as in the opening sequences of Saving Private Ryan).
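[Editor's note: for readers curious what the motion-protected temporal NR described above looks like in its simplest form, here is a toy sketch (our own illustration, nowhere near the motion-compensated processing in professional gear). Each pixel is blended with the previous frame only where the frame difference is small enough to be noise, so random grain is averaged away without smearing real motion.]

```python
import numpy as np

def temporal_nr(prev, curr, noise_sigma=2.0, motion_thresh=12.0):
    """Toy motion-protected temporal noise reduction (a sketch only;
    professional pre-processing adds motion-compensated prediction).
    Pixels whose frame-to-frame difference is within the noise floor are
    averaged with the previous frame, suppressing random grain; pixels
    that moved are passed through untouched."""
    p = prev.astype(np.float32)
    c = curr.astype(np.float32)
    diff = np.abs(c - p)
    # 0 = pure noise (blend fully), 1 = real motion (keep current pixel):
    alpha = np.clip((diff - noise_sigma) / (motion_thresh - noise_sigma), 0.0, 1.0)
    out = alpha * c + (1.0 - alpha) * (c + p) / 2.0
    return out.astype(curr.dtype)
```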
DerekWilson - Sunday, February 11, 2007 - link
Thanks for the very informative post. I think we've got a bit of a miscommunication though...
I'm not referring to post processing as post-encoding -- I'm referring to it as Hollywood refers to it -- post-filming, as in "fix it in post". You and I are referring to the same step in the overall scheme of things: after filming, before encoding.
It seems a bit odd that I hadn't heard anyone talk about processing from the perspective of the encoding step before, as a brief look around Google shows that it is a very common way of talking about handling content pre- and post-encoding.
In any event, it may be that studios who don't do noise reduction are just lazy. Of course, you'd be calling most of them lazy if you say that. We agree that the customer deserves better, and currently they aren't getting it. Again, go pick up X-Men 3. Not that I liked the movie, but I certainly would have appreciated better image quality.
Does your statement "If the NR is done by the studio prior to disk mastering, I see no reason to rely on the cheap and nasty NR in the player" go the other way as well? If studios do not perform noise reduction (or, perhaps, adequate noise reduction) prior to mastering, is NR in the player useful?
I think it is -- but I do want to be able to turn it on and off at will.
Wesleyrpg - Thursday, February 8, 2007 - link
Reads more like an advertisement for Silicon Optix than an article for AnandTech? The future of advertising? Buy an article?
JarredWalton - Thursday, February 8, 2007 - link
Hardly. People email us about all kinds of topics, and one of those has been HD video support. We've done HQV image quality comparisons before, as have many websites, and it's not too surprising that NVIDIA and ATI decoder quality improved after many of the flaws were pointed out. It appears that there are plenty of flaws with the 1080i decoding now, and I'd bet that in the future it will be dramatically improved. We find the results to be useful - i.e. both ATI and NVIDIA are doing essentially nothing with HD video other than outputting it to the display. Now, readers will know that, and maybe we'll see improvements. Not everyone cares about improving HD video quality, but for those that do this is good information to have.

Wwhat - Sunday, February 11, 2007 - link
Well, that's clearly not true: as the test shows, they both try to de-interlace; it's just not a good effort. So don't make such silly statements.
Wesleyrpg - Friday, February 9, 2007 - link
Sorry Jarred, I must have woken up on the wrong side of the bed this morning; I didn't mean to take it out on you guys. I love AnandTech, and may have been a bit confused by the article.

Sorry again