Coming Soon to HD DVD: Silicon Optix HD HQV
by Derek Wilson on February 8, 2007 1:25 AM EST
Posted in: GPUs
Introduction
In the past, when testing video playback features of PC graphics hardware, we have looked at the HQV benchmark by Silicon Optix. Over the years, HQV scores have improved, as we can see when comparing our first article on the subject to one written four months later. Current scores are nearly perfect on both NVIDIA and AMD hardware. But there is something lacking in these tests: they only provide insight into how hardware performs when handling standard definition content.
With the introduction of HD DVD and Blu-ray content, we have been waiting for a benchmark with which to test the image quality of HD playback. Graphics hardware may ultimately have less of an impact on the HD viewing experience because media and players natively support 1080p, but it is still an important link in the chain. Interlaced media is available on both HD DVD and Blu-ray, and high quality deinterlacing at HD resolutions is just as important as it is on DVDs.
The benchmark not only looks at deinterlacing quality, but noise reduction as well. Noise can actually be more of a problem on HD video due to the clarity with which it is rendered. While much of the problem with noise could be fixed if movie studios included noise reduction as a post processing step, there isn't much content on which noise reduction is currently performed. This is likely a combination of the cost involved in noise reduction as well as the fact that it hasn't been as necessary in the past. In the meantime, we are left with a viewing experience that might not live up to the expectations of viewers, where a little noise reduction during decoding could have a huge impact on the image quality.
There are down sides to noise reduction, as it can reduce detail. This is especially true if noise was specifically added to the video for effect. We don't run into this problem often, but it is worth noting. On the whole, noise reduction will improve the clarity of the content, especially with the current trend in Hollywood to ignore the noise issue.
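To make that tradeoff concrete, here is a minimal sketch (Python/NumPy, purely illustrative and not anything resembling what the decoders or GPUs actually implement) of motion-adaptive temporal noise reduction: pixels that barely change between frames are blended toward the previous frame, while pixels that change a lot are assumed to be real motion or detail and left alone. The strength and threshold values are arbitrary.

```python
import numpy as np

def temporal_nr(prev_frame, cur_frame, strength=0.5, motion_thresh=12.0):
    """Motion-adaptive temporal noise reduction on 8-bit luma frames.

    Pixels whose frame-to-frame difference is small (likely noise) are blended
    toward the previous frame; pixels with a large difference (likely motion or
    real detail) are left untouched, which reflects the detail-vs-noise
    tradeoff discussed above."""
    prev = prev_frame.astype(np.float32)
    cur = cur_frame.astype(np.float32)
    diff = np.abs(cur - prev)
    # Blend weight fades to zero as the difference approaches the threshold.
    weight = strength * np.clip(1.0 - diff / motion_thresh, 0.0, 1.0)
    out = cur * (1.0 - weight) + prev * weight
    return out.astype(cur_frame.dtype)
```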
We have wanted to play with an HD version of HQV for a while, and we are glad to have our hands on this early version. Before we take a look at just how the competition stacks up, we will look at the tests themselves and Silicon Optix's scoring system.
27 Comments
bigpow - Monday, February 12, 2007
I'd like to see more results, maybe from Xbox 360 HD DVD & Toshiba HD DVD players, before I can be convinced that ATI & NVIDIA totally suck.

thestain - Sunday, February 11, 2007
Suggest a redo.

ianken - Friday, February 9, 2007
...I meant that in the context of post processing. FWIW.

ianken - Friday, February 9, 2007
Since every HD DVD and BRD I've seen is authored at 1080p, I don't think 1080i film cadence support is that critical for either next-gen disc format. It is critical for HD broadcasts, where 1080i content is derived from telecined film or HD 24p content and not flagged, which is very, very common on cable and OTA feeds.
Noise reduction: just say no. It is NOT more important for HD. Noise reduction simply replaces random noise with deterministic noise and reduces true detail; I don't care how much magic is in there. With FUBAR analog cable it can make an unwatchable image moderately palatable, but keep it away from my HD DVD, BRD, or broadcast HD content.
On my 7800 GTX I get film cadence detection and adaptive per-pixel vector deinterlacing on 1080i. The problem you're seeing may be with the HD DVD/decoder app failing to properly talk to the GPU. On XP they need to support proprietary APIs to get anything beyond base VMR deinterlacing, particularly for HD. With Cyberlink there is even a "PureVideo" option in the menus for this. If they do not support PureVideo HD then you will get none of those advanced features on Nvidia hardware. Not sure what ATI does, but I do believe they only support film cadence and noise reduction on SD content.
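For readers wondering what "film cadence detection" actually involves, the sketch below (Python/NumPy, purely illustrative and not what any driver actually does) shows the basic idea of spotting 3:2 pulldown: telecined 24p film repeats one field out of every five, so one same-parity field difference per five-field cycle is nearly zero.

```python
import numpy as np

def cadence_phase_energy(fields, period=5):
    """Average the same-parity field differences over each phase of a candidate
    cadence period. With 3:2 pulldown, one phase per 5-field cycle corresponds
    to a repeated field, so its average difference is close to zero."""
    diffs = []
    for i in range(2, len(fields)):
        a = fields[i].astype(np.float32)
        b = fields[i - 2].astype(np.float32)  # previous field of the same parity
        diffs.append(float(np.mean(np.abs(a - b))))
    diffs = np.array(diffs)
    usable = len(diffs) - (len(diffs) % period)
    return diffs[:usable].reshape(-1, period).mean(axis=0)

def looks_like_32_pulldown(fields, ratio=0.2):
    """Heuristic: one phase of the 5-field cycle is far quieter than the rest,
    suggesting telecined film that could be re-woven back to progressive frames."""
    phases = cadence_phase_energy(fields)
    return bool(phases.min() < ratio * np.median(phases))
```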
peternelson - Friday, February 9, 2007
"Noise can actually be more of a problem on HD video due to the clarity with which it is rendered. While much of the problem with noise could be fixed if movie studios included noise reduction as a post processing step, there isn't much content on which noise reduction is currently performed. This is likely a combination of the cost involved in noise reduction as well as the fact that it hasn't been as necessary in the past. In the meantime, we are left with a viewing experience that might not live up to the expectations of viewers, where a little noise reduction during decoding could have a huge impact on the image quality.There are down sides to noise reduction, as it can reduce detail. This is especially true if noise was specifically added to the video for effect. We don't run into this problem often, but it is worth noting. On the whole, noise reduction will improve the clarity of the content, especially with the current trend in Hollywood to ignore the noise issue. "
Doing noise reduction at the player is less than ideal. You take noisy content and then waste much of your data rate describing noise. The NR should be done as a PRE-PROCESSING (as opposed to POST) step prior to feeding the encoder (not post processing as you suggest). Any movie studios making discs without NR are just lazy, and the customer deserves better. Obviously a generous bitrate and an efficient encoding standard like MPEG-4 are desirable, but you waste the benefit if you don't either noise-reduce the content or have substantively noise-free content like CGI animation sequences from Pixar.
Thus the workflow ought to be: telecine scan data or digital intermediate (e.g. 2K film res), into colour correction, into pan/scan cropping or aspect ratio conversion scaling (e.g. Cinemascope into 16:9), then into noise reduction (spatial and temporal etc.), into the encoder.
Done professionally, different portions of the movie can be encoded with different processing parameters which kick in at the desired timecodes. These are often hand-optimised for sequences that can benefit from them. Such setups may be called ECLs (encoder control lists), rather like EDLs (edit decision lists).
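The ECL idea is easy to picture as data: a list of timecodes, each carrying the encoder and NR parameters that take effect from that point. The toy sketch below is hypothetical Python; the field names, timecodes, and values are invented for illustration and do not correspond to any real encoder's control format.

```python
# Hypothetical "encoder control list": each entry gives a start timecode (in
# seconds here, for simplicity) and the parameters that apply from that point
# until the next entry takes over.
example_ecl = [
    (0.0,    {"quantiser": 18, "nr_strength": 0.3}),
    (1260.0, {"quantiser": 14, "nr_strength": 0.6}),  # hand-tuned grainy scene
    (1395.0, {"quantiser": 18, "nr_strength": 0.3}),  # back to programme defaults
]

def active_params(ecl, timecode):
    """Return the parameters of the last ECL entry at or before this timecode.
    Assumes the list is sorted by start timecode."""
    params = {}
    for start, overrides in ecl:
        if start <= timecode:
            params = overrides
        else:
            break
    return params

print(active_params(example_ecl, 1300.0))  # -> {'quantiser': 14, 'nr_strength': 0.6}
```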
Equipment to do excellent realtime noise reduction in high definition is readily available eg from Snell and Wilcox, and if you can't afford it you should either not be in the encoding business, or should be hiring it for the duration of the job from a broadcast hire supplier. Alternatively NR processing may be a feature of your telecine/datacine capture platform.
Ideally the encoded streams can be compared with the source material to identify any significant encoding artifacts like noticeable DCT macroblocking. This is basic QA and can be done in software and/or visually/manually.
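A software version of that QA pass can be as simple as computing PSNR between source and decoded frames and flagging the ones that fall below a chosen floor. A rough sketch follows; the 38 dB threshold is an arbitrary illustration, and as noted above real QA would also involve eyes on a monitor.

```python
import numpy as np

def psnr(source, encoded, peak=255.0):
    """Peak signal-to-noise ratio between a source frame and its decoded
    counterpart; a crude stand-in for the comparison step described above."""
    mse = np.mean((source.astype(np.float64) - encoded.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)

def flag_bad_frames(frame_pairs, threshold_db=38.0):
    """Yield indices of frames whose PSNR drops below the threshold, i.e. the
    places where artifacts such as DCT macroblocking are most likely visible."""
    for i, (src, enc) in enumerate(frame_pairs):
        if psnr(src, enc) < threshold_db:
            yield i
```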
If the NR is done by the studio prior to disk mastering, I see no reason to rely on the cheap and nasty NR in the player, and of course using a display capable of the proper bit depth and resolution will avoid quantisation banding and scaling degradation.
Poor attention to production values is diminishing the experience of what ought to be great content.
Contrary to your statement, noise reduction ought to have been used at standard definition too by anyone doing encoding professionally for DVDs etc. Even moderately expensive/affordable gear from FOR-A could do NR and colour correction using SDI digital ins and outs (that's if you can't afford the Snell gear LOL). The difference is certainly noticeable even before moving to HD content and bigger screens.
Not all noise reduction techniques reduce detail, particularly when done at the preprocessing stage. Taking noise out makes more bits available for the denoised content to be described in MORE detail at an equivalent bitrate. Clever algorithms are able to take out hairs from frames of movie film and replace them with what ought to be there from adjacent frames (including using motion vector compensation). At this stage the maximum uncompressed source data is available on which to perform the processing, whereas NR in the player suffers from only having the bit-constrained compressed material to work from. Other pre-processing might include removing camera shake (e.g. Snell Shakeout) so that compression bits are not wasted on spurious motion vectors where these are undesired. Genuine pans, zooms etc. can be distinguished and still get encoded.
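The dust-and-hair repair described here is, at its simplest, a temporal median: a defect that exists in only one frame is discarded when each pixel is replaced by the median of itself and its temporal neighbours. The sketch below shows only that basic idea; real restoration gear motion-compensates the neighbouring frames first, which this toy does not.

```python
import numpy as np

def temporal_median_repair(prev_f, cur_f, next_f):
    """Replace every pixel with the median of the previous, current, and next
    frames. Dust, hairs, and other single-frame defects appear in only one of
    the three, so the median throws them away while static detail survives.
    (No motion compensation here, so fast-moving areas would smear.)"""
    stack = np.stack([prev_f, cur_f, next_f]).astype(np.float32)
    return np.median(stack, axis=0).astype(cur_f.dtype)
```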
You rightly point out that video using deliberately added noise to simulate film grain can be troublesome to encode, but there are several other techniques for making video appear film-like, e.g. Magic Bullet hardware or software as pioneered by The Orphanage, which can do things like alter the gamma curve and replicate various film lab processes such as bleach bypass (as in the opening sequences of Saving Private Ryan).
DerekWilson - Sunday, February 11, 2007
Thanks for the very informative post. I think we've got a bit of a miscommunication though ...
I'm not referring to post processing as post-encoding -- I'm referring to it as Hollywood refers to it -- post-filming ... as in "fix it in post". You and I are referring to the same step in the overall scheme of things: after filming, before encoding.
It seems a bit odd that I hadn't heard anyone talk about processing from the perspective of the encoding step before, as a brief look around Google shows that it is a very common way of talking about handling content pre- and post-encoding.
In any event, it may be that studios who don't do noise reduction are just lazy. Of course, you'd be calling most of them lazy if you say that. We agree that the customer deserves better, and currently they aren't getting it. Again, go pick up X-Men 3. Not that I liked the movie, but I certainly would have appreciated better image quality.
Does your statement "If the NR is done by the studio prior to disk mastering, I see no reason to rely on the cheap and nasty NR in the player" go the other way as well? If studios do not perform noise reduction (or, perhaps, adequate noise reduction) prior to mastering, is NR in the player useful?
I think it is -- but I do want to be able to turn it on and off at will.
Wesleyrpg - Thursday, February 8, 2007
Reads more like an advertisement for Silicon Optix than an article for AnandTech? The future of advertising? Buy an article?
JarredWalton - Thursday, February 8, 2007
Hardly. People email us about all kinds of topics, and one of those has been HD video support. We've done HQV image quality comparisons before, as have many websites, and it's not too surprising that NVIDIA and ATI decoder quality improved after many of the flaws were pointed out. It appears that there are plenty of flaws with the 1080i decoding now, and I'd bet that in the future it will be dramatically improved. We find the results to be useful - i.e. both ATI and NVIDIA are doing essentially nothing with HD video other than outputting it to the display. Now, readers will know that and maybe we'll see improvements. Not everyone cares about improving HD video quality, but for those that do this is good information to have.

Wwhat - Sunday, February 11, 2007
Well, that's clearly not true; they both try to de-interlace, the tests show, it's just not a good effort, so don't make such silly statements.
Wesleyrpg - Friday, February 9, 2007
Sorry Jarred, I must have woken up on the wrong side of the bed this morning; I didn't mean to take it out on you guys. I love AnandTech, and may have been a bit confused by the article. Sorry again.