HD Video Decode Quality and Performance Summer '07
by Derek Wilson on July 23, 2007 5:30 AM EST
Posted in: GPUs
Yozakura (High Complexity H.264) Performance
H.264 offers quite a range of options, and we haven't seen everyone taking advantage of some of the more advanced features. Yozakura is encoded in 1080i at 25 Mbps. That bitrate is fairly low relative to H.264's maximums, but the clip is still very CPU intensive to decode because it is encoded using macroblock adaptive frame/field (MBAFF) coding. MBAFF is a high quality technique for preserving maximum visual fidelity in interlaced video: each macroblock pair is adaptively coded as either frame or field based on a motion threshold.
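To make the MBAFF decision a bit more concrete, here is a rough encoder-side sketch in C. It is not taken from any real encoder: H.264 MBAFF does operate on vertical macroblock pairs (16x32 luma regions), but the SAD-based motion metric, the buffer layout, and the threshold below are our own simplifications for illustration.

```c
/*
 * Rough sketch of an encoder-side MBAFF frame/field decision.  Not from any
 * real encoder: the SAD-based motion metric and the threshold are stand-ins
 * used purely to illustrate the idea of a per-macroblock-pair choice.
 */
#include <stdint.h>
#include <stdlib.h>

enum mb_pair_mode { MB_PAIR_FRAME, MB_PAIR_FIELD };

/* Sum of absolute differences between adjacent line pairs of a 16x32 luma
 * region: a crude proxy for motion between the two fields of the pair. */
static unsigned field_difference(const uint8_t *luma, int stride, int x, int y)
{
    unsigned sad = 0;
    for (int row = 0; row < 32; row += 2) {
        const uint8_t *top = luma + (y + row) * stride + x;
        const uint8_t *bot = top + stride;
        for (int col = 0; col < 16; col++)
            sad += (unsigned)abs(top[col] - bot[col]);
    }
    return sad;
}

/* Choose frame or field coding for one macroblock pair.  Static content
 * keeps frame coding (full vertical detail); moving content switches to
 * field coding so prediction is not polluted by combing between fields. */
static enum mb_pair_mode choose_mbaff_mode(const uint8_t *luma, int stride,
                                           int x, int y, unsigned threshold)
{
    return field_difference(luma, stride, x, y) > threshold ? MB_PAIR_FIELD
                                                            : MB_PAIR_FRAME;
}
```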
While 1080p is clearly Hollywood's choice of resolution, there is 1080i encoded content out there now and more is likely to come. As TV shows transition to HD, we will likely see 1080i as the choice format due to the fact that this is the format in which most HDTV channels are broadcast (over-the-air and otherwise), 720p being the other option. It's nice to know that H.264 offers high quality interlaced HD encoding options, and we hope content authors who decide to release their creations in 1080i will take advantage of features like MBAFF.
Additionally, quality deinterlacing is essential for a good experience with movies like this. Poorly deinterlaced HD content is not only sad to watch, but gives this author quite a headache; jaggies and feathering are horrible distractions at this resolution. As long as you stick with a Radeon HD 2600 or GeForce 8600 series card or higher, you should be fine here. Anything slower just won't cut it when trying to watch 1080i on a progressive scan display.
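For a sense of what the deinterlacer actually has to do, the following is a minimal motion-adaptive sketch in C. This is not the algorithm ATI or NVIDIA implement, just the basic weave-versus-bob decision such hardware builds on; the buffer layout (both fields stored at their frame line positions in full-height buffers) and the motion threshold are assumptions.

```c
/*
 * Minimal motion-adaptive deinterlacer sketch: weave the complementary field
 * where the picture is static, interpolate vertically ("bob") where the
 * fields disagree.  Getting this decision wrong is exactly where jaggies and
 * feathering come from.  Assumes height >= 2 and full-height field buffers.
 */
#include <stdint.h>
#include <stdlib.h>

void deinterlace(const uint8_t *cur,    /* field being displayed            */
                 const uint8_t *other,  /* complementary field, same frame  */
                 uint8_t *out, int width, int height, int stride,
                 int cur_is_top, int motion_threshold)
{
    for (int y = 0; y < height; y++) {
        int line_from_cur = ((y & 1) == (cur_is_top ? 0 : 1));
        for (int x = 0; x < width; x++) {
            if (line_from_cur) {
                /* This line exists in the displayed field: copy it. */
                out[y * stride + x] = cur[y * stride + x];
            } else {
                int below  = cur[(y < height - 1 ? y + 1 : y - 1) * stride + x];
                int above  = (y > 0) ? cur[(y - 1) * stride + x] : below;
                int interp = (above + below) / 2;      /* "bob" candidate   */
                int woven  = other[y * stride + x];    /* "weave" candidate */
                /* If the woven sample disagrees strongly with its vertical
                 * neighbors, assume motion between fields and interpolate;
                 * otherwise weave for full vertical resolution. */
                out[y * stride + x] = (abs(woven - interp) > motion_threshold)
                                      ? (uint8_t)interp : (uint8_t)woven;
            }
        }
    }
}
```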
Our high end CPU copes fairly well: the 8800 GTX bests the 2900 XT, while UVD's lead over VP2 puts the 2600 XT ahead of the 8600 GTS.
For our cheap yet current processor, we do see CPU utilization go up, but the parts with hardware bitstream decoding maintain very low overhead. All of our GPUs maintain good performance when paired with this level of processor. Of course, we would likely not see the high end GPUs matched with such a CPU (unless we are looking at notebooks, but that's a whole other article).
For our older hardware, Yozakura is simply not watchable without bitstream decoding. With the numbers for the high end AMD and NVIDIA GPUs even worse than in our Transporter 2 trailer test, it's clear that NetBurst does not like whatever Yozakura is doing. It may be that bitstream decoding with MBAFF in play is particularly branch heavy, causing lots of pipeline stalls. All we can say for sure is that, once again, GPU accelerated bitstream decoding is necessary to watch H.264 content on older/slower hardware.
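To illustrate why this kind of entropy decoding punishes a long pipeline, here is a CABAC-style binary arithmetic decoder step in C. This is a structural sketch only, not the actual H.264 CABAC engine: a generic adaptive probability stands in for the spec's state transition tables. The point is the control flow: every decoded bin takes a data-dependent most/least probable symbol branch plus a data-dependent renormalization loop, and millions of hard-to-predict branches per second are exactly what NetBurst's long pipeline handles worst.

```c
/*
 * CABAC-style binary arithmetic decoder step, heavily simplified: a generic
 * adaptive probability replaces the spec's 64-entry state tables, so this is
 * a structural sketch only.  Note how the MPS/LPS decision and the
 * renormalization loop both depend on the decoded data itself.
 */
#include <stdint.h>
#include <stddef.h>

typedef struct {
    uint32_t       range, low;   /* arithmetic coder interval state          */
    const uint8_t *buf;          /* compressed bitstream                     */
    size_t         bitpos, len;  /* read position (bits) / length (bytes)    */
} bin_decoder;

typedef struct {
    uint32_t prob_mps;           /* estimated P(MPS), scaled by 65536        */
    uint8_t  mps;                /* current most probable symbol             */
} bin_context;

static int next_bit(bin_decoder *d)
{
    if (d->bitpos >= d->len * 8)               /* pad with zeros past the end */
        return 0;
    int bit = (d->buf[d->bitpos >> 3] >> (7 - (d->bitpos & 7))) & 1;
    d->bitpos++;
    return bit;
}

static int decode_bin(bin_decoder *d, bin_context *c)
{
    /* Split the interval according to the context's probability estimate. */
    uint32_t lps_range = (uint32_t)(((uint64_t)d->range *
                                     (65536 - c->prob_mps)) >> 16);
    if (lps_range == 0)
        lps_range = 1;

    int bin;
    d->range -= lps_range;
    if (d->low >= d->range) {            /* LPS path: unpredictable branch    */
        bin       = !c->mps;
        d->low   -= d->range;
        d->range  = lps_range;
        c->prob_mps -= c->prob_mps >> 5; /* an LPS occurred: lower P(MPS)     */
        if (c->prob_mps < 32768) {       /* symbols swap roles                */
            c->prob_mps = 65536 - c->prob_mps;
            c->mps      = !c->mps;
        }
    } else {                             /* MPS path                          */
        bin = c->mps;
        c->prob_mps += (65535 - c->prob_mps) >> 5;
    }
    while (d->range < (1u << 16)) {      /* renormalize: iteration count is   */
        d->range <<= 1;                  /* data dependent as well            */
        d->low     = (d->low << 1) | (uint32_t)next_bit(d);
    }
    return bin;
}
```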
63 Comments
Wozza - Monday, March 17, 2008 - link
"As TV shows transition to HD, we will likely see 1080i as the choice format due to the fact that this is the format in which most HDTV channels are broadcast (over-the-air and otherwise), 720p being the other option."I would like to point out that 1080i has become a popular broadcast standard because of it's lower broadcast bandwidth requirements. TV shows are generally mastered on 1080p, then 1080i dubs are pulled from those masters and delivered to broadcasters (although some networks still don't work with HD at all, MTV for instance who take all deliveries on Digital Beta Cam). Pretty much the only people shooting and mastering in 1080i are live sports, some talk shows, reality TV and the evening news.
Probably 90% of TV and film related Blu-rays will be 1080p.
redpriest_ - Monday, July 23, 2007 - link
Hint: They didn't. What AnandTech isn't telling you is that NO NVIDIA card supports HDCP over dual-DVI, so yeah, you know that hot and fancy 30" LCD with gorgeous 2560x1600 res? You need to drop it down to 1280x800 to get it to work with an NVIDIA solution. This is a very significant problem, and I for one applaud ATI for including HDCP over dual-DVI.
DigitalFreak - Wednesday, July 25, 2007 - link
Pwnd!

defter - Tuesday, July 24, 2007 - link
You are wrong. Check Anand's 8600 review; they clearly state that 8600/8500 cards support HDCP over dual-DVI.
DigitalFreak - Monday, July 23, 2007 - link
http://guru3d.com/article/Videocards/443/5/

Chadder007 - Monday, July 23, 2007 - link
I see the ATI cards lower CPU usage, but how are the power readings when the GPU is being used compared to the CPU?

chris92314 - Monday, July 23, 2007 - link
Does the HD video acceleration work with other programs, and with non Blu-ray/HD DVD sources? For example, if I wanted to watch an H.264 encoded .mkv file, would I still see the performance and image enhancements?

GPett - Monday, July 23, 2007 - link
Well, what annoys me is that there used to be All-in-Wonder video cards for this kinda stuff. I do not mind a product line that has TV tuners and HD playback codecs, but not at the expense of 3D performance.

It is a mistake for ATI and NVIDIA to try to include this stuff on all video cards. The current 2xxx and 8xxx generation of video cards might not have been as pathetic had the two GPU giants focused on actually making a good GPU instead of adding features that not everyone wants.
I am sure lots of people watch movies on their computer. I do not. I don't want a GPU with those features. I want a GPU that is good at playing games.
autoboy - Wednesday, July 25, 2007 - link
All-in-Wonder cards are a totally different beast. The All-in-Wonder card was simply a combination of a TV tuner card (and a rather poor one) and a normal graphics chip. The TV tuner simply records TV and has nothing to do with playback. ATI no longer sells All-in-Wonder cards because the TV tuner did not go obsolete quickly, while the graphics core in the AIW card did, requiring the buyer to purchase another expensive AIW card when only the graphics part was obsolete. A separate tuner card made much more sense.

Playback of video is a totally different thing, and the AIW cards performed exactly the same as regular video cards based on the same chip. At the time, playing video on the PC was more rare, and the video playback of all cards was essentially the same because no cards offered hardware deinterlacing. Now, video on the PC is abundant and is the new killer app (besides graphics) driving PC performance, storage, and internet speed. NVIDIA was first to the party with PureVideo support, which did hardware deinterlacing for DVDs and SD TV on the video card instead of in software. It was far superior to any software solution at the time (save a few diehard fans of DScaler with IVTC) and came out at exactly the right time, with the introduction of Media Center, cheap TV tuner cards, and HD video. Now, PureVideo 2 and AVIVO HD bring the same high quality deinterlacing to HD video for MPEG-2 (the 7600 GT and up could do HD MPEG-2 deinterlacing) as well as VC-1 and H.264 content. If you don't think this is important, remember that all new satellite HD broadcasts coming online are 1080i H.264, which requires deinterlacing to look its best, and products already exist (if you are willing to work for it) that allow you to record this content on your computer. Also, new TV series are likely to be released in 1080i on HD discs because that is their most common broadcast format. If you don't need this, fine, but they sell a lot of cards to people who do.
autoboy - Wednesday, July 25, 2007 - link
Oh, I forgot to mention that only the video decode acceleration requires extra transistors; the deinterlacing calculations are done on the cards' programmable shaders, requiring no additional hardware, just extra code in the drivers. The faster the video card, the better your deinterlacing, which explains why the 2400 and the 8500 cannot get perfect scores on the HQV tests. You can verify this on HD 2x00 cards by watching the GPU% in RivaTuner while forcing different adaptive deinterlacing modes in CCC. This only works in XP btw.