NVIDIA Enables PureVideo on GeForce 6 GPUs
by Anand Lal Shimpi on December 20, 2004 1:22 PM EST - Posted in GPUs
DVD Playback Quality
Now that we've laid out the background information, it's time to look at DVD playback quality. Although NVIDIA provided us with around 700MB of test data, we took it upon ourselves to put together our own test suite for image quality comparisons. We used some tests that have been used in the home theater community as de-interlacing benchmarks, as well as others that we found to be particularly good measures of image quality.
For all of our quality tests, we used Zoom Player Pro, quite possibly one of the most feature-filled media players available.
Our first set of tests comes from Secrets of Home Theater and High Fidelity. The Galaxy Quest theatrical trailer isn't flagged at all and relies entirely on the DVD decoder's algorithms for proper de-interlacing. The default image below is ATI's X700 Pro; mouse over it to see NVIDIA's PureVideo-enabled 6600GT:
[Hold mouse over image to see NVIDIA's image quality]
NVIDIA offers a huge advantage here: the interlacing artifacts present in the ATI image are nowhere to be found in the NVIDIA image.
Next up, we have The Making of Apollo 13 documentary from the Apollo 13 DVD. Oftentimes, bonus materials on DVDs aren't properly encoded and trip up DVD decoders; let's see how ATI and NVIDIA fare here. The default image below is ATI; mouse over the image to see NVIDIA.
[Hold mouse over image to see NVIDIA's image quality]
NVIDIA once again takes the lead here; notice the combing artifacts on the man's suit coat, which are not present with NVIDIA's solution.
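Since much of this bonus material carries no film/video flags, the decoder has to decide on its own whether two fields belong to the same moment in time before weaving them into a single frame. Below is a minimal, hypothetical sketch (in Python with numpy) of the general idea behind such a check: measure how badly the two fields disagree and only weave them if the disagreement is small. The function names and threshold are our own illustration, not ATI's or NVIDIA's actual algorithm.

```python
# Hypothetical illustration only - not ATI's or NVIDIA's actual algorithm.
# Fields are 2D numpy arrays of luma values: the top field holds the even
# scanlines of a frame, the bottom field holds the odd scanlines.
import numpy as np

def combing_score(top_field: np.ndarray, bottom_field: np.ndarray) -> float:
    """Average absolute difference between each bottom-field line and the
    average of the two top-field lines that surround it. A large value
    means the fields disagree (motion or a scene change), so weaving them
    together would produce visible combing."""
    predicted = (top_field[:-1].astype(np.float32) + top_field[1:]) / 2.0
    actual = bottom_field[:-1].astype(np.float32)
    return float(np.mean(np.abs(actual - predicted)))

def safe_to_weave(top_field: np.ndarray, bottom_field: np.ndarray,
                  threshold: float = 8.0) -> bool:
    """Weave only when the fields look like two halves of the same
    progressive frame; otherwise fall back to a single-field method
    such as bob. The threshold here is arbitrary."""
    return combing_score(top_field, bottom_field) < threshold
```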
Our final test is from The Making of The Big Lebowski, off of The Big Lebowski DVD. The scene shows "The Jesus" licking a bowling ball. First, let's have a look at what the scene is supposed to look like just before it transitions to another frame:
Now let's have a look at how ATI and NVIDIA display the scene:
[Hold mouse over image to see NVIDIA's image quality]
Neither ATI nor NVIDIA passes the Big Lebowski test, so what went wrong here? The correct image above was generated by using a software decoder (DScaler 5) and forcing "bob" de-interlacing, which uses none of the data from the next field when constructing the current frame. This works because this particular scene causes most DVD decoders to incorrectly weave together two fields from vastly different scenes, resulting in the artifacts seen above. It's quite disappointing that neither ATI nor NVIDIA is able to pass this test, as it is one of the most visible artifacts of poor de-interlacing quality.
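To make the weave/bob distinction concrete, here's a minimal sketch (again in Python with numpy, purely illustrative) of the two methods. Weave interleaves two fields into one full-resolution frame, which is exactly what goes wrong above when the fields come from different scenes; bob builds the frame from a single field by interpolating the missing lines, trading vertical resolution for freedom from combing.

```python
# Illustrative sketch of the two basic de-interlacing methods, assuming
# each field is a 2D numpy array of luma values with half the frame's height.
import numpy as np

def weave(top_field: np.ndarray, bottom_field: np.ndarray) -> np.ndarray:
    """Interleave two fields into one frame. Perfect for static content,
    but if the fields come from different moments in time (like the scene
    change in the Big Lebowski clip), the result shows combing artifacts."""
    height, width = top_field.shape
    frame = np.empty((height * 2, width), dtype=top_field.dtype)
    frame[0::2] = top_field       # even scanlines from the top field
    frame[1::2] = bottom_field    # odd scanlines from the bottom field
    return frame

def bob(field: np.ndarray) -> np.ndarray:
    """Build a full frame from a single field, interpolating the missing
    scanlines. No data from the other field is used, so there is no
    combing, at the cost of half the vertical resolution."""
    height, width = field.shape
    frame = np.empty((height * 2, width), dtype=np.float32)
    frame[0::2] = field
    frame[1:-1:2] = (field[:-1].astype(np.float32) + field[1:]) / 2.0
    frame[-1] = field[-1]         # bottom line has no neighbour to blend with
    return frame
```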
62 Comments
Novaoblivion - Monday, December 20, 2004 - link
This is pretty interesting and since I already bought the Nvidia DVD Decoder I can upgrade to this new version if the link on Nvidia's site ever starts working lol.

jonny13 - Monday, December 20, 2004 - link
"Considering that PureVideo came as a free feature on GeForce 6 cards"How is paying $20 for the damn codec free?