We don't often get very involved in the discussion of what options people should always enable when they play games; rather, we tend to focus on what we test with. Honestly, our recommended settings for playing the games we test would be very similar to the settings we use to benchmark, with one very important exception: we would enable triple buffering (which implies vsync) whenever possible. While it's not an available option in all games, it really needs to be, and we are here to make the case for why gamers should use triple buffering and why developers need to support it.
When it comes to anything regarding vsync, gamers most often swear by forcing it off in the driver or disabling it in the game. In fact, this is what we do when benchmarking, because it allows us to see more clearly what is going on under the hood. Those who do enable vsync typically do so to avoid the visual "tearing" that can occur in some cases, despite the negative side effects.
We would like to try something a little different with this article. We'll include two polls, one here and one at the end of the article. This first poll is designed to report what our readers already do with respect to vsync and double versus triple buffering.
[Poll: what do you currently do with respect to vsync and double versus triple buffering?]
After reading the rest of this article, our readers are invited to answer a related poll designed to determine whether arming gamers with this information will have any impact on the settings they use from here on out.
First up will be a conceptual review of what double buffering and vsync are, then we'll talk about what triple buffering brings to the table. For those who really want the nitty gritty (or who need more convincing), we will follow that up with a deeper dive into each approach, complete with some nifty diagrams.
184 Comments
davidri - Sunday, July 26, 2009 - link
I'll just stick to vsync on with whatever the default frame buffer is in the Nvidia control panel. I get very good performance (most games I play run at 60fps) and no vertical tearing. I don't care to deal with third party apps.
griffhamlin - Thursday, July 16, 2009 - link
You can force triple buffering in DX. Some apps exist for that. D3DOverrider, for one...
Muhammed - Wednesday, July 8, 2009 - link
I must say, excellent article up to page 2; after that things started to get REAL messy. I don't consider myself too stupid, nor too genius, but I am confident I am smart, and everything was fine till the second page, where you explained the principles of the idea. I quickly understood it from one concentrated read, but the horses example is simply HORRIBLE. I understand you didn't want to waste 9 pages on a simple thing like V-Sync, so you wrapped up the concept quickly, but this has left us readers really confused.
Firstly, you started slow (page 2), elaborating on every little detail, and then you provided an example that should have made the picture even clearer; but on the contrary, you put a lot of possibilities and new concepts into this example, and you successfully made it MORE COMPLEX instead of SIMPLER.
Secondly, the horrible elaboration in the example made it even more convoluted; adding complexity to the equation = HORRIBLE example.
I am waiting for a follow-up article, even one with 18 pages. I will read them all, every last letter, for this is the price of knowledge. Just remember: SIMPLIFY and ELABORATE.
Thank you for your understanding.
quarup - Tuesday, July 7, 2009 - link
The following seems confusing; could you please clarify or reword it: "In double buffering, this happens with every frame even if the next frames done after the monitor is finished receiving and drawing the current frame (meaning that it might not be displayed at all if another frame is completed before the next refresh)."
It sounds like it says two contradicting things about double-buffering + vsync:
1. a swap buffer happens once per frame
2. a frame might be skipped if we're rendering frames too fast (this sounds more like triple buffering?)
Also:
"With triple buffering, front buffer swaps only happen at most once per vsync."
Isn't this true with double buffer + vsync, too?
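One way to picture the difference being asked about here is with a toy timeline. This is only an illustration with made-up numbers (a 60 Hz display and 10 ms per frame), not code from the article: with double buffering plus vsync the renderer stalls after each frame, so at most one new frame completes and is swapped per refresh and nothing is discarded, while with triple buffering the renderer keeps going, so when two frames finish inside one refresh only the newer one ever reaches the front buffer.

```c
#include <stdio.h>

int main(void) {
    const double refresh_ms = 16.7;  /* 60 Hz display */
    const double render_ms  = 10.0;  /* hypothetical per-frame render time */
    double t = 0.0;
    int frame = 0, newest = -1;

    for (int v = 1; v <= 4; v++) {
        /* With triple buffering the renderer never stalls: it keeps drawing
           into whichever back buffer is free until the next refresh.        */
        while (t + render_ms <= v * refresh_ms) {
            t += render_ms;
            newest = frame++;
        }
        /* At the vsync, only the newest completed frame is flipped to the
           front buffer; any older completed frame is never displayed.       */
        printf("vsync %d: frame %d is displayed (frames 0..%d completed)\n",
               v, newest, frame - 1);
    }
    return 0;
}
```

Running this, frame 1 and frame 3 complete but are never shown, whereas under double buffering with vsync the renderer would have waited and every frame would have made it to the screen, one per refresh.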
pakotlar - Sunday, July 5, 2009 - link
Triple buffering is great, but tighter integration between the abstraction layer and developer tools, along with general programming protocols on GPUs, should (maybe with the use of a dynamic LOD system like SPVO) kill the need for triple buffering or vsync. There has to be a better solution in place today for homogeneous hardware.
happymanz - Saturday, July 4, 2009 - link
Hi, I mostly play games at 1024x768@120Hz if they are non-competitive, and 800x600/640x480@160Hz if they are competitive. (I am not able to notice any tearing at 160Hz.)
What settings are recommended for gamers using CRT or 120Hz LCD monitors? (Most people will not notice any tearing even at 120Hz.)
A lot of older (and still popular) games run various versions of the Quake engine, where physics are affected by the FPS (I'm no expert on the matter).
(I have yet to see any LCD monitor get close in terms of image quality, and so far it seems you can't have your cake and eat it as well when it comes to different types of panels.)
urebelscum - Thursday, July 2, 2009 - link
Nice article; I loved seeing the example. However, from reading all the comments, I think a follow-up article is needed. First, another example is needed: what happens when rendering is slower than the monitor refresh. I thought I pictured what would happen, but now I'm not sure. Maybe another example covering what happens when a game drops below the 60 fps threshold (though if the other example is clear, maybe not). The rest of the follow-up should include more info: basically, add render ahead to the first example, plus a list of which games use true triple buffering and which use mis-named render ahead. The last is why I still don't use "triple buffering" all the time. It seems most games I play are calling render ahead by the wrong name, so I leave the wrongly-named "triple buffering" disabled.
Two things that probably are beyond the scope are: how to tell if a game is using true triple buffering or render ahead, and what devs need to do to use true triple buffering. (I'm following a couple of open source games that say they support triple buffering, but might be using render ahead.)
DerekWilson - Thursday, July 2, 2009 - link
Thanks for the feedback. I'm looking into the possibility of a follow-up and appreciate your suggestions of things to look into.
castanza - Tuesday, June 30, 2009 - link
Enabling triple buffering whenever possible is not the right idea. Again, the choices are:
1) double buffer w/o vsync
2) double buffer w/ vsync
3) triple buffer w/ vsync
Suppose your screen refreshes @ 60 Hz (pretty common now).
The key question is: can your machine pump out 60 fps consistently for the game in question?
If it can, then you probably want double buffer w/ vsync. I enabled this setting in L4D and I can enjoy NO TEARING with minimum lag because L4D gives very nice frame rates on my machine, not often dipping below 60 fps.
If it can't, then your choice depends on how well you can tolerate lag in this particular game. In some games you may not notice it; in others you will. I find it less noticeable in driving games than FPS games, for example. I tried this setting in L4D, and I found the added lag unacceptable. Anyway, if you don't mind a bit of extra input lag in this particular game, then you want triple buffer w/ vsync. Otherwise, if getting rid of lag is more important than eliminating tearing, you'll choose double buffer w/o vsync.
That's my $0.02 on this subject :)
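castanza's decision process above can be written down as a rough sketch. The 60 fps threshold comes from the 60 Hz example; sustained_fps and lag_sensitive are hypothetical inputs the player judges per game, not settings from the article or any driver:

```c
#include <stdbool.h>
#include <stdio.h>

typedef enum { DOUBLE_NO_VSYNC, DOUBLE_VSYNC, TRIPLE_VSYNC } BufferMode;

/* sustained_fps and lag_sensitive are judgments the player makes per game;
   they are inputs to the decision, not options exposed anywhere.           */
static BufferMode choose_mode(double sustained_fps, bool lag_sensitive) {
    if (sustained_fps >= 60.0)    /* the machine can hold the refresh rate  */
        return DOUBLE_VSYNC;      /* no tearing, minimal added lag          */
    if (lag_sensitive)            /* lag bothers you more than tearing      */
        return DOUBLE_NO_VSYNC;
    return TRIPLE_VSYNC;          /* tearing bothers you more than lag      */
}

int main(void) {
    printf("%d\n", choose_mode(75.0, false)); /* 1: double buffer w/ vsync  */
    printf("%d\n", choose_mode(45.0, true));  /* 0: double buffer w/o vsync */
    printf("%d\n", choose_mode(45.0, false)); /* 2: triple buffer w/ vsync  */
    return 0;
}
```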
Hrel - Monday, June 29, 2009 - link
Sounds to me like the best option would be a triple buffer with a rendering queue, coupled with a rendered-frames management feature. From this, I THINK that to get the best image, a COMPLETED frame should be shown on every single refresh of the monitor.
And in order to reduce lag as much as possible, the most recent fully rendered frame should be put out to the monitor, and all the older frames should be thrown out. That means skipping could occur, but with proper management it should be so minor that we really won't notice.
Especially as monitor refresh rates go up to 120Hz and beyond.
Comments anyone???
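What Hrel describes (always flip to the newest fully rendered frame and recycle the stale ones) matches the behavior the article argues for. Below is a minimal sketch of that buffer rotation under the made-up assumption that the renderer finishes two frames per refresh; the bookkeeping is illustrative, not a real driver API:

```c
#include <stdio.h>

int main(void) {
    int front = 0;     /* buffer currently scanned out by the display   */
    int drawing = 1;   /* buffer the renderer is currently drawing into */
    int newest = -1;   /* most recently completed, not-yet-shown buffer */
    int frame = 0;

    for (int refresh = 0; refresh < 4; refresh++) {
        /* Toy assumption: the renderer finishes two frames per refresh. */
        for (int i = 0; i < 2; i++) {
            printf("frame %d finished in buffer %d\n", frame++, drawing);
            newest = drawing;
            drawing = 3 - front - newest;   /* the one remaining free buffer */
        }
        /* On vsync, flip straight to the newest completed frame; the older
           completed frame is simply recycled instead of being shown late.  */
        front = newest;
        printf("refresh %d: displaying buffer %d\n\n", refresh, front);
    }
    return 0;
}
```

With only two buffers the renderer would have nowhere to draw while waiting for the flip; that stall is exactly what the third buffer removes.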