I don't get it: Fixed Framerate Tear VS V-Sync Lag

  • Okay, I feel like I have some misunderstanding, or am missing something, about the concepts behind Fixed Framerate and V-Sync, because something doesn't make sense here.

    When we set a fixed framerate, as I understand it now, we basically set how often the processing & rendering cycle repeats for the computer, which is why there is no lag.

    When we set V-Sync, it is possible to get lag if the processing and rendering are much faster than the refresh rate of the screen. I don't understand this. Does that mean V-Sync doesn't set the speed of the processing cycle, but just makes it wait and only allows a new image to be displayed once the screen is finished with the old one? Why doesn't it simply work like a fixed framerate set to the refresh rate of the screen, then?

    Now let me define lag. By lag I mean executing an input but having the related action appear on screen delayed, not in time with the input.

    Now if it works that way, I don't understand another thing. If the lag is due to the speed difference, it should theoretically get worse over time. Let's say the refresh rate of the screen is 60 Hz and the processing speed is 120 FPS, twice as fast. Then it would go like this, wouldn't it? -> 1st second - screen (showing): 2 frames; PC (has): 4 frames; 2nd second - screen: 4 frames; PC: 8 frames; 3rd second - screen: 6 frames; PC: 12 frames.

    In other words, in this example we start with a 2-frame difference, and after 3 seconds we are 6 frames behind.

    Why doesn't any of this match what I actually see?

    I just recently tried to read up on what causes screen tearing. Interestingly enough, I understand how it happens, but it never happens, even when I experiment on my laptop. I know its refresh rate is 60 Hz, but no matter what fixed framerate I set, it never tears. So I don't get that either.

    I feel like I missed something.

    Thank you for any help clarifying this.

  • When V-sync is off, the screen is drawn immediately whenever it's ready. This can be halfway through the monitor scan, which means you get tearing, but the display is always updated immediately (i.e. no lag).

    When V-sync is on, the screen waits until the monitor scan is finished before drawing. This prevents tearing, but fundamentally requires waiting until the monitor is ready, which is the lag you see. I can't remember exactly how Classic works, but there may be one or two other things at work (some kind of buffering) which introduce an extra frame or two of lag as well, which is more noticeable in V-sync mode.

    So unfortunately you can't have the best of both worlds: either you introduce a delay to prevent tearing, or you have tearing.
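    To make the trade-off concrete, here is a toy timing model (a sketch only: real V-sync is handled by the GPU driver, not application code, and the 60 Hz monitor and 120 FPS render speed are just assumed numbers):

```python
REFRESH_INTERVAL = 1 / 60  # assume a 60 Hz monitor

def present_times_no_vsync(render_time, frames):
    """V-sync off: each frame is shown the instant it finishes.
    Minimal lag, but a present can land mid-scan, which is tearing."""
    t, out = 0.0, []
    for _ in range(frames):
        t += render_time  # finish rendering the frame...
        out.append(t)     # ...and show it immediately
    return out

def present_times_vsync(render_time, frames):
    """V-sync on: a finished frame waits for the next vertical blank.
    No tearing, but the wait is the lag - and it is bounded, not growing."""
    t, out = 0.0, []
    for _ in range(frames):
        t += render_time
        t = -(-t // REFRESH_INTERVAL) * REFRESH_INTERVAL  # next blank (ceiling)
        out.append(t)
    return out
```

    With render_time = 1/120 (rendering twice as fast as the screen refreshes), the V-sync loop presents exactly one frame per refresh at 1/60, 2/60, 3/60, ... - each frame waits at most one refresh interval, and the wait never accumulates.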

  • Thank you for the fast reply, Ashley.

    But if V-Sync works that way, shouldn't the lag theoretically get worse every second? Like I wrote in my example counting the seconds?

    In other words, reading of the code is not affected by it. So I might press a button and the program registers it exactly on time, as fast as it can, but the graphics card draws the necessary frames to the buffer and sends them to the screen only when the screen is ready.

    To my current understanding, since we have two things running at different speeds, the gap between the two should get bigger over time. Like in my example.

    So the number of frames lagging behind should grow over time, but that doesn't happen in reality. That's what I don't understand.

    Indeed, why not just limit the speed of the event-processing cycle? Then it would only check the list of events/code at a time interval given by the Hz of the screen's refresh rate. Shouldn't that solve the problem very easily?

    I mean, I just don't understand: if a fixed framerate has no problem with lag the way it works, why doesn't the program just detect the user's screen refresh rate and set the fixed framerate to match it?

    In that case I don't even see the use of V-Sync; it seems very limiting.
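    For what it's worth, the "fixed framerate locked to the screen's Hz" idea above can be sketched as a simple timer loop (a hypothetical sketch, not how Construct actually works; the step callback and the 60 Hz default are assumptions):

```python
import time

def run_locked_to_refresh(step, refresh_hz=60, frames=10):
    """Run the event/logic step once per assumed monitor refresh by
    sleeping until the next tick."""
    interval = 1.0 / refresh_hz
    next_tick = time.monotonic()
    for i in range(frames):
        step(i)                               # process events / logic
        next_tick += interval
        delay = next_tick - time.monotonic()
        if delay > 0:
            time.sleep(delay)                 # sleep granularity is inexact
```

    The catch is that a timer like this knows the screen's rate but not its phase: it cannot tell where the monitor currently is in its scan, so it matches the speed without guaranteeing the buffer swap happens during the blank - which is exactly what waiting for V-sync provides.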

  • But if V-Sync works that way, shouldn't the lag theoretically get worse every second? Like I wrote in my example counting the seconds?

    In other words, reading of the code is not affected by it.

    That's not true. Code execution and rendering are synchronized. While the graphics card renders the current frame, the code executes the events for the next frame - and then waits until the graphics card is ready for the next render. When we talk about buffering on the software side, we are talking about single or double buffering at most.

    If you experience significant lag, it might be another issue. Today's graphics cards also buffer frames themselves (they do it to make movement appear smoother when the frame rate is low). For example, NVIDIA cards are set to buffer at least 3 frames by default. You could check whether setting such a card's buffer to a lower value helps.
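    The point that the game can only ever run a fixed number of frames ahead before it blocks can be shown with a toy producer/consumer simulation (a sketch; the two-frame queue depth is an assumption standing in for double buffering):

```python
from collections import deque

def backlog_over_time(refreshes, max_buffered=2):
    """The game fills a small frame queue and then blocks until the
    display frees a slot, so the gap between the newest computed frame
    and the frame on screen stays bounded instead of growing."""
    queue = deque()
    produced = 0
    history = []
    for _ in range(refreshes):
        while len(queue) < max_buffered:  # game runs ahead, then waits
            produced += 1
            queue.append(produced)
        displayed = queue.popleft()       # monitor shows one frame per refresh
        history.append(produced - displayed)
    return history
```

    Over 180 refreshes (three seconds at 60 Hz) the backlog stays at a single frame the whole time - the ever-growing gap from the earlier 60 Hz vs 120 FPS example never appears, because the faster producer simply idles once the queue is full.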

  • Think of it like a flip book.

    Fixed would be letting the pages go by as quickly as possible, even if a few pages slip by unseen every now and then.

    V-sync would be letting the pages go by a little slower, so that you can make sure that none of the pages are missed.
