[r132] "Every x seconds" isn't accurate

  • Link to .capx file (required!):


    Steps to reproduce:

    1. Have two different variables.

    2. On one of them, add 1 on an "Every 0.01 seconds" condition.

    3. On the other variable, add 1 on an "Every 0.1 seconds" condition.

    Observed result:

    "Every 0.1" and "Every 0.01" give different time results when using them. It appears that "Every 0.01" will count slower than "Every 0.1".

    When adding 0.01, the 3rd digit holds the seconds, whilst when adding 0.1, the 2nd digit holds the seconds.

    Also, whilst setting up this bug, I added a second timer method that also appears to be completely out of sync. This second method adds dt to a variable ("dt3"), and when "dt3" is >= 0.1, it adds 1 to the milliseconds and resets "dt3" back to 0.

    This should logically add up time the same way as the "Every" event, but it does not stay in sync with the "Every" events.

    There is also "dt4" checking if it reaches >=0.01

    I found this whilst trying to make a visual timer using sprites. I thought it was to do with the ordering of events, but found that I couldn't keep milliseconds accurate when adding 1 millisecond (0.01); I can only add 10 milliseconds (0.1) at a time to avoid falling out of sync.
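    A quick JS sketch of the accumulator method described above shows why it falls behind (variable names and a fixed 60 FPS tick of 0.016 s are illustrative assumptions, not taken from the .capx):

```javascript
// Sketch of the "dt3" accumulator timer: add dt each tick, and when the
// accumulator reaches the threshold, increment the counter and reset to 0.
// Fixed 16 ms ticks assumed for illustration.
function simulateResetTimer(ticks, dt, threshold) {
  let acc = 0;   // the "dt3" accumulator
  let count = 0; // the timer counter being incremented
  for (let i = 0; i < ticks; i++) {
    acc += dt;
    if (acc >= threshold) {
      count += 1;
      acc = 0; // resetting to 0 discards the overshoot past the threshold
    }
  }
  return count;
}

const counted = simulateResetTimer(1000, 0.016, 0.1); // 1000 ticks = 16 s
console.log(counted);             // 142: fires every 7 ticks, not every 6.25
console.log((1000 * 0.016) / 0.1); // 160 increments expected in 16 s
```

    At 16 ms per tick, the accumulator only crosses 0.1 every 7th tick (0.112 s), and resetting to zero throws away the extra 0.012 s each time, so the counter drifts ever further behind real time.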

    Expected result:

    "Every 0.01" should be keeping synced with time.

    Browsers affected:

    Chrome: yes

    Firefox: yes

    Operating system & service pack:

    Windows 7 Home 64-bit

    Construct 2 version:


  • Yeah, you really shouldn't expect that kind of accuracy with JavaScript.

    Beyond that, browsers will vary wildly, and you would be lucky to get accurate seconds, let alone milliseconds, on mobiles.

  • Closing as not a bug; you cannot expect this precision with any game engine.

    Most games run at 60 FPS, which means each frame takes about 16 ms. The game logic also runs every 16 ms. If you say "Every 0.01 seconds", you probably mean for it to run every 10 ms. However, by design, the 'Every' condition never runs more than once per tick, so you lose 6 ms every time it runs. The correct solution is to use dt.
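    The once-per-tick cap described above can be sketched in JS (a simplified model of the scheduling, not Construct 2's actual source; a fixed 60 FPS tick is assumed):

```javascript
// Simplified model of an "Every X seconds" condition that can fire at
// most once per tick (illustrative only, not the engine's real code).
function simulateEvery(interval, dt, ticks) {
  let sinceLastFire = 0;
  let fires = 0;
  for (let i = 0; i < ticks; i++) {
    sinceLastFire += dt;
    if (sinceLastFire >= interval) {
      fires += 1;        // fires once, even if several intervals elapsed
      sinceLastFire = 0; // reschedules from this tick
    }
  }
  return fires;
}

const dt = 1 / 60; // ~16.7 ms per tick
console.log(simulateEvery(0.01, dt, 600)); // 600: once per tick = 60/s, not 100/s
```

    Over 600 ticks (10 seconds), "Every 0.01" can only fire 600 times instead of the 1000 you might expect, which matches the slow counting observed in the report.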

    Edit: note your accumulator logic is also incorrect. When the accumulator ("dt3") reaches 0.1, you must subtract 0.1 from it, not set it to zero. Otherwise, if it is at 0.11, resetting to zero loses 0.01 seconds; if you subtract 0.1, it's left at 0.01, so the overshoot is compensated for on the next event. You'll also hit the same framerate problem with the 0.01 case.
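    A hedged sketch of the suggested fix (illustrative values; fixed 16 ms ticks assumed): subtracting the interval carries the overshoot forward, so the counter tracks wall-clock time instead of drifting.

```javascript
// Accumulator timer that subtracts the interval instead of resetting to
// zero, as suggested above (illustrative sketch, fixed 16 ms ticks).
function simulateSubtractTimer(ticks, dt, interval) {
  let acc = 0;
  let count = 0;
  for (let i = 0; i < ticks; i++) {
    acc += dt;
    while (acc >= interval) { // can catch up if several intervals elapsed
      count += 1;
      acc -= interval;        // keep the overshoot for the next check
    }
  }
  return count;
}

const elapsed = 1000 * 0.016; // 16 s simulated
const count = simulateSubtractTimer(1000, 0.016, 0.1);
console.log(count, elapsed / 0.1); // stays close to the expected 160
```

    Because the leftover fraction is preserved between checks, the long-run count stays within one increment of elapsed-time / interval, whereas the reset-to-zero version loses part of a frame on every increment.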

  • Ahh, I understand. You learn something new every day.
