Every X Milliseconds problem

  • http://dl.dropbox.com/u/6660860/timer.cap

    This is most likely me being an idiot, but could anyone explain to me why the "bad timer" counts the milliseconds so slowly? You will notice I am increasing the Millisecond variable by 1 every 1 millisecond, resetting it to 0 when it reaches 1000 (1 second) and then increasing the Seconds variable by 1, but it takes FOREVER to reach 1000.

    If you look at the "good timer", however, where I simply increase the seconds by 1 every 1000 ms, it works just fine.

    I hope I made sense here.

    Thanks

  • That's just because you ignored what you can read if you open the wizard for the "every x milliseconds" parameters. It says:

    "This is only accurate to a resolution of ~10 milliseconds."

    Well, you are using a resolution of 1 ms...

    Set it to a value higher than 10 and it will work.
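
    To illustrate the point, here is a hypothetical Python sketch (not Construct code; the function name is mine): the condition can fire at most once per tick, so at a capped framerate an "every 1 ms" counter falls far behind real time.

    ```python
    def counted_milliseconds(fps, real_seconds):
        # The "every 1 ms" event can trigger at most once per tick,
        # adding 1 each time, so the count is bounded by the tick count.
        ticks = int(fps * real_seconds)
        return ticks

    # At 60 fps, one real second (1000 ms) yields only 60 counted
    # "milliseconds", so the counter appears to run ~16x too slowly.
    print(counted_milliseconds(60, 1))    # 60
    print(counted_milliseconds(2800, 1))  # 2800: an uncapped framerate nearly keeps up
    ```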

  • There are a couple of other things I can add about this.

    As tulamide mentioned, Construct warns about the resolution of the every x milliseconds condition. However, it's generally not even good to 10ms resolution under the default settings. Under the application settings, "Framerate Mode" is set by default to "V-Synced", which effectively limits the resolution to 1/60 (or 16.667 ms) on my computer. It lags on mine until I set the increment to 17 or more.

    It's actually close to keeping up with the 1ms increment if I change the framerate mode to "Unlimited", where I get a framerate of around 2800. In practice, I've never felt the need to go below 50ms for any such condition, myself.

    Also, I've found it to be good practice not to rely on exact equality checks for counters like the one you use here. You would notice that it breaks if 1000 is not a multiple of the increment value (as with 17). I would usually use:

    + System: Is global variable 'MillisecondsB' Greater or equal 1000

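    A hypothetical Python sketch of why the exact-equality check breaks (the function names are mine, not from the .cap): with an increment of 17 the counter jumps straight past 1000, so an `== 1000` test never fires, while `>= 1000` always catches the rollover.

    ```python
    def equality_rollovers(increment, ticks):
        # Roll over only when the counter lands exactly on 1000.
        ms, seconds = 0, 0
        for _ in range(ticks):
            ms += increment
            if ms == 1000:
                ms = 0
                seconds += 1
        return seconds

    def gte_rollovers(increment, ticks):
        # Roll over whenever the counter reaches or passes 1000.
        ms, seconds = 0, 0
        for _ in range(ticks):
            ms += increment
            if ms >= 1000:
                ms -= 1000
                seconds += 1
        return seconds

    print(equality_rollovers(17, 1000))  # 0: ...986, 1003 - never exactly 1000
    print(gte_rollovers(17, 1000))       # 17: every rollover is caught
    ```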

  • Citnarf

    If you simply want the time elapsed since the start of the application, you could use the system expression "Timer", which gives the total elapsed milliseconds. Then, to split it up into minutes, seconds and milliseconds, use the following formulas:

    minutes:		floor(Timer / 1000 / 60)
    seconds:		floor(Timer / 1000) % 60
    milliseconds:	Timer % 1000
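    As a quick check of those formulas, a hypothetical Python sketch (integer division stands in for floor):

    ```python
    def split_timer(timer_ms):
        # Mirrors the formulas above:
        # floor(Timer / 1000 / 60), floor(Timer / 1000) % 60, Timer % 1000
        minutes = timer_ms // 1000 // 60
        seconds = (timer_ms // 1000) % 60
        milliseconds = timer_ms % 1000
        return minutes, seconds, milliseconds

    print(split_timer(125250))  # (2, 5, 250): 125.25 s is 2 min, 5 s, 250 ms
    ```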
  • Ahh, thanks a ton guys, you're right, I didn't read the text in the wizard at all.

    R0J0hound, i will definitely give your timer idea a shot, thanks!

  • Two threads that expand on R0J0hound's code snippet and could be of further help:

  • This is actually an interesting problem because the Every event can only trigger at most once per tick (like Always). So if you're running at 10fps, the Every event will only trigger at most every 100ms. This is kind of awkward because it can make things framerate dependent again, even in well-designed, framerate-independent games.

    I thought about making 'Every X milliseconds' play catch-up - as in, in the above case at 10fps, if you have 'every 50ms', the condition will realise it should have triggered twice since the last tick, so then fires itself twice. This could create other subtle problems in events though. For example, the current time will be the same for both triggers (since it's actually running twice at the same time), and if it's important that other events run after the Every event, then that could break if the framerate drops and the Every starts re-triggering repeatedly to play catch-up. (One example is if you're firing a gun on Every, you start getting multiple bullets spawned at the same time.) But, on the other hand, logically you'd expect 'Every 10ms' to mean '100 times a second'!

    It's a tricky area... what do you think should happen? Do you think the current system (capped at once per tick) is OK?
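
    The catch-up behaviour described above can be sketched with an accumulator (hypothetical Python, not Construct internals; the names are mine): each tick banks the elapsed time, then fires the event once per whole interval that fits.

    ```python
    def catch_up_fires(interval_ms, tick_lengths_ms):
        accumulator = 0.0
        fires = 0
        for dt in tick_lengths_ms:
            accumulator += dt
            # May fire several times in one tick when the framerate drops.
            while accumulator >= interval_ms:
                accumulator -= interval_ms
                fires += 1
        return fires

    # Ten 100 ms ticks (10 fps) with 'every 50 ms': 20 fires, instead of
    # the 10 you get when the trigger is capped at once per tick.
    print(catch_up_fires(50, [100] * 10))  # 20
    ```

    This is also where the subtle problems come from: the inner loop runs back-to-back within one tick, so every catch-up fire sees the same current time.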

  • It's a tricky area... what do you think should happen? Do you think the current system (capped at once per tick) is OK?

    Yeah, for v-sync you just about have to do that.

    Would it be possible to have an expression, or perhaps a plugin, that isn't frame rate dependent?

  • It's a tricky area... what do you think should happen? Do you think the current system (capped at once per tick) is OK?

    Given that everyone is aware of the implementation, I would prefer the current system. Nearly every end-user time system uses ...minutes:seconds.somekindofframes (instead of milliseconds, e.g. video editors use the video's framerate, games most often 10ths of a second, music uses sample frames, etc.) anyway. And I can't think of a useful application for triggering events every 3 ms. The low framerate problem could be compensated for by constantly checking the framerate, thus deriving the lowest possible resolution for "every ...".

  • It's a tricky area... what do you think should happen? Do you think the current system (capped at once per tick) is OK?

    I can't really think of any good way for Construct to try and handle such a situation, other than how it does now. Since it really depends upon what one is doing at each interval, I don't think that there can be a catch-all solution.

    I would implement a variation of TimeDelta, as a function, which would report the time elapsed since the last call. Like so:

    + Function: On function "TimePeriodDelta"
    	+ System: Is global variable 'previousTime' Different to 0
    		-> System: Set global variable 'currentTime' to Timer
    		-> Function: Set return value to (global('currentTime') - global('previousTime')) / Function.Param(1)
    		-> System: Set global variable 'previousTime' to global('currentTime')
    	+ System: Else
    		-> System: Set global variable 'previousTime' to Timer
    		-> Function: Set return value to 0
    
    This uses two globals, though it would also be good enough without 'currentTime', and accessing the Timer twice instead of once. Anyway, the first time it's called, it returns zero and starts counting time, then every call after will return the difference since last call. It takes one parameter, which the difference in milliseconds is divided by. Pass it the value 1 and it will return milliseconds, or pass it the same interval as your 'every ... milliseconds' condition, and the desired result would be one.
    
    For instance, for a '5 damage per 50ms' event, I pass the function 50, using it in expression form as a multiplier for the desired damage. Much like the TimeDelta.
    
    + System: Every 50 milliseconds
    -> System: Subtract 5 * Function.TimePeriodDelta(50) from global variable 'Health'
    
    I think a built-in such as this could be useful, but it's not difficult to implement anyway.
    
    I made a simple .cap to test this, which simply appends each result into an EditBox. I did get some odd intervals with low fixed frame rates.
    
    v0.99.84: http://dl.dropbox.com/u/5868916/TimePeriodDelta.cap
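    A hypothetical Python translation of the TimePeriodDelta event sheet above (the class name and the fake clock are mine; `None` stands in for the 'previousTime' = 0 check): it returns 0 on the first call, then the elapsed time divided by the given period.

    ```python
    class TimePeriodDelta:
        def __init__(self, clock):
            self.clock = clock      # a function returning milliseconds
            self.previous = None    # plays the role of the 'previousTime' global

        def __call__(self, period_ms):
            now = self.clock()
            if self.previous is None:
                # First call: start counting and return 0, as in the event sheet.
                self.previous = now
                return 0.0
            delta = (now - self.previous) / period_ms
            self.previous = now
            return delta

    # Fake clock for demonstration; real code would use time.monotonic().
    times = iter([0, 100, 175])
    tpd = TimePeriodDelta(lambda: next(times))
    print(tpd(50))  # 0.0 (first call)
    print(tpd(50))  # 2.0 (100 ms elapsed / 50 ms period)
    print(tpd(50))  # 1.5 (75 ms elapsed / 50 ms period)
    ```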
  • If I use "Every 50 milliseconds" -> add 1 to value "counter" and the fps is low (15, for example), then it increases the value too slowly. It works if the fps is about 50.

    Why this depends on fps? Any idea how to fix this?

    I just would like to measure the time after object X has been destroyed, and when 12 seconds (i.e. 12000 milliseconds) have passed after that, something happens :S

  • I just would like to measure the time after object X has been destroyed, and when 12 seconds (i.e. 12000 milliseconds) have passed after that, something happens :S

    That's not what "every x milliseconds" is designed for. It should be used to repeat actions at regular intervals (like a stop light flashing red every 20 seconds, etc.).

    For actions like the one described you have other tools. I suggest using the system expression "Get Timer". It returns the amount of time the current layout has been running. I usually create a PV/global I call "timestamp". Just fill it with "Get Timer" whenever you want to mark a point in time. To test whether 12 seconds have passed, you compare "Get Timer" - "timestamp" >= 12000.

    If you are dealing with many objects, you should set up an index for every object and a 1-dimensional array, where the x-dimension corresponds to the index of the object. Instead of comparing just one object, you then need to compare in a loop. Or, if you just make the objects invisible and place them outside the layout instead of destroying them, you could test for the passed time via the above timestamp PV example and not destroy them until those 12 seconds have passed.

    Edit: typo, 1200 corrected to 12000
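
    A hypothetical Python sketch of that timestamp pattern (the function name and the fake clock are mine): mark the time once, then compare the elapsed time against the delay on each check.

    ```python
    def make_timestamp_check(clock, delay_ms):
        stamp = clock()  # mark the point in time (e.g. when the object is destroyed)
        def has_elapsed():
            # Equivalent to: "Get Timer" - "timestamp" >= 12000
            return clock() - stamp >= delay_ms
        return has_elapsed

    # Fake clock returning milliseconds; real code would use time.monotonic().
    values = iter([0, 5000, 11999, 12000])
    check = make_timestamp_check(lambda: next(values), 12000)
    print(check())  # False (5000 ms have passed)
    print(check())  # False (11999 ms)
    print(check())  # True  (the full 12 seconds have passed)
    ```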

  • A simple way to deal with the original problem:

    Always: Add 1000*TimeDelta to 'millisecondcounter'
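
    In Python terms, a hypothetical sketch (with TimeDelta expressed in seconds): accumulating 1000 * TimeDelta every tick keeps the millisecond counter correct at any framerate.

    ```python
    def run_counter(tick_deltas_s):
        # Each entry is one tick's TimeDelta in seconds.
        counter = 0.0
        for dt in tick_deltas_s:
            counter += 1000 * dt  # Always: Add 1000*TimeDelta to 'millisecondcounter'
        return counter

    # One simulated second at 15 fps and at 60 fps both count ~1000 ms.
    print(round(run_counter([1 / 15] * 15)))  # 1000
    print(round(run_counter([1 / 60] * 60)))  # 1000
    ```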
