
# Wait question.

• Hi, I have a misunderstanding with Wait.

If I use "Wait 1 second" but the game runs very slowly on a slow device, the wait time is not adapted to game time: it still waits 1 real second even though the game runs slowly.

Is this so? I mean, is "Wait" independent of the speed of the game?

To solve this, is the Timer behavior the appropriate tool, am I right?

https://www.scirra.com/tutorials/56/how-to-use-the-system-wait-action

https://www.scirra.com/tutorials/67/delta-time-and-framerate-independence/en

• If you want Wait to follow game speed, use dt along with it instead of a flat number.

Example: Wait 1*dt, meaning wait for a single tick.
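As a rough illustration (plain JavaScript, not Construct's event sheet; the function and loop here are ours), a fixed real-time wait spans a different number of ticks depending on the framerate, while a dt-scaled wait always spans the same number of ticks:

```javascript
// Illustrative only: a fixed real-time wait vs. a dt-scaled wait.
// `fps` stands in for the device's framerate; dt = 1/fps per tick.
function ticksUntilDone(fps, waitSeconds) {
  const dt = 1 / fps;                        // seconds of real time per tick
  return Math.ceil(waitSeconds / dt - 1e-9); // ticks needed (epsilon guards float rounding)
}

// "Wait 1" (one real second): the tick count depends on the framerate.
console.log(ticksUntilDone(60, 1)); // 60 ticks at 60 fps
console.log(ticksUntilDone(10, 1)); // 10 ticks at 10 fps

// "Wait 60*dt": the duration itself scales with dt, so it is always ~60 ticks.
console.log(ticksUntilDone(60, 60 * (1 / 60))); // 60 ticks
console.log(ticksUntilDone(10, 60 * (1 / 10))); // 60 ticks
```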

• > Example: Wait (1*dt) meaning wait for a single tick.

Ah, but measuring time in ticks is complicated, I think.

Supposedly 60 ticks make a second, but on a computer that runs the game slowly I'm not sure that works well and that the wait matches 1 second of game speed.

Have you tried it? Are you sure it works correctly and that the time adapts to the execution speed of the game?

My question was: does the Timer behavior work like this?

I mean, does the Timer behavior take the speed of the game into account?

• > Have you tried it? Sure it works correctly and the time adapts to the speed of execution of the game?

Wait 1 second is based on real time (I believe), so if your game runs at 10 fps on one device and 60 fps on another, 1 second passes in the same real time on both, while Wait 60*dt would take one second to "complete" on a device capable of 60 fps but six seconds on a device capable of only 10 fps...
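To put numbers on that, here is a plain-JavaScript sketch, assuming dt is simply 1/fps (i.e. not yet clamped by the engine's minimum framerate):

```javascript
// Real seconds each kind of wait takes at a given framerate,
// assuming an unclamped dt = 1/fps.
function waitDurations(fps) {
  return {
    flatWait: 1,      // "Wait 1 second": always one real second
    dtWait: 60 / fps, // "Wait 60*dt" = 60 ticks x (1/fps) seconds each
  };
}

console.log(waitDurations(60)); // { flatWait: 1, dtWait: 1 }
console.log(waitDurations(10)); // { flatWait: 1, dtWait: 6 }
```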

• > If I use "wait 1 second" but the game runs very slow on a slow device, the wait time will not be adapted to game time.

Yes and no. Wait is fps-independent until the game drops below the minimum fps. By default, the minimum fps is 30.

You can lower the minimum fps with the system action: System > Set minimum framerate.

But if you do so, most position-based conditions (Is overlapping / On collision) will start to fail. As explained in the manual:

Set minimum framerate

Set the maximum delta-time (dt) value based on a framerate. The default minimum framerate is 30 FPS, meaning the maximum dt is 1/30 (≈ 33 ms). If the framerate drops below 30 FPS, dt will still not exceed 1/30. This has the effect of the game going into slow motion as it drops below the minimum framerate, rather than objects stepping further every frame to keep up the same real-world speed. This helps avoid skipped collisions due to stepping a very large distance every frame.

In general, everywhere you need to fill in a value in seconds (or something-per-second), it is fps-independent by nature. The Timer behavior takes its input in seconds, so it is fps-independent, within the limits explained above.
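The clamping described in that manual excerpt can be sketched like this (plain JavaScript; the function name is ours, not a Construct internal):

```javascript
// Sketch of dt clamping at the minimum framerate (default 30 fps).
function clampedDt(actualFps, minFramerate = 30) {
  const rawDt = 1 / actualFps;              // real seconds per frame
  return Math.min(rawDt, 1 / minFramerate); // dt never exceeds 1/30 = ~33 ms
}

console.log(clampedDt(60)); // 1/60: above the minimum, dt is the real frame time
console.log(clampedDt(10)); // 1/30: below the minimum, dt is capped

// Below the minimum framerate the game runs in slow motion: at 10 fps
// each frame takes 0.1 real seconds but only advances game time by
// 1/30 s, so the game runs at roughly 1/3 speed.
const slowdown = (1 / 10) / clampedDt(10);
console.log(slowdown); // ~3
```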

• > Timer behavior needs input in seconds, so it is fps independent, to the explained limits.

This is a blow to me. I understand what you say... I thought the Timer behavior would adapt to the speed of the game; my game uses the Timer behavior throughout, and I guess it will not work well on slow devices.

So I guess the best approach is what someone said before:

Number of ticks to wait * dt

60 ticks = 1 second
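A sketch of that counter pattern, simulated in plain JavaScript outside Construct and including the engine's dt clamp described earlier in the thread, shows why it still slows down below the minimum framerate:

```javascript
// "Every tick: add dt to a counter, trigger when it reaches the target."
// dt is clamped at 1/minFramerate, mimicking the engine's behavior.
function ticksToReach(targetSeconds, fps, minFramerate = 30) {
  const dt = Math.min(1 / fps, 1 / minFramerate); // clamped dt per tick
  let counter = 0, ticks = 0;
  while (counter < targetSeconds - 1e-9) { // epsilon guards float rounding
    counter += dt;
    ticks++;
  }
  return ticks;
}

console.log(ticksToReach(1, 60)); // 60 ticks = 1 real second at 60 fps
console.log(ticksToReach(1, 10)); // 30 ticks, but 3 real seconds at 10 fps
```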

• Making something fps-independent by using dt is exactly what the behaviors (Bullet, Platform, Timer...) and also Wait and Every X seconds already do.

Same system means the same problems when crossing the limits.

(Is 10 fps not a bit too low?)

If you want to measure time accurately, use the system expression 'wallclocktime'.

But even then you are, well, a little bit screwed. Say the game runs at 10 fps; then one tick is one tenth of a second.

In other words, comparing the current wallclocktime to the previous wallclocktime can only happen once every 1/10 of a second, so the error stays bounded by one frame instead of accumulating the way dt does.

That is still better than an error that accumulates every tick once you go below 30 fps.

Sorry if this sounds confusing.

But in the end, it might be better to optimise the game so it doesn't run below 25 fps. That is the best solution.

And do you really want to support devices that are no longer even supported by their manufacturers?
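A rough simulation of that contrast (plain JavaScript; `wallNow` is our stand-in for the 'wallclocktime' expression, and the clamp value mirrors the default minimum framerate):

```javascript
// Accumulated clamped dt vs. a wall-clock measurement at low fps.
function simulate(fps, realSeconds, minFramerate = 30) {
  const frameTime = 1 / fps;                        // real seconds per tick
  const dt = Math.min(frameTime, 1 / minFramerate); // engine-clamped dt
  const ticks = Math.round(realSeconds * fps);
  let dtClock = 0;  // what "counter += dt" would measure
  let wallNow = 0;  // stand-in for the wallclocktime expression
  for (let i = 0; i < ticks; i++) {
    dtClock += dt;        // error accumulates every tick below the minimum fps
    wallNow += frameTime; // tracks real time; error bounded by one frame
  }
  return { dtClock, wallClock: wallNow };
}

console.log(simulate(10, 3)); // dtClock ~1 (slow motion), wallClock ~3
```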

• > If you want to accurately measure time, use the system expression 'wallclocktime'.

I think I understand; in many cases in my game I should have used flags instead of time measurement.

I think I have to take into account devices with Android 4.x.x; I suppose those devices might not run the game at 60 fps, and the number of devices on Android 4.x.x is still large.

• Mirlas, this is not a solution, just an illustration. The loop simulates a very, very slow device: 7 fps on my machine.