Silly question probably. But I'm having issues with adding (or subtracting) numbers in specific amounts every tick.
Example: I have a sprite with a "Size" instance variable, which then sets the sprite's scale. It starts at 0.1, which would equal 10% when setting the sprite's scale. Upon being on screen, every tick it will add 0.01 to the instance variable, up to 1, which would be 100% scale. At that point it triggers another action and stops.
My issue is that the trigger is looking for PRECISELY 1, and the system is somehow returning 0.999999997 when I watch the variable count up in debug mode. So the trigger never activates, as the number never reaches exactly 1. This doesn't really make sense to me, as the increment it's set to add isn't even close to that decimal place. I can only assume it's an issue with the Every tick event? Would I be better off using a specific delta-time action? I'm sure this is basic-level programming knowledge, but I'm willing to look dumb if it helps me eventually be smart 😅
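For what it's worth, this isn't specific to the engine - it's how binary floating point works everywhere. A quick sketch outside Construct (plain Python, just to illustrate; the variable name is mine) shows the same drift:

```python
# Simulate adding 0.01 per tick to a scale that starts at 0.1.
# Ninety additions "should" land exactly on 1.0, but neither 0.1
# nor 0.01 has an exact binary representation, so a tiny rounding
# error accumulates on every step.
scale = 0.1
for _ in range(90):
    scale += 0.01

print(scale)         # slightly off from 1.0
print(scale == 1.0)  # an equality trigger on exactly 1 never fires
```

The exact final value varies by how the sum is accumulated, but it is vanishingly unlikely to equal 1.0 exactly, which is why a "value = 1" trigger never fires.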
It seems contradictory, but computers can't represent most fractions/decimals exactly - there is always a limit to precision. You should always try to work with whole numbers, and only convert to a decimal at the last moment. So switch to counting by 1, and divide by 100 only when you need the actual scale value.
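In other words: keep an integer instance variable counting whole percent (10 up to 100), compare on that, and derive the scale only when applying it. A rough Python sketch of the same logic (the name `size_pct` is a stand-in for your instance variable, not a Construct expression):

```python
# Integer counter version: count whole percent, derive scale on demand.
size_pct = 10  # 10% start

while size_pct < 100:
    size_pct += 1        # exact: integer addition never drifts

scale = size_pct / 100   # convert only when you need the real scale
print(size_pct == 100)   # True - equality on integers is safe
print(scale)             # 1.0 - 100/100 is exact
```

Integer equality is always exact, so the "= 100" trigger fires reliably, and 100/100 happens to be exactly representable as 1.0 anyway.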
Thanks for the advice. I was afraid it would wind up being a "symptom of the system" type of thing, but I guess it can't be helped. At least knowing the issue lets me plan around it, like you said. Cheers 🙂
The Tween behavior might be suitable here.
Otherwise, before Tween was a thing, my standard practice for situations like this was to use an event to catch the overflow: when value > target, set value to target. The min() expression can do something similar if you want to compact your events.
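Both tricks boil down to clamping and then triggering on "reached or passed" instead of "exactly equal". A sketch of that pattern in Python (function and variable names are mine, not Construct expressions):

```python
def step_scale(scale, increment=0.01, target=1.0):
    """Advance the scale one tick, clamping at the target.

    Clamping with min() means you never rely on the float landing
    exactly on the target - the overflow step snaps it there.
    """
    scale = min(scale + increment, target)  # catch the overflow
    done = scale >= target                  # robust trigger condition
    return scale, done

scale, done = 0.1, False
while not done:
    scale, done = step_scale(scale)

print(scale)  # exactly 1.0, because the clamp snapped it to target
print(done)   # True
```

The key point is the `>=` comparison: even if the accumulated value skips over the target, the trigger still fires, and the clamp guarantees the final value is exactly the target.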