Hi there, everyone.
I am programming a dice-roll function that does exploding dice on 10s and critical failures on 1s.
I am currently using "int(random(1,11))"; since random() returns a value strictly less than the second argument, this should roll between 1 and 10.
However, I suspect that these results are skewed against 1s and 10s because of int(). If int() rounds to the nearest integer, then when I convert my random value, a 2 would trigger on anything between 1.5 and 2.49, right? That would also mean a 1 can only trigger from 1 to 1.49 and a 10 only from 9.5 to 10, cutting the probability of those two results in half.
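One quick way to test the rounding assumption directly (a sketch in Python, on the assumption that the int() in your environment behaves like Python's built-in int()):

```python
# Check what int() actually does to fractional values.
# Python's built-in int() truncates toward zero rather than
# rounding to the nearest integer, so the .5 boundary never rounds up.
for x in [1.0, 1.49, 1.5, 1.99, 2.49, 2.5, 9.5, 10.99]:
    print(f"int({x}) = {int(x)}")  # e.g. int(1.5) = 1, int(2.5) = 2
```

If the conversion in your environment behaves the same way, the boundaries in the 1.5-to-2.49 analysis would shift accordingly.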
Doesn't that mean I should use "int(random(0.5,11.5))" in order to let 1s roll on 0.5 to 1.49 and 10s roll from 9.5 to 10.49?
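In case it helps anyone checking this, here is a quick empirical tally you could adapt (a sketch in Python, using random.uniform as a stand-in for the random(1,11) described above) to see whether 1s and 10s really come up half as often as the other faces:

```python
import random
from collections import Counter

random.seed(42)  # fixed seed so the tally is reproducible

N = 100_000
# random.uniform(1, 11) stands in for random(1, 11): a float in [1, 11),
# which int() converts to a whole number from 1 to 10.
counts = Counter(int(random.uniform(1, 11)) for _ in range(N))

# Print the observed frequency of each face; a fair d10 should show
# roughly 0.100 for every face, including 1 and 10.
for face in range(1, 11):
    print(f"{face:>2}: {counts[face] / N:.3f}")
```

Running something like this for both int(random(1,11)) and the proposed int(random(0.5,11.5)) should settle the question empirically.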