Wednesday, 21 August 2013

In C#, can multiplications inside of an array declaration/initialization cause mayhem?


I am going a bit crazy over the following section of code:
float f = 1.25f;
FLIPPER_CENTERS = new float[,] {
    { (20*f), (27*f) }, { FLIPPER_WIDTH - (20*f), (27*f) },
    { (6*f), (25*f) }, { MH_FLIPPER_WIDTH - (6*f), (25*f) },
    { (8), (15) },     { (SMALL_FLIPPER_WIDTH - 8), (15) },
    { (8), (20) },     { (67 - 8), (20) },
};
If I print the values of the first element of that array, I get [0, 0].
The last two elements are [59, 20], as expected.
The first element is supposed to be [25, 33.75], which I do get if I
substitute (20*1.25f) for (20*f):
    { (20*1.25f), (27*1.25f) }, { FLIPPER_WIDTH - (20*f), (27*f) },
So here is the problem: if I leave the multiplication by f in the array
initializer, the values come out as 0; but if I replace f with the literal
1.25f, everything is fine.
I have tried to figure out what is going on, but to no avail. I am certain
that the value of f is 1.25f and not 0. Can anyone shed some light on this
for me, please?
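For reference, here is a minimal, self-contained sketch that reproduces the same symptom. It assumes f and FLIPPER_CENTERS are static fields of the same class, which the snippet above does not show, so this may or may not match the actual code. In C#, static field initializers run in textual declaration order, so a field declared before f sees f's default value of 0.0f:

    using System;

    class Demo
    {
        // Static field initializers execute in declaration order.
        // Centers is declared BEFORE f, so at the moment the array is
        // built, f still holds its default value 0.0f and every
        // product (20 * f) evaluates to 0.
        static readonly float[,] Centers = new float[,] { { 20 * f, 27 * f } };
        static readonly float f = 1.25f;

        static void Main()
        {
            Console.WriteLine(Centers[0, 0]); // 0, not 25
            Console.WriteLine(f);             // 1.25
        }
    }

If this is what is happening, declaring f above the array (or making it a const) would make the multiplications use 1.25f as intended.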
