09 Aug, 2011, Rarva.Riendf wrote in the 21st comment:
Votes: 0
Nice solution, but I would not apply it in this case, mostly for maintenance reasons (and personal preference): I prefer to see MAX_GROUP where it is used, i.e. in the array declaration, especially since those tables do not evolve that frequently. But if I had code that took a long time to compile, or were working with multiple coders who frequently modified the table, I would definitely use it.
And since we are on the subject of array declaration problems, how do you handle multidimensional arrays? I think you are forced to declare a size for the third column as soon as you have a 3-dimensional array. (Or at least the compiler would not let me get away without it.)
10 Aug, 2011, David Haley wrote in the 22nd comment:
And since we are on the subject of array declaration problems, how do you handle multidimensional arrays? I think you are forced to declare a size for the third column as soon as you have a 3-dimensional array. (Or at least the compiler would not let me get away without it.)
There's no way for the compiler to calculate the size of the 2nd and 3rd dimensions, so you can only leave off the first dimension. This is legal:

int foo[][3][4] = {1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24};
int bar[][3] = {1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24};

because the compiler can calculate that the first dimension of foo is 2 (24 initializers divided by 3*4) and of bar is 8 (24 divided by 3).
Now one would think a C compiler could calculate all the dimensions if you include fully braced scalar initializers… int foo[][][] = {{{1,2,3,4},{5,6,7,8},{9,10,11,12}},{{13,14,15,16},{17,18,19,20},{21,22,23,24}}}; …but gcc and VC++ do not, maybe because they don't force the use of scalar initializer notation, they only validate it.