Unlike C++, C has a (sort of) binary model: you can compile with different compilers and, with a bit of luck, things will link together.
The big exception is enums. The standard leaves their underlying type implementation-defined (any integer type that can represent all the enumerators), so there is no binary model for them.
But what do compilers do in practice? Without magic annotations (think GCC's -fshort-enums or a packed attribute), most of them seem to use plain int. Is that correct?
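The easiest way to find out is to ask your own toolchain. A minimal check like the one below (the enum itself is just an illustration) usually prints 4 on mainstream x86-64 compilers, but the standard makes no such promise, and GCC's -fshort-enums will shrink it:

    #include <stdio.h>

    enum color { RED, GREEN, BLUE };

    int main(void)
    {
        /* Implementation-defined: typically 4 (int) on mainstream
           compilers, but 1 with e.g. gcc -fshort-enums. */
        printf("sizeof(enum color) = %zu\n", sizeof(enum color));
        return 0;
    }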
Our approach is to never use enums and to use only #defines, which feels much more like K&R, free from these newfangled typing constructs.
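In a public header that looks something like the following sketch (the names are made up): the named constants become plain macros, and any field that stores one gets an explicitly sized integer type, so the layout is pinned regardless of compiler.

    #include <stdint.h>

    /* Instead of: enum color { RED, GREEN, BLUE }; */
    #define COLOR_RED   0
    #define COLOR_GREEN 1
    #define COLOR_BLUE  2

    struct pixel {
        int32_t color;   /* explicitly sized, not "enum color" */
    };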
(Structs are another issue, but seem to align OK in practice.)
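If you do depend on struct layout matching across compilers, one way to keep yourself honest is a compile-time check, for instance with C11's _Static_assert. A sketch, where the struct and the expected numbers are purely illustrative:

    #include <stddef.h>
    #include <stdint.h>

    struct pixel {
        int32_t color;
        int32_t x, y;
    };

    /* Fail the build if another compiler lays this out differently. */
    _Static_assert(sizeof(struct pixel) == 12, "unexpected struct size");
    _Static_assert(offsetof(struct pixel, y) == 8, "unexpected offset of y");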