- Sun Aug 23, 2015 10:05 am
eriksl wrote:Way too much #defines, totally unclear what it does.
Check in my code in the universal i/o bridge.
I fail to see why the defines would be the cause of the problem when I don't get any warnings during compilation... and when the same code works in another project.
You won't get warnings because:
- the Espressif SDK prototypes are either missing or incomplete, so the compiler doesn't even know what to expect; you can throw in almost anything you want and it won't complain, but that doesn't mean it will work;
- if you didn't enable some extra (useful) warnings, you won't get many warnings anyway (start with -Wall and -Wextra)
- #defines are processed before the compiler sees the code, so anything you put in a #define can't be reasonably checked by the compiler anyway.
Rules of thumb regarding #define
- don't use them, use enum or static const int instead (that's what they're meant for, and unlike #defines they're visible to the compiler, which can then type-check them)
- it's counterproductive and confusing to #define the obvious, like FALSE and TRUE; "false" must be 0 and "true" must be non-zero, that's a requirement of the C language, you can rely on it. Equally a null pointer is always 0, so either compare to (your data type *)0 or to 0, but NULL is for Pascal quiche-eaters.
- don't use #defines to size an array or other data type; declare the array with its size directly and use sizeof(a) wherever you need the size. That's much clearer than #define A_LENGTH 100, a[A_LENGTH], for(...; i < A_LENGTH; ...) and less prone to errors. Note that sizeof gives the size in bytes; to get the element count, divide by the size of a single element, e.g. sizeof(a) / sizeof(a[0]).
Bottom line: just don't use #define. And if you really must, use a proper, understandable identifier, which Espressif in particular often fails to achieve (e.g.: ICACHE_FLASH_ATTR).