 

Why does #define not require a semicolon?


I was writing some test code in C. By mistake I had inserted a ; after a #define, which gave me errors. Why is a semicolon not required for #defines?

More specifically:

Method 1: works

const int MAX_STRING = 256;

int main(void) {
    char buffer[MAX_STRING];
}

Method 2: Does not work - compilation error.

#define MAX_STRING 256;

int main(void) {
    char buffer[MAX_STRING];
}

What is the reason for the different behavior of these two snippets? Aren't both MAX_STRINGs constants?

Asked May 24 '12 by Shash

1 Answer

#define MAX_STRING 256;

means:

whenever the preprocessor finds MAX_STRING, replace it with 256; (semicolon included). In your case it turns Method 2 into:

#include <stdio.h>
#include <stdlib.h>
#define MAX_STRING 256;

int main(void) {
    char buffer[256;];
}

which isn't valid syntax. Replace

#define MAX_STRING 256;

with

#define MAX_STRING 256

The difference between your two snippets is that the first method declares a constant int object whose value is 256, while the second defines MAX_STRING as a macro that is textually replaced by 256; throughout your source file.
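To make that contrast concrete, here is a minimal sketch (the names MAX_STRING_CONST and MAX_STRING_MACRO are just illustrative, not from your code): the const is a real object with a type and an address, while the macro is pure text.

#include <stdio.h>

const int MAX_STRING_CONST = 256;   /* a real object: it has a type and an address */
#define MAX_STRING_MACRO 256        /* pure text: replaced before compilation */

int main(void) {
    printf("%p\n", (void *)&MAX_STRING_CONST);  /* fine: the const exists at run time */
    printf("%d\n", MAX_STRING_MACRO);           /* the compiler only ever sees 256 here */
    /* &MAX_STRING_MACRO would not compile: after expansion it reads &256 */
    return 0;
}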

The #define directive is used to define values or macros that are used by the preprocessor to manipulate the program source code before it is compiled. Because preprocessor definitions are substituted before the compiler acts on the source code, any errors that are introduced by #define are difficult to trace.

The syntax is:

#define CONST_NAME VALUE

If there is a ; at the end, it is considered part of VALUE.
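That is exactly why such errors can be hard to trace: the stray semicolon may go unnoticed in some uses and only blow up in others. A small sketch (the name LIMIT is made up for this example):

#include <stdio.h>

#define LIMIT 100;   /* the stray ; becomes part of the replacement text */

int main(void) {
    int total = LIMIT;   /* expands to: int total = 100;;  -- still compiles (extra empty statement) */
    /* if (total > LIMIT) ... would expand to: if (total > 100;) ...
       a syntax error reported here, far from the #define that caused it */
    printf("%d\n", total);
    return 0;
}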

To understand how exactly #defines work, try defining:

#define FOREVER for(;;)
...
    FOREVER {
         /* perform something forever */
    }
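A runnable version of that sketch (with a break added so it actually terminates) might look like:

#include <stdio.h>

#define FOREVER for(;;)

int main(void) {
    int count = 0;
    FOREVER {                /* expands to: for(;;) { */
        printf("tick %d\n", count);
        if (++count == 3)
            break;           /* without this break the loop really would run forever */
    }
    return 0;
}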

Interesting remark by John Hascall:

Most compilers will give you a way to see the output after the preprocessor phase; this can aid with debugging issues like this.

In gcc this can be done with the -E flag.
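For example, assuming your Method 2 (without the includes) is saved as main.c, running gcc -E main.c would print roughly the following, ignoring the # line markers gcc adds, so the bad expansion becomes visible:

int main(void) {
    char buffer[256;];
}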

Answered Oct 08 '22 by xenteros