I've written a library to match strings against a set of patterns and I can now easily embed lexical scanners into C programs.
I know there are many well-established tools available to create lexical scanners (lex and re2c, to name just the first two that come to mind), but this question is not about lexers; it's about the best approach to "extending" C syntax. The lexer example is just a concrete case of a general problem.
I can see two possible solutions: a preprocessor that translates the extended syntax into plain C before compilation, or a set of C macros that expand into the needed code.
I've already done both, but the question is: which one would you consider the better practice? In other words, if you had to maintain or write a piece of software using one of the two approaches, which one would disappoint you less?
As an example, here is a lexer, written in both styles, for the following problem: sum all the numbers in a buffer, skipping quoted strings and parenthesized lists, and stop at the word "end".
/**** SCANNER STYLE 1 (preprocessor) ****/
#include "pmx.h"
char *t = buffer;
while (*t) {
switch pmx(t) { /* the preprocessor will handle this */
case "&q" : /* skip strings */
break;
case "&f<?=eE>&F" : /* sum numbers */
sum += atof(pmx(Start,0));
break;
case "&b()": /* skip lists */
break;
case "&iend" : /* stop processing */
t = "";
break;
case "<.>": /* skip a char and proceed */
break;
}
}
/**** SCANNER STYLE 2 (macros) ****/
#include "pmx.h"
/* There can be up to 128 tokens per scanner with id x80 to xFF */
#define TOK_STRING x81
#define TOK_NUMBER x82
#define TOK_LIST x83
#define TOK_END x84
#define TOK_CHAR x85
pmxScanner( /* pmxScanner() is a pretty complex macro */
buffer
,
pmxTokSet("&q" , TOK_STRING)
pmxTokSet("&f<?=eE>&F" , TOK_NUMBER)
pmxTokSet("&b()" , TOK_LIST)
pmxTokSet("&iend" , TOK_END)
pmxTokSet("<.>" , TOK_CHAR)
,
pmxTokCase(TOK_STRING) : /* skip strings */
continue;
pmxTokCase(TOK_NUMBER) : /* sum numbers */
sum += atof(pmxTokStart(0));
continue;
pmxTokCase(TOK_LIST): /* skip lists */
continue;
pmxTokCase(TOK_END) : /* stop processing */
break;
pmxTokCase(TOK_CHAR) : /* skip a char and proceed */
continue;
);
Should anyone be interested in the current implementation, the code is here: http://sites.google.com/site/clibutl .
A preprocessor will offer a more robust and generic solution. Macros, on the other hand, are quick to whip up, provide a good proof of concept, and are easy to use when the sample keyword/token space is small. Scaling up or adding new features may become tedious with macros after a point. I'd say whip up macros to get started and then convert them to your preprocessor commands.
Also, try to use a generic preprocessor rather than writing your own, if possible.
[...] I would have other dependencies to handle (m4 for Windows, for example).
Yes. But so would any solution you write :) and you'd have to maintain it, too. Most of the programs you've named have a Windows port available (e.g. see m4 for Windows). The advantage of using such a solution is that you save a lot of time. The downside, of course, is that you may have to get up to speed with the source code if and when the odd bug turns up (though the folks maintaining these tools are very helpful and will certainly make sure you get all the help you need).
And again, yes, I'd prefer a packaged solution to rolling my own.