Do such parsing tools exist for R? It does not have to be a lex/yacc-compatible tool in my case.
(I'm an R newbie)
EDIT: I'm interested in implementing another language using R.
I developed a clone of Python's PLY named rly. You can find it on CRAN:
install.packages("rly")
Example of usage (a small interactive calculator) below:
library(rly)
library(R6)  # for R6Class, used to define the lexer and parser classes below

TOKENS = c('NAME', 'NUMBER')
LITERALS = c('=', '+', '-', '*', '/', '(', ')')

Lexer <- R6Class("Lexer",
  public = list(
    tokens = TOKENS,
    literals = LITERALS,
    t_NAME = '[a-zA-Z_][a-zA-Z0-9_]*',
    t_NUMBER = function(re='\\d+', t) {
      t$value <- strtoi(t$value)
      return(t)
    },
    t_ignore = " \t",
    t_newline = function(re='\\n+', t) {
      t$lexer$lineno <- t$lexer$lineno + nchar(t$value)
      return(NULL)
    },
    t_error = function(t) {
      cat(sprintf("Illegal character '%s'", t$value[1]))
      t$lexer$skip(1)
      return(t)
    }
  )
)
Parser <- R6Class("Parser",
  public = list(
    tokens = TOKENS,
    literals = LITERALS,
    # Parsing rules
    precedence = list(c('left', '+', '-'),
                      c('left', '*', '/'),
                      c('right', 'UMINUS')),
    # dictionary of names
    names = new.env(hash=TRUE),
    p_statement_assign = function(doc='statement : NAME "=" expression', p) {
      self$names[[as.character(p$get(2))]] <- p$get(4)
    },
    p_statement_expr = function(doc='statement : expression', p) {
      cat(p$get(2))
      cat('\n')
    },
    p_expression_binop = function(doc="expression : expression '+' expression
                                               | expression '-' expression
                                               | expression '*' expression
                                               | expression '/' expression", p) {
      if(p$get(3) == '+') p$set(1, p$get(2) + p$get(4))
      else if(p$get(3) == '-') p$set(1, p$get(2) - p$get(4))
      else if(p$get(3) == '*') p$set(1, p$get(2) * p$get(4))
      else if(p$get(3) == '/') p$set(1, p$get(2) / p$get(4))
    },
    p_expression_uminus = function(doc="expression : '-' expression %prec UMINUS", p) {
      p$set(1, -p$get(3))
    },
    p_expression_group = function(doc="expression : '(' expression ')'", p) {
      p$set(1, p$get(3))
    },
    p_expression_number = function(doc='expression : NUMBER', p) {
      p$set(1, p$get(2))
    },
    p_expression_name = function(doc='expression : NAME', p) {
      p$set(1, self$names[[as.character(p$get(2))]])
    },
    p_error = function(p) {
      if(is.null(p)) cat("Syntax error at EOF")
      else cat(sprintf("Syntax error at '%s'", p$value))
    }
  )
)
lexer <- rly::lex(Lexer)
parser <- rly::yacc(Parser)
while(TRUE) {
  cat('calc > ')
  s <- readLines(file("stdin"), n=1)
  if(length(s) == 0 || s == 'exit') break   # stop on EOF or 'exit'
  parser$parse(s, lexer)
}
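For reference, an interactive session with the calculator above would look roughly like this (assuming the script is run with Rscript so that reading from stdin works as shown; the results follow directly from the grammar actions):

calc > x = 4
calc > x * (2 + 3)
20
calc > exit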
AFAIK, there is no parser generator for R.
However, user-created packages in R (a.k.a. "extensions") can be written in Java, C or Fortran (and R, of course). So you could use Lex/Yacc or Flex/Bison (in the case of C), or JavaCC or ANTLR (in the case of Java), to generate a lexer and parser for your language, and then call them from your R code.
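To make the idea concrete, here is a minimal sketch of calling a compiled parser from R. It uses Rcpp (C++) rather than plain C because Rcpp::cppFunction lets everything live in one script; the tiny hand-written parse_sum function is a hypothetical stand-in for whatever Flex/Bison or ANTLR would actually generate, and it only handles sums of non-negative integers:

library(Rcpp)

cppFunction("
  // Toy parser/evaluator: sums of non-negative integers, e.g. '1 + 2 + 30'.
  // In a real project this function would be generated by Flex/Bison or ANTLR.
  int parse_sum(std::string s) {
    int total = 0, current = 0;
    bool have_digit = false;
    for (int i = 0; i < (int)s.size(); ++i) {
      char c = s[i];
      if (c >= '0' && c <= '9') {            // accumulate digits of a number
        current = current * 10 + (c - '0');
        have_digit = true;
      } else if (c == '+' && have_digit) {   // finish the current term
        total += current;
        current = 0;
        have_digit = false;
      } else if (c != ' ') {
        Rcpp::stop(\"unexpected character in input\");
      }
    }
    if (!have_digit) Rcpp::stop(\"expression must end with a number\");
    return total + current;
  }
")

parse_sum("1 + 2 + 30")   # should return 33

For a real language you would ship the generated C sources in the src/ directory of a package and expose entry points via .Call, but the calling pattern from R is the same.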