
Security of Python's eval() on untrusted strings?

eval() will allow malicious data to compromise your entire system, kill your cat, eat your dog and make love to your wife.

There was recently a thread about how to do this kind of thing safely on the python-dev list, and the conclusions were:

  • It's really hard to do this properly.
  • It requires patches to the python interpreter to block many classes of attacks.
  • Don't do it unless you really need to.

Start here to read about the challenge: http://tav.espians.com/a-challenge-to-break-python-security.html

What situation do you want to use eval() in? Are you wanting a user to be able to execute arbitrary expressions? Or are you wanting to transfer data in some way? Perhaps it's possible to lock down the input in some way.


You cannot secure eval with a blacklist approach like this. See Eval really is dangerous for examples of input that will segfault the CPython interpreter, give access to any class you like, and so on.
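To make the "access to any class you like" point concrete, here is a minimal sketch: even with `__builtins__` stripped from the globals passed to `eval()`, an expression can walk from an ordinary literal up through the object hierarchy and enumerate every class loaded in the interpreter, which is why blacklisting names cannot work.

```python
# No builtin names are used here: a bare tuple literal leads to its
# class, up to `object`, and down to every subclass in the process.
expr = "().__class__.__bases__[0].__subclasses__()"

# Passing an empty __builtins__ does not stop this route at all.
classes = eval(expr, {"__builtins__": {}})
print(len(classes) > 0)  # a long list of classes, despite the "sandbox"
```

From that list an attacker can typically reach file objects, code loaders, and worse, so filtering the input string is not a defense.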


You can get to os using builtin functions: __import__('os').
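A quick sketch of what this means in practice: by default `eval()` sees the real builtins, so `__import__` is right there. Emptying `__builtins__` blocks this one route, but, as the other answers show, not the attack surface as a whole.

```python
# The default eval() namespace includes the real builtins,
# so any module is one call away:
mod = eval("__import__('os')")
print(mod.__name__)  # os

# Stripping builtins stops this *particular* trick...
try:
    eval("__import__('os')", {"__builtins__": {}})
except NameError:
    print("NameError: __import__ is gone")
# ...but it does NOT make eval() safe; see the subclasses trick above.
```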

For Python 2.6+, the ast module may help; in particular ast.literal_eval, although it depends on exactly what you want to eval.
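For example, `ast.literal_eval` accepts only Python literals (strings, numbers, tuples, lists, dicts, sets, booleans, `None`), so it covers the "transfer data" case safely while rejecting anything executable:

```python
import ast

# Plain literals are parsed without ever executing code:
data = ast.literal_eval("{'a': [1, 2.5], 'b': (True, None)}")
print(data)  # {'a': [1, 2.5], 'b': (True, None)}

# Names, calls, and operators on non-literals are refused:
try:
    ast.literal_eval("__import__('os').system('ls')")
except ValueError:
    print("rejected: not a literal")
```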


Note that even if you pass empty dictionaries to eval(), it's still possible to segfault (C)Python with some syntax tricks. For example, try this on your interpreter: eval("()"*8**5)


You are probably better off turning the question around:

  1. What sort of expressions are you wanting to eval?
  2. Can you ensure that only strings matching some narrowly defined syntax are eval()'d?
  3. Then consider if that is safe.

For example, if you are wanting to let the user enter an algebraic expression for evaluation, consider limiting them to one letter variable names, numbers, and a specific set of operators and functions. Don't eval() strings containing anything else.
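The whitelist idea above can be sketched roughly like this (the grammar and function names here are illustrative, not a vetted sandbox): validate the string against a narrow token pattern first, and only then evaluate it in a namespace that contains nothing but the approved functions and the caller's variables.

```python
import math
import re

# Accept only: whitespace, numbers, single-letter variables,
# + - * / ( ), and a fixed set of function names. Everything else
# (attribute access, quotes, underscores, long names) is refused
# before eval() ever sees the string.
TOKEN = re.compile(r"^(\s|\d+(\.\d+)?|[a-z]\b|[+\-*/()]|sin|cos|sqrt)+$")

def safe_math_eval(expr, variables):
    if not TOKEN.match(expr):
        raise ValueError("expression contains disallowed tokens")
    namespace = {"__builtins__": {},
                 "sin": math.sin, "cos": math.cos, "sqrt": math.sqrt}
    namespace.update(variables)
    return eval(expr, namespace)

print(safe_math_eval("2 * x + sqrt(9)", {"x": 5}))  # 13.0
```

Even so, treat this as a starting point rather than a guarantee: the safety rests entirely on how tight the grammar is, which is exactly the "then consider if that is safe" step above.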