
Python - Zelle book uses eval(), is it wrong?

Tags:

python

eval

PLEASE NOTE: This is NOT about the use of eval() itself; it is about the potential quality (or lack thereof) of a book in which it is used and taught. SO already has countless threads about eval() in Python.

At the risk of inviting the wrath and downvotes of SO, I have nonetheless decided to ask this question, just in case. Please bear with me. I've tried Google and SO itself for this specific question (as you will see) and got nothing. I might be blind, though.

This question is about the use of the notorious eval() function.

There is a relatively well-known (and well-reviewed, as you can see) book by John Zelle: http://www.amazon.com/Python-Programming-Introduction-Computer-Science/dp/1590282418/ref=pd_sim_b_3

Technically, it is a CS1 book that uses Python as the programming language. Fair enough; that takes some responsibility off the author's shoulders ("Hey, I'm trying to teach you something broad here, not all these syntax and security details"), but when I started reading it I noticed, in literally the very first example, the use of

x = eval(input("Enter your number: "))

where x should be an int and thus we need to convert user input into an int.

I'm using Python 2.7.4 and the book is about Python 3, so I ran into quite a few problems with print(), input(), and eval() right from the beginning and had to do some research to get the examples working. In the course of that research, I read countless opinions about eval() in Python (mostly here on SO), which boil down to it being almost always bad: a security risk, unnecessary technical overhead, and so on. Those users' questions were a lot more elaborate (one was about using eval() in a wxPython project), so I can't vouch for total similarity between my case and theirs, but still...
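For context, here is a minimal sketch of the version difference I kept tripping over. The typed text is simulated as a string (rather than a real input() call) so the behaviors can be compared side by side; the variable names are my own, not the book's:

```python
# Simulate what a user might type at the prompt:
typed = "2 + 3"

# Python 3's input() would hand this back as a raw string:
as_string = typed          # still the text "2 + 3"

# Wrapping it in eval(), as the book's examples do, evaluates it
# (this is also roughly what Python 2's bare input() did):
as_value = eval(typed)     # the integer 5

# The conventional, safe conversion for a single integer:
x = int("42")              # the integer 42
```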

So, I admit, I'm not too far into the book, but I've reached the point where, a bit later on, the author explains the use of eval() with no reference whatsoever to its controversial nature. He basically says what I just said: we need x to eventually be an int, so here's a handy way to do that. And he seems to use it throughout the rest of the book.

My question is this: if, right from the beginning, an author makes such a mistake (or is it NOT a mistake? I might be missing something here), is the book worth learning from? I believe Mr. Zelle is a great teacher of CS, and it shows, but whether he wants it or not, people will still learn Python from his book as well, in addition to algorithms and the art of programming. So is it worth learning Python from a book that stays silent on such a seemingly universal issue in the Python community? I don't expect Mr. Zelle to be a Python hacker who uncovers all of its secrets, but little details like these can make or break someone who is self-teaching. What would your advice be with regard to this learning material?

P.S. On the other hand, making me do quite a bit of research and experimentation (unwittingly) right from the start is pretty cool :-)

Thank you!

asked Apr 14 '13 by Leo


5 Answers

As the author of the book in question, let me weigh in on this issue.

The use of eval in the book is largely a historical artifact of the conversion from Python 2 to Python 3 (although the same "flaw" exists in the use of input in Python 2). I am well aware of the dangers of using eval in production code where input may come from an untrusted source, but the book is not about production code for a web-based system; it's about learning some CS and programming principles. There's really nothing in the book that could be remotely considered production code. My goal is always to use the simplest approach that allows me to illustrate the point I am trying to make, and eval helps to do that.

I do not agree with the crowd proclaiming eval evil in all contexts. It's very handy for simple programs and scripts that are only being run by their writer. In that context, it's perfectly safe. It allows for simple multiple inputs and expressions as input. Pedagogically, it emphasizes the concept of expression evaluation. Eval exposes all the power (and danger) of an interpreted language. I use eval all the time in my own personal programs (and not just in Python). In hindsight, I absolutely agree that I should have included some discussion of the potential risks of eval; this is something I always do in my classes, anyway.

The bottom line is that there are numerous ways this book could be improved (there always are). I don't think using eval is a fatal flaw; it is appropriate for the types of programs being illustrated and the context in which those programs appear. I am not aware of any other "insecurities" in the way Python is used in the book, but you should be warned (as the Preface explains) that there are numerous places where the code is not exactly "Pythonic."

answered Oct 12 '22 by John Zelle


Since eval is so out of place and unnecessary in the example you give, I would certainly have doubts about the safety of other parts of the book. Is the author going to suggest appending a user-entered string to a SQL query?

I think it could be worth finding the author's email address and asking him about it directly.

answered Oct 12 '22 by Joshua D. Boyd


Yes, it's wrong. But I think I know why it's in there.

Lots of people use input() in Python 2.x, which is a very unfortunately named function, since it doesn't just read input; it also evaluates it. The 2to3 converter turns each use of input() into eval(input()), as you can see:

$ cat test.py
x = input("Enter your number: ")

$ 2to3 test.py
RefactoringTool: Skipping implicit fixer: buffer
RefactoringTool: Skipping implicit fixer: idioms
RefactoringTool: Skipping implicit fixer: set_literal
RefactoringTool: Skipping implicit fixer: ws_comma
RefactoringTool: Refactored test.py
--- test.py     (original)
+++ test.py     (refactored)
@@ -1 +1 @@
-x = input("Enter your number: ")
+x = eval(input("Enter your number: "))
RefactoringTool: Files that need to be modified:
RefactoringTool: test.py

So my guess is that it is just a little sloppy. From the Amazon description:

This is the second edition of John Zelle's Python Programming, updated for Python 3.

I think someone ran 2to3 on all of the code samples without checking the output thoroughly enough. So yes, it was a mistake to use input() in Python 2.x, and it was a mistake to use 2to3 without checking the output.

answered Oct 12 '22 by Dietrich Epp


Well, combining eval() and input() in this way creates a rudimentary but potentially very harmful 'shell'. I haven't read the book, but I'd take it with a grain of salt. It's not just bad practice; it hands the user a deadly combination of functions.

answered Oct 12 '22 by Chris Richter


Yes, eval should be spelled "evil" instead, to warn people about this ;) You should never use it unless you absolutely have to. In this case the intuitive choice is int(), which is also much more readable. And if you really had to evaluate structured input, you could use ast.literal_eval (as the name implies, it only evaluates literals, so it won't allow the user to run malicious code), which is actually safe. But there is no need for it, and no need for eval(), in this case.
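A quick sketch of both alternatives mentioned above (the strings here are stand-ins for user input):

```python
import ast

# int() covers the book's actual use case: a single number.
x = int("42")

# ast.literal_eval() accepts Python literals such as tuples...
pair = ast.literal_eval("(1, 2)")

# ...but, unlike eval(), it rejects anything executable:
try:
    ast.literal_eval("__import__('os').system('echo pwned')")
    blocked = False
except ValueError:
    blocked = True  # call expressions are not literals
```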

answered Oct 12 '22 by jamylak