I am aware that the use of eval() usually means bad code, but I stumbled upon a weird behavior of the eval() function in nested functions that I could not understand. If we write:
def f(a):
    def g():
        print(eval('a'))
    return g()
then running f(1) yields a NameError, claiming that a is not defined. However, if we define
def f(a):
    def g():
        b = a + 1
        print(eval('a'))
    return g()
then running f(1) prints 1.
There is something happening with local and global variables that I can't quite understand. Is a only a local variable in g() when it is "used" for something? What is going on here?
In short, since eval is for dynamic evaluation, the interpreter has no way to know, when it compiles g, that it should add a to g's local scope. For efficiency, it does not add unneeded variables to the dict of local variables.
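One way to see this decision is through CPython's code-object introspection; co_freevars lists the enclosing-scope names a nested function captures. A minimal sketch (assuming CPython):

def f(a):
    def g():
        print(eval('a'))   # never mentions a directly
    def h():
        b = a + 1          # mentions a, so the compiler captures it
        print(eval('a'))
    # co_freevars shows which enclosing names each closure captures
    print(g.__code__.co_freevars)  # ()     -- 'a' was not captured
    print(h.__code__.co_freevars)  # ('a',) -- 'a' is a free variable

f(1)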
From the docs for eval:
The expression argument is parsed and evaluated as a Python expression (technically speaking, a condition list) using the globals and locals dictionaries as global and local namespace.
This means that a call eval(expression) will use globals() as its default global scope and locals() as its local scope if none are provided.
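Both namespaces can also be supplied explicitly, in which case eval looks names up in the dicts you pass. A minimal sketch:

# eval(expression, globals, locals): lookup uses the dicts you pass in
print(eval('x + y', {'x': 1}, {'y': 2}))  # prints 3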
In your first example, however, a is in neither:
def f(a):
    print("f's locals:", locals())
    def g():
        print("g's locals:", locals())
        print(eval('a'))
    return g()

f(1)
Indeed, since the interpreter sees no reference to a when parsing the body of g, it does not add it to g's local variables. For it to work, you would need to declare nonlocal a in g (see the sketch after the traceback below).
f's locals: {'a': 1}
g's locals: {}
Traceback ...
...
NameError: name 'a' is not defined
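For instance, a sketch of that fix: the nonlocal declaration forces a into g's scope as a free variable, so the default locals() that eval falls back on contains it.

def f(a):
    def g():
        nonlocal a        # makes 'a' a free variable of g
        print(eval('a'))  # locals() now contains 'a'
    return g()

f(1)  # prints 1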
In your second example, a is in g's local variables, as it is used in that scope:
def f(a):
    print("f's locals:", locals())
    def g():
        print("g's locals:", locals())
        b = a + 1
        print("g's locals after b = a + 1:", locals())
        print(eval('a'))
    return g()

f(1)
f's locals: {'a': 1}
g's locals: {'a': 1}
g's locals after b = a + 1: {'a': 1, 'b': 2}
1
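This also answers the question above: any mention of a in the body of g, even a bare reference, is enough for the compiler to capture it. A minimal sketch:

def f(a):
    def g():
        a                 # a bare reference suffices; the compiler captures 'a'
        print(eval('a'))
    return g()

f(1)  # prints 1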
It looks like eval() can only look up variables in the local scope (here, g) or the global scope, but not in its parent environment (here, f). A workaround is to make the variable global.
def f(a):
    global b  # note: cannot use "global a" directly; that raises
              # SyntaxError: name 'a' is parameter and global
    b = a
    def g():
        print(eval('b'))
    return g()

f(1)
Output: 1
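An alternative sketch that avoids mutating global state is to snapshot f's namespace with locals() and pass it to eval explicitly:

def f(a):
    env = locals()  # snapshot of f's namespace: {'a': 1}
    def g():
        print(eval('a', globals(), env))
    return g()

f(1)  # prints 1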