How does the Python interpreter handle the undefined global variable (a) inside the function in the following code?
def show():
    print(a)

a = 1
show()
Python is an interpreted language, so it processes code line by line.
Given that, I would expect an error at the line that references the undefined variable (print(a)).
However, the code runs without error.
How does the Python interpreter recognize the undefined variable (a)?
Or is a just treated as a name that isn't resolved until the show function is called?
I disassembled the code to bytecode, but I didn't understand it well...
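For reference, this is roughly what the dis module prints for show (a sketch; the exact bytecode varies between CPython versions):

import dis

def show():
    print(a)

dis.dis(show)

# Prints something like (Python 3.8 shown; newer versions differ slightly):
#   LOAD_GLOBAL   0 (print)
#   LOAD_GLOBAL   1 (a)       <-- 'a' is compiled as a global lookup
#   CALL_FUNCTION 1
#   ...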
When you define a function, Python compiles its body but does not resolve the names inside it. Any name that is never assigned within the function (and does not come from an enclosing function's scope) is compiled as a global lookup, to be resolved later. Python then stores a reference to the function object in the global namespace (which you can inspect with globals()). This namespace is a dictionary that holds global variables and global function references alike.
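You can observe this classification before show is ever called; a small sketch, assuming the definitions above:

def show():
    print(a)

# The compiler has already classified both names as global lookups:
print(show.__code__.co_names)   # ('print', 'a')

# Defining the function registered it in the global namespace:
print('show' in globals())      # True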
When you later define the variable a, Python stores it in that same global dictionary, just like the function before it.
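Continuing that sketch:

a = 1
print('a' in globals())    # True
print(globals()['a'])      # 1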
After that, when you call the function, Python encounters the name a. The bytecode says it is a global lookup, so the value must exist in the global namespace by then. Python looks up the name in the global table and uses whatever value is stored there. If you called show() before assigning a, that same lookup would fail with a NameError instead.
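A minimal demonstration of the call-time lookup (the exact NameError wording varies by version):

def show():
    print(a)

try:
    show()             # 'a' is not in globals() yet, so the lookup fails
except NameError as err:
    print(err)         # name 'a' is not defined

a = 1
show()                 # now the global lookup finds a and prints 1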