
How does the Python interpreter recognize an undefined global variable in a function?

How does the Python interpreter recognize the undefined global variable (a) in the function in the following code?

def show():
    print(a)

a = 1
show()

Python is an interpreted language, so it processes each line of code one at a time.

Given this, I would expect it to throw an error at the line with the undefined variable (print(a)). However, it runs without error.

How does the Python interpreter recognize the undefined variable (a)? Or is a just treated as a name that isn't resolved until the show function is called?

I converted the above code to bytecode, but I didn't understand it well...
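As a pointer for reading that bytecode: the standard dis module can list the instructions of show, and a shows up as a LOAD_GLOBAL, i.e. a name that is only looked up in the global namespace when the function actually runs. A minimal sketch:

```python
import dis

def show():
    print(a)

# List the (opname, argval) pairs of show's bytecode.
# 'a' appears as LOAD_GLOBAL: it is not resolved at definition
# time, only looked up in the global namespace when show() runs.
for ins in dis.get_instructions(show):
    print(ins.opname, ins.argval)
```

The exact instruction stream varies between Python versions, but the LOAD_GLOBAL for a is present in all of them.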

D H asked Apr 27 '26

1 Answer

When you define a function, Python treats its body as a sort of black box. At compile time it classifies every name used in the body: names that are assigned inside the function become locals, while names that are only read (like a here) are compiled as global lookups, to be resolved later. The def statement then stores a reference to the function object in the global table (you can access it using globals()). This global table holds the values of global variables as well as references to module-level functions.
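A quick check of that claim, using the built-in globals():

```python
def show():
    print(a)

# The def statement stored a reference to the function object
# under the name "show" in the module's global table.
print("show" in globals())        # True
print(globals()["show"] is show)  # True
```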
When you define the variable a, Python stores it in that same global dictionary, just like the function before it.
After that, when you call the function, Python executes the global lookup for a: only at that moment does it consult the global table and use whatever value is stored there. It knows the name is not local, so it must have been defined in the global namespace by the time of the call; if it hasn't, the call raises a NameError.

ARK1375 answered Apr 29 '26