In internal modules like peephole.c, the argument of LOAD_CONST is stored in the two bytes following the opcode. For example, the macro it uses to read an operation's argument is implemented as:
#define GETARG(arr, i) ((int)((arr[i+2]<<8) + arr[i+1]))
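As a minimal sketch, the C macro's little-endian read can be mirrored in Python (the opcode value 100 for LOAD_CONST is an assumption that holds for older CPython versions; it is only used here as an illustrative byte):

```python
def getarg(bytecode, i):
    """Return the 2-byte argument of the opcode at index i (pre-3.6 bytecode layout)."""
    # Same arithmetic as GETARG: high byte at i+2, low byte at i+1.
    return (bytecode[i + 2] << 8) + bytecode[i + 1]

# Hypothetical instruction: opcode 100 followed by the argument 0x1234 little-endian.
code = bytes([100, 0x34, 0x12])
print(getarg(code, 0))  # 4660 (0x1234)
```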
The argument of LOAD_CONST is an index into the consts array.
So I guessed that a Python function can use at most 2^16 constants. But when I experimented with a function that uses 66,666 (> 65,536) constants, it still ran normally.
What could be the reason?
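A hedged version of such an experiment, scaled down so it compiles quickly: on CPython 3.6+ each bytecode argument defaults to one byte, so roughly 300 distinct constants is already enough to push a constant index past 255 and force the compiler to emit EXTENDED_ARG. The function name `f` and the use of distinct float literals (which the compiler cannot fold or deduplicate) are choices made for this sketch:

```python
import dis

# Generate a function with ~300 distinct float constants.
N = 300
src = "def f():\n" + "".join(f"    v{i} = {i}.5\n" for i in range(N))
src += "    return v%d\n" % (N - 1)
ns = {}
exec(src, ns)
f = ns["f"]

print(len(f.__code__.co_consts))  # more than 256 distinct constants

# Scan opcode positions (even offsets in 3.6+ wordcode) for EXTENDED_ARG.
ext = dis.opmap["EXTENDED_ARG"]
code = f.__code__.co_code
has_extended_arg = any(code[i] == ext for i in range(0, len(code), 2))
print(has_extended_arg)  # True
```

The function still compiles and runs; the oversized constant indices are simply encoded with EXTENDED_ARG prefixes.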
From the dis docs:
EXTENDED_ARG(ext)
Prefixes any opcode which has an argument too big to fit into the default two bytes. ext holds two additional bytes which, taken together with the subsequent opcode’s argument, comprise a four-byte argument, ext being the two most-significant bytes.
If an opcode needs an argument longer than 2 bytes, an EXTENDED_ARG opcode provides 2 more bytes of argument.
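A minimal decoder sketch showing how EXTENDED_ARG prefixes accumulate. Note this uses the CPython 3.6+ wordcode layout (each instruction is an opcode byte plus a one-byte argument, and each EXTENDED_ARG contributes one more byte), whereas the docs quoted above describe the older 2-byte default; the chaining principle is the same:

```python
import dis

def decode(code):
    """Decode 3.6+ wordcode, folding EXTENDED_ARG prefixes into the next opcode's argument."""
    instructions = []
    ext = 0
    for i in range(0, len(code), 2):
        op, arg = code[i], code[i + 1]
        if op == dis.opmap["EXTENDED_ARG"]:
            # Shift the accumulated prefix up one byte; the real opcode follows.
            ext = (ext | arg) << 8
        else:
            instructions.append((dis.opname[op], ext | arg))
            ext = 0
    return instructions

# Synthetic example: EXTENDED_ARG 1 followed by LOAD_CONST 4
# yields an effective argument of (1 << 8) | 4 = 260.
example = bytes([dis.opmap["EXTENDED_ARG"], 1, dis.opmap["LOAD_CONST"], 4])
print(decode(example))  # [('LOAD_CONST', 260)]
```

So the 2-byte (or, on modern CPython, 1-byte) field is only the default; the compiler chains EXTENDED_ARG prefixes whenever an index such as a constant's position in co_consts does not fit, which is why a function with more than 65,536 constants still works.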