Definition "knows" how a value for a symbol was defined: using Set or SetDelayed. But how? As I understand it, after a value has been assigned to a symbol there is no difference for the evaluator in how it was assigned, whether by Set or by SetDelayed. This can be illustrated with the function OwnValues, which always returns definitions with the head RuleDelayed. How does Definition obtain this information?
In[1]:= a=5;b:=5;
        Definition[a]
        Definition[b]
        OwnValues[a]

Out[2]= a=5

Out[3]= b:=5

Out[4]= {HoldPattern[a]:>5}
OwnValues[a] = {HoldPattern[a] -> 3}; OwnValues[a]

gives {HoldPattern[a] :> 3} instead of {HoldPattern[a] -> 3}, but Definition[a] shows what one would expect. Probably this definition is stored internally in the form of Rule but is converted to RuleDelayed by OwnValues in order to suppress evaluation of the r.h.s. of the definition. This hypothesis contradicts my original understanding that there is no difference between values assigned by Set and SetDelayed. Probably such definitions are stored in different forms, Rule and RuleDelayed respectively, but are equivalent from the evaluator's point of view.
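A quick way to probe this is to write the own-values directly in both forms and see what Definition prints afterwards. The following is a sketch of the expected session; the Out lines show what the hypothesis and the observation above predict, and the exact behavior may differ between versions:

In[1]:= OwnValues[a] = {HoldPattern[a] -> 3};   (* store the immediate form *)

In[2]:= OwnValues[a]                            (* reported with :> regardless *)
Out[2]= {HoldPattern[a] :> 3}

In[3]:= Definition[a]                           (* but Definition remembers = *)
Out[3]= a = 3

In[4]:= OwnValues[a] = {HoldPattern[a] :> 3};   (* store the delayed form *)

In[5]:= Definition[a]
Out[5]= a := 3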
It is interesting to see how MemoryInUse[] depends on the kind of definition.

In the following experiment I used the kernel of Mathematica 5.2 in an interactive session without the front end. With the kernels of Mathematica 6 and 7 one will get different results; one reason is that in those versions Set is overloaded by default.

First of all I evaluate $HistoryLength=0; so that DownValues for the In and Out variables do not affect my results. But it seems that even when $HistoryLength is set to 0, the value of In[$Line] for the current input line is still stored and is removed only after the next input is entered. This is likely why the result of the first evaluation of MemoryInUse[] always differs from the second.
Here is what I have got:
Mathematica 5.2 for Students: Microsoft Windows Version
Copyright 1988-2005 Wolfram Research, Inc.
-- Terminal graphics initialized --
In[1]:= $HistoryLength=0;
In[2]:= MemoryInUse[]
Out[2]= 1986704
In[3]:= MemoryInUse[]
Out[3]= 1986760
In[4]:= MemoryInUse[]
Out[4]= 1986760
In[5]:= a=2;
In[6]:= MemoryInUse[]
Out[6]= 1986848
In[7]:= MemoryInUse[]
Out[7]= 1986824
In[8]:= MemoryInUse[]
Out[8]= 1986824
In[9]:= a:=2;
In[10]:= MemoryInUse[]
Out[10]= 1986976
In[11]:= MemoryInUse[]
Out[11]= 1986952
In[12]:= MemoryInUse[]
Out[12]= 1986952
In[13]:= a=2;
In[14]:= MemoryInUse[]
Out[14]= 1986848
In[15]:= MemoryInUse[]
Out[15]= 1986824
In[16]:= MemoryInUse[]
Out[16]= 1986824
One can see that defining a=2; increases MemoryInUse[] by 1986824-1986760 = 64 bytes. Replacing it with the definition a:=2; increases MemoryInUse[] by 1986952-1986824 = 128 bytes, and replacing the latter definition with the former reverts MemoryInUse[] to 1986824 bytes. This means that a delayed definition requires 128 bytes more than an immediate one.

Of course this experiment does not prove my hypothesis.
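For repeating this kind of measurement it may be convenient to wrap the bookkeeping into a helper. The function memoryDelta below is a hypothetical one of my own, not a built-in; it holds its argument, evaluates it, and reports the change in MemoryInUse[]. Absolute numbers will of course differ from the transcript above and between versions:

$HistoryLength = 0;

(* hypothetical helper: evaluate a held expression and report how much
   MemoryInUse[] changed; results depend on version and session state *)
SetAttributes[memoryDelta, HoldFirst];
memoryDelta[expr_] :=
  Module[{before = MemoryInUse[]},
    expr;
    MemoryInUse[] - before]

Then memoryDelta[x = 2] and memoryDelta[y := 2] should show the same kind of gap between immediate and delayed definitions as the transcript above, up to the measurement noise already discussed.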