As the documentation says, "DumpSave writes out definitions in a binary format that is optimized for input by Mathematica." Is there a way to convert a Mathematica binary dump file back to the list of definitions without evaluating them? Import["file.mx", "HeldExpression"] does not work...
DumpSave stores the values associated with a symbol, i.e. OwnValues, DownValues, UpValues, SubValues, DefaultValues, NValues, and FormatValues.
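For reference, a small illustration of where different kinds of definitions land (the symbols g and h are hypothetical):

g[x_] := x + 1;  (* creates a rule stored in DownValues[g] *)
h /: g[h] = 0;   (* creates a rule stored in UpValues[h] *)
DownValues[g]
{HoldPattern[g[x_]] :> x + 1}
UpValues[h]
{HoldPattern[g[h]] :> 0}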
All the evaluation was done in the Mathematica session, and then DumpSave saved its results. These values are stored in an internal format. Reading an MX file only creates the symbols and populates them with these values by reading that internal format back, bypassing the evaluator.
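A minimal sketch of this behavior (the file name is hypothetical): a delayed definition with a side effect survives the round trip without the side effect ever firing.

ClearAll[f];
f := (Print["evaluated!"]; 42);
DumpSave["f.mx", f];   (* DumpSave has HoldRest, so f is not evaluated here *)
Remove[f];
<< f.mx                (* loading prints nothing: the evaluator is bypassed *)
OwnValues[f]
{HoldPattern[f] :> (Print["evaluated!"]; 42)}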
Maybe you could share the problem that prompted you to ask this question. In the meantime, here is an experiment that compares the MX files produced by immediate and delayed definitions:
f[x_Real] := x^2 + 1
DumpSave[FileNameJoin[{$HomeDirectory, "Desktop", "set_delayed.mx"}], f];
Remove[f]
f[x_Real] = x^2 + 1;
DumpSave[FileNameJoin[{$HomeDirectory, "Desktop", "set.mx"}], f];
setBytes = Import[FileNameJoin[{$HomeDirectory, "Desktop", "set.mx"}], "Byte"];
setDelayedBytes = Import[FileNameJoin[{$HomeDirectory, "Desktop", "set_delayed.mx"}], "Byte"];
One can then use SequenceAlignment[setBytes, setDelayedBytes] to see the difference. I do not know why it is done that way, but my point stands: all the evaluation of values constructed using Set has already been done in the Mathematica session before they were saved by DumpSave. When an MX file is read, the internal representation is read back into the Mathematica session, and no evaluation of the loaded definitions is performed.
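If you only want the differing byte runs, a small convenience (not part of the original experiment) is to filter the alignment; SequenceAlignment returns common runs as flat lists of bytes and differences as pairs of lists:

Cases[SequenceAlignment[setBytes, setDelayedBytes], {_List, _List}]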
You can assign Rules instead of RuleDelayeds to DownValues, which is equivalent to the immediate definitions. The right-hand side of the assignment stays unevaluated and is copied literally, so the command corresponding to
Clear[f];
f[x_Real] = x^2 + 1;
DumpSave["f.mx", f];
Clear[f];
f = a;
<< f.mx;
Definition[f]
would be
Clear[f];
f = a;
DownValues[f] := {f[x_Real] -> x^2 + 1}
Definition[f]
f = a
f[x_Real] = x^2 + 1
(cf. your example Clear[f]; f = a; f[x_Real] = x^2 + 1; Definition[f], which does not work, assigning a rule for a[x_Real] instead; see the check below). This is robust to prior assignments to x as well.
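A minimal verification of that failure mode, assuming a fresh kernel:

Clear[f, a];
f = a;
f[x_Real] = x^2 + 1;   (* the head f evaluates to a before the rule is attached *)
Definition[a]
a[x_Real] = x^2 + 1
Definition[f]
f = a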
Edit: It is not robust to side effects of the right-hand side, as an example in the comments below shows. To assign a downvalue while avoiding any evaluation, one can use the undocumented System`Private`ValueList, as in the following:
Clear[f];
f := Print["f is evaluated!"];
DownValues[f] := System`Private`ValueList[f[x_Real] -> Print["definition is evaluated!"]];
(no output)
Note that the assignment (in the x^2 + 1 example above) seemingly got converted to delayed rules:
DownValues[f]
{HoldPattern[f[x_Real]] :> x^2 + 1}
but Definition (and Save) show that the distinction from a := has internally been kept. I don't know why DownValues doesn't display the truth.
To answer the original question, you would probably do best importing the dump file and exporting the relevant symbols using Save, and then, if you expect the result to be loaded into a kernel tainted by prior definitions, programmatically converting the assignments into assignments to DownValues as above. It might be easier to scope the variables in a private context before the export, though, which is what the system files do to prevent collisions.
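A sketch of that workflow (the file names dump.mx and defs.m and the symbol f are hypothetical):

<< dump.mx;          (* load the binary definitions *)
Save["defs.m", f]    (* write them back out as plain, readable assignments *)

And a rough sketch of the private-context variant, assuming a fresh kernel:

BeginPackage["MyDefs`"];   (* hypothetical context isolating the pattern variables *)
f;                         (* mention f so it lives in MyDefs` *)
Begin["`Private`"];
f[x_Real] := x^2 + 1;      (* the pattern variable x is really MyDefs`Private`x *)
End[];
EndPackage[];
Save["defs.m", MyDefs`f]   (* x is saved fully qualified, so Global`x cannot collide *)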