Let's say I have a variable filename = "/home/jimmy/logger.log". I want to have some markdown documentation like this:
md"""
The output is logged at `$(filename)`
"""
There are two workarounds that I know of:
first:
Markdown.parse("The output is logged at `" * filename * "`")
second:
str = "The output is logged at `$(filename)`"
md"$str"
Is there a way to directly tell the markdown macro to do the string interpolation before the backtick parsing?
This was puzzling to figure out, but I think I've got it. No, there is no way to get the @md_str
macro to do string interpolation first, but I have a workaround below.
It took me a while to realize that when an @f_str
or @f_cmd
macro is called using the f"arg"
or f`arg`
syntax, it receives the raw string raw"arg"
as the argument, rather than the expression :("arg")
that a macro would otherwise get:
julia> macro showarg_str(x) show(x) end;
julia> showarg"text$var"
"text\$var"
julia> @showarg_str("text$var")
:("text$(var)")
This behavior is defined in the julia-parser.scm
parser.
But code like md"$expr"
shows the evaluated expr
and not the raw string raw"$expr"
! What's going on? It turns out $expr
gets parsed into an expression while the string is being parsed into Markdown, via custom code in the Markdown module; this is largely implemented in interp.jl
. According to the documentation this is intentional, so that the Markdown trees store the interpolated expressions themselves rather than the strings that would result from evaluating them. In theory, this facilitates advanced custom features. In practice, it gave me a metaphorical headache.
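To see that behavior in the REPL, here is a minimal sketch (x is just a stand-in variable, and the exact display may differ between Julia versions): the interpolated expression is evaluated at the macro call site and its value is stored directly in the Markdown tree.
julia> x = 1 + 1;
julia> (md"the value is $x").content
1-element Array{Any,1}:
 Markdown.Paragraph(Any["the value is ", 2])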
Now, using Markdown.parse would appear to work as you desire because, as a regular function rather than a macro, Markdown.parse
receives the interpolated string, not the raw string or an expression:
julia> Markdown.parse("The output is logged at `$(filename)`") |> show
The output is logged at `/home/jimmy/logger.log`
However, because $
processing occurs again inside Markdown.parse
, there are undesirable/unintuitive behaviors like the following, where the text $filename is parsed into the unevaluated expression :filename even though no interpolation was requested:
julia> Markdown.parse(raw"$filename") |> show
:filename
The following is not documented, but the best way I see to change this behavior is to define and customize a new Markdown flavor in Markdown.flavors
, then call Markdown.parse
with this flavor as an argument. In the flavor's config, we just need to stop blockinterp
and interp
from getting called during parsing. This can be done like so:
let
    # copy the built-in :julia flavor and strip its interpolation handlers
    config = deepcopy(Markdown.flavors[:julia])
    filter!(!=(Markdown.blockinterp), config.regular)
    filter!(!=(Markdown.interp), get(config.inner, '$', []))
    Markdown.flavors[:julia_nointerp] = config
end
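With the flavor registered, Markdown.parse can already be called with it directly. In the following sketch (assuming the let block above has been run and filename is defined as before), Julia's ordinary string interpolation fills in filename before parsing, and the :julia_nointerp flavor leaves any remaining $ characters untouched:
julia> Markdown.parse("The output is logged at `$(filename)`"; flavor=:julia_nointerp) |> show
The output is logged at `/home/jimmy/logger.log`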
To avoid requiring that the Markdown module already be loaded when this configuration runs, we can wrap the setup in a convenience function that registers the flavor lazily:
function md(str, flavor=:julia_nointerp)
    @isdefined(Markdown) || error("the Markdown module must be loaded. Try `using Markdown`.")
    # register the interpolation-free flavor the first time it is needed
    if flavor == :julia_nointerp && !haskey(Markdown.flavors, :julia_nointerp)
        config = deepcopy(Markdown.flavors[:julia])
        filter!(!=(Markdown.blockinterp), config.regular)
        filter!(!=(Markdown.interp), get(config.inner, '$', []))
        Markdown.flavors[:julia_nointerp] = config
    end
    return Markdown.parse(str; flavor=flavor)
end
Now you can call md("markdown code")
to get Markdown parsing with standard string interpolation instead of the special $
parsing that md"markdown code"
does:
julia> filename = "/home/jimmy/logger.log";
julia> md("The output is logged at `$(filename)`") |> show
The output is logged at `/home/jimmy/logger.log`
julia> md("The output is logged at `$(filename)`") |> typeof
Markdown.MD
julia> md("The output is logged at `$(filename)`").content
1-element Array{Any,1}:
Markdown.Paragraph(Any["The output is logged at ", Markdown.Code("", "/home/jimmy/logger.log")])
P.S.: I don't recommend turning md
into a macro or trying to modify the @md_str
macro because: (1) a function suffices, and (2) using a function means the user can be certain that standard string interpolation will occur before the argument is passed to md
.