The convolution function in Julia has the following behaviour:
Julia Version 0.6.1 (2017-10-24 22:15 UTC), official x86_64-pc-linux-gnu release
julia> conv([1,2,NaN],[1])
3-element Array{Float64,1}:
NaN
NaN
NaN
Is this a bug, or is it related to the FFT algorithm used by conv?
How can I compute a convolution in Julia (not manually, for performance reasons) if one of the vectors contains NaNs? In this example, the result should be:
3-element Array{Float64,1}:
1.0
2.0
NaN
It's because the conv function in Julia uses the FFT. Every frequency coefficient of a DFT is a sum over all input samples, so a single NaN in the input makes every coefficient NaN, and the inverse transform then returns NaN everywhere.
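To make the poisoning visible, here is a minimal hand-rolled DFT (naive_dft is an illustrative name, not the routine conv actually calls): because each coefficient sums over all inputs, one NaN contaminates every bin.

```julia
# Naive DFT: X[k] = sum_n x[n] * exp(-2πi·k·n/N).
# Each output sums over ALL inputs, so one NaN input makes every bin NaN,
# and the inverse transform is then NaN everywhere too.
function naive_dft(x::Vector{<:Number})
    N = length(x)
    [sum(x[n+1] * exp(-2im * pi * k * n / N) for n in 0:N-1) for k in 0:N-1]
end

X = naive_dft([1.0, 2.0, NaN, 4.0])
all(isnan, real.(X))   # every frequency bin is NaN
```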
What do you mean "not manually"? Writing your own convolution function in Julia should still be fast, e.g.:
function native_conv(u::Vector{T}, v::Vector{V}) where {T,V}
    m = length(u)
    n = length(v)
    w = zeros(promote_type(T, V), m + n - 1)   # full convolution length
    @inbounds for j in 1:m, k in 1:n
        w[j + k - 1] += u[j] * v[k]            # direct sum: a NaN only affects the outputs it overlaps
    end
    return w
end
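As a sanity check, the direct time-domain sum reproduces exactly the output the question asks for (direct_conv below restates the same loop so this snippet runs on its own):

```julia
# Direct (time-domain) convolution: each output sample only sums the
# input samples that overlap it, so a NaN stays local instead of
# poisoning the whole result as it does through the FFT.
function direct_conv(u::Vector{T}, v::Vector{V}) where {T,V}
    w = zeros(promote_type(T, V), length(u) + length(v) - 1)
    for j in eachindex(u), k in eachindex(v)
        w[j + k - 1] += u[j] * v[k]
    end
    return w
end

direct_conv([1.0, 2.0, NaN], [1.0])   # -> [1.0, 2.0, NaN]
```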
Or this might be faster: do the conv with the FFT with the NaNs set to zero, then re-normalize. Edit: it's faster when v (the filter) is large.
function nanconv(u::Array{T,1}, v) where T
    nans = find(isnan, u)             # NaN positions (findall in Julia >= 0.7)
    u = copy(u)
    u[nans] .= zero(T)                # set NaNs to zero
    pass1 = conv(u, v)                # do convolution with NaNs zeroed
    mask = ones(T, length(u))
    mask[nans] .= zero(T)             # 1 where valid, 0 at the NaNs
    norm = conv(mask, v) ./ conv(ones(T, length(u)), v)  # valid fraction per output (keeps edge effects intact)
    w = pass1 ./ norm                 # re-normalize
    w[reshape_nans(nans, length(v))] .= NaN   # restore NaN at every output a NaN touched
    return w
end
function reshape_nans(nans, lv)
    # Expand each NaN input index i to the lv output indices i:(i+lv-1) it touches.
    out = zeros(Int, length(nans) * lv)
    j = 1
    for i in nans
        out[j:j+lv-1] = i:(i+lv-1)
        j += lv
    end
    return out
end
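Putting the pieces together, here is a self-contained sketch of the zero-fill-and-renormalize idea, written against current Julia (findall replaced find in 1.0). dconv and nan_aware_conv are illustrative names; dconv stands in for the FFT-based conv so the snippet needs no packages.

```julia
# Stand-in for conv: plain time-domain convolution.
function dconv(u::Vector{Float64}, v::Vector{Float64})
    w = zeros(length(u) + length(v) - 1)
    for j in eachindex(u), k in eachindex(v)
        w[j + k - 1] += u[j] * v[k]
    end
    return w
end

function nan_aware_conv(u::Vector{Float64}, v::Vector{Float64})
    bad  = findall(isnan, u)
    u0   = copy(u); u0[bad] .= 0.0              # NaNs -> 0
    mask = ones(length(u)); mask[bad] .= 0.0    # 1 where valid, 0 at NaNs
    full = ones(length(u))
    # Rescale each output by the fraction of its inputs that were valid;
    # dividing by dconv(full, v) keeps the usual edge effects intact.
    w = dconv(u0, v) .* dconv(full, v) ./ dconv(mask, v)
    for i in bad, t in i:(i + length(v) - 1)    # restore the NaN footprint
        w[t] = NaN
    end
    return w
end

nan_aware_conv([1.0, 2.0, NaN], [1.0])   # -> [1.0, 2.0, NaN]
```

The renormalization is a statistical patch: it treats the missing samples as "average" rather than zero, which is usually what you want for filtering, but the final loop still marks every output that a NaN actually touched.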