Nearest-neighbor inpainting? #10
Hey! Good to hear from you! Hope you're doing well 😄 Not sure if it was related to that issue; I just regularly used the MATLAB version and thought I could just translate it to Julia. I think your suggestion makes sense, but I have very little free time to play with this unfortunately... Feel free to add any method you like in a PR and I will gladly merge it.
I am! Hope you've been doing well too! Glad to hear nearest-neighbor inpainting is within scope! I don't think it's implemented in any Julia package, so I'm still trying to figure out whether to use Interpolations.jl (probably can't), NearestNeighbors.jl (might be too slow?), or Inpaintings.jl. Or just write something from scratch. I'll have a look at the code here and see if I can contribute a solution. Just need to figure out how to accomplish nearest-neighbor inpainting!
FWIW I've used NearestNeighbors.jl and found it worked fast enough for me.
Here's what I ended up coming up with for 1D and 2D arrays, but I found it very slow for big arrays with millions of elements and a significant fraction of NaNs. I guess it's the

```julia
using NearestNeighbors

function nearest_neighbor_inpaint_1d(data, x₁)
    N = length(x₁)
    points = reshape(x₁, (1, N))        # 1×N coordinate matrix for the KDTree
    new_data = similar(data)
    tree = KDTree(points)
    skip_nan = i -> isnan(data[i])      # skip candidate neighbors whose value is NaN
    for i in 1:N
        if isnan(data[i])
            x★ = x₁[i]
            i_nn = nn(tree, [x★], skip_nan)[1]   # index of the nearest non-NaN point
            new_data[i] = data[i_nn]
        else
            new_data[i] = data[i]
        end
    end
    return new_data
end

function nearest_neighbor_inpaint_2d(data, x₁, x₂)
    N₁, N₂ = size(data)
    N = N₁ * N₂
    points = zeros(2, N)                # 2×N coordinate matrix, one column per grid point
    for (n, I) in enumerate(CartesianIndices(data))
        i, j = Tuple(I)
        points[1, n] = x₁[i]
        points[2, n] = x₂[j]
    end
    new_data = similar(data)
    tree = KDTree(points)
    skip_nan = n -> isnan(data[n])      # tree indices are linear indices into `data`
    for I in CartesianIndices(data)
        i, j = Tuple(I)
        if isnan(data[I])
            x₁★ = x₁[i]
            x₂★ = x₂[j]
            n_nn = nn(tree, [x₁★, x₂★], skip_nan)[1]
            I_nn = CartesianIndices(data)[n_nn]  # convert the linear index back to a CartesianIndex
            new_data[I] = data[I_nn]
        else
            new_data[I] = data[I]
        end
    end
    return new_data
end
```
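In case it helps anyone finding this later, here's a minimal usage sketch of the 2D function above (the grid, coordinates, and NaN holes are made up for illustration):

```julia
# Hypothetical example: a smooth field on a 100×80 grid with a rectangular NaN hole.
x₁ = collect(range(0, 1, length = 100))
x₂ = collect(range(0, 1, length = 80))
data = [sin(3a) * cos(2b) for a in x₁, b in x₂]
data[10:12, 20:25] .= NaN

filled = nearest_neighbor_inpaint_2d(data, x₁, x₂)
@assert !any(isnan, filled)   # every hole now holds its nearest non-NaN value
```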
I don't have time to dive into this so I'm not sure what to say! 🤷♂️ But I just remembered I used something like this in my wheel-reinventing OceanGrids.jl (link):

```julia
function Interpolations.interpolate(x, g::T) where {T<:OceanCurvilinearGrid}
    # This is weird: I am overloading Interpolations.interpolate with a NearestNeighbors function,
    # but it's to keep things working. TODO: Think of a better way to do all this
    brutetree = BruteTree([lonvec(g)'; latvec(g)'; depthvec(g)'])
    return (y, x, z) -> nn(brutetree, [x'; y'; z'])[1]
end
```

and maybe (maybe!) the "vectorized" form helps with performance. Maybe that is useful to you? (As you can tell by the comments, I wasn't entirely satisfied with it, but for reasons related to the odd Interpolations.jl/NearestNeighbors.jl mix.)
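For reference, here is a sketch of what that batched ("vectorized") form could look like for the 2D case shared above. The helper name is hypothetical and this is not part of Inpaintings.jl or OceanGrids.jl; the idea is to build the tree from the non-NaN points only and query every NaN location in a single `nn` call, so no skip predicate is needed:

```julia
using NearestNeighbors

# Hypothetical batched variant: tree over valid points only, one nn() call for all holes.
function nearest_neighbor_inpaint_2d_batched(data, x₁, x₂)
    valid = findall(!isnan, data)    # CartesianIndices of known values
    holes = findall(isnan, data)     # CartesianIndices of the NaNs to fill
    isempty(holes) && return copy(data)

    # Build 2×N coordinate matrices for the tree points and the query points.
    coords(inds) = [k == 1 ? x₁[I[1]] : x₂[I[2]] for k in 1:2, I in inds]

    tree = KDTree(coords(valid))
    idxs, _ = nn(tree, coords(holes))   # nearest valid point for every hole, in one call

    new_data = copy(data)
    for (h, I) in enumerate(holes)
        new_data[I] = data[valid[idxs[h]]]
    end
    return new_data
end
```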
No worries, I wasn't expecting us to do anything with it! Just sharing my code in case someone stumbling upon this issue finds it useful. Thanks for sharing your OceanGrids.jl implementation! Yeah, I ended up just doing a quick and dirty flood fill instead of a more proper nearest-neighbor interpolation, which did the job, but I might revisit this problem in the future.
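Roughly, a quick and dirty flood fill of that sort might look like the sketch below (a sketch under assumptions, not the exact code used): it repeatedly copies a value into each NaN cell from any non-NaN 4-neighbor until no NaNs remain, which requires at least one non-NaN value in the array.

```julia
# Hypothetical flood-fill NaN filler: values propagate outward one ring per pass.
function flood_fill_nans(data::AbstractMatrix)
    filled = copy(data)
    while any(isnan, filled)
        previous = copy(filled)   # snapshot so each pass only propagates one layer
        for I in CartesianIndices(filled)
            isnan(previous[I]) || continue
            for Δ in (CartesianIndex(1, 0), CartesianIndex(-1, 0),
                      CartesianIndex(0, 1), CartesianIndex(0, -1))
                J = I + Δ
                if checkbounds(Bool, previous, J) && !isnan(previous[J])
                    filled[I] = previous[J]   # take the first non-NaN neighbor found
                    break
                end
            end
        end
    end
    return filled
end
```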
Hey @briochemc, thank you for developing this package! Not sure if it came out of JuliaMath/Interpolations.jl#330, but it feels like exactly what I was looking for.
I actually didn't know about inpainting before I came across this repo. I know in issue #5 you have limited the scope of this package to certain useful methods.
Would nearest-neighbor interpolation/extrapolation be considered inpainting and be within scope for this package? I guess it doesn't really involve evolving a diffusion equation so maybe not?