I have an X array containing NaNs, and I can remove the NaN entries like this:
import numpy as np
x = x[~np.isnan(x)]
But I have a corresponding Y array:
assert len(x) == len(y) # True
x = x[~np.isnan(x)]
assert len(x) == len(y) # now False, raises AssertionError
How do I remove the corresponding rows from the Y array?
My X array looks like this:
>>> x
[[ 2.67510434 2.67521927 3.49296989 3.80100625 4. 2.83631844]
[ 3.47538057 3.4752436 3.62245715 4.0720535 5. 3.7773169 ]
[ 2.6157049 2.61583852 3.48335887 3.78088813 0. 2.78791096]
...,
[ 3.60408952 3.60391203 3.64328267 4.1156462 5. 3.77933333]
[ 2.66773792 2.66785516 3.49177798 3.7985113 4. 2.83631844]
[ 3.26622238 3.26615124 3.58861468 4.00121327 5. 3.49693169]]
But something weird is going on when I inspect the mask:
indexes = ~np.isnan(x)
print(indexes)
[out]:
[[ True True True True True True]
[ True True True True True True]
[ True True True True True True]
...,
[ True True True True True True]
[ True True True True True True]
[ True True True True True True]]
You are getting rid of individual items that are NaN, not of rows that contain a NaN. The proper thing to do is build a row mask and apply it to both arrays:
mask = ~np.any(np.isnan(x), axis=1)
x = x[mask]
y = y[mask]
To see the different behavior of the two approaches:
>>> x = np.random.rand(4, 5)
>>> x[[0, 2], [1, 4]] = np.nan
>>> x
array([[ 0.37499461, nan, 0.51254549, 0.5253203 , 0.3955948 ],
[ 0.73817831, 0.70381481, 0.45222295, 0.68540433, 0.76113544],
[ 0.1651173 , 0.41594257, 0.66327842, 0.86836192, nan],
[ 0.70538764, 0.31702821, 0.04876226, 0.53867849, 0.58784935]])
>>> x[~np.isnan(x)] # 1D array with NaNs removed
array([ 0.37499461, 0.51254549, 0.5253203 , 0.3955948 , 0.73817831,
0.70381481, 0.45222295, 0.68540433, 0.76113544, 0.1651173 ,
0.41594257, 0.66327842, 0.86836192, 0.70538764, 0.31702821,
0.04876226, 0.53867849, 0.58784935])
>>> x[~np.any(np.isnan(x), axis=1)] # 2D array with rows with NaN removed
array([[ 0.73817831,  0.70381481,  0.45222295,  0.68540433,  0.76113544],
       [ 0.70538764,  0.31702821,  0.04876226,  0.53867849,  0.58784935]])
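To tie this back to the original question, here is a minimal sketch (with a small made-up x and y of matching length) showing that applying the same row mask to both arrays keeps them aligned:

```python
import numpy as np

# Hypothetical data: x has NaNs in rows 0 and 2; y matches x row-for-row
x = np.array([[1.0, np.nan],
              [3.0, 4.0],
              [np.nan, 6.0],
              [7.0, 8.0]])
y = np.array([10, 20, 30, 40])

mask = ~np.any(np.isnan(x), axis=1)  # True for rows with no NaN at all

x_clean = x[mask]  # keeps rows 1 and 3, still 2D
y_clean = y[mask]  # keeps the corresponding y entries

assert len(x_clean) == len(y_clean)  # lengths stay in sync
```

Because the same boolean mask indexes both arrays, the row correspondence between x and y is preserved automatically.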