I have a 5x600x16 array; a smaller example of a similar array is shown below. I need to normalize the values column-wise within each slice (5 slices in total).
tensor([[[9.9771e-01, 6.6219e-02, 8.6409e-03, 1.1918e-05, 2.3837e-05],
         [9.9771e-01, 6.6219e-02, 8.6409e-03, 1.1918e-05, 2.3837e-05]],

        [[9.9525e-01, 9.6969e-02, 7.5091e-03, 1.0301e-05, 3.0902e-05],
         [9.9802e-01, 6.2234e-02, 7.8646e-04, 2.0696e-05, 1.0348e-05]],

        [[9.7093e-01, 2.3617e-01, 3.2587e-02, 0.0000e+00, 0.0000e+00],
         [9.7418e-01, 2.2391e-01, 5.7788e-03, 6.0829e-05, 9.1244e-05]],

        [[9.9781e-01, 6.4524e-02, 1.8817e-03, 1.8268e-05, 0.0000e+00],
         [9.9153e-01, 1.2825e-01, 1.0527e-02, 0.0000e+00, 3.8630e-05]]])
For the purposes of this question, let's consider the array
a = np.array([[[10, 100, 1], [5, 50, .5]], [[10, 1000, 10], [10, 1, 20]]])
I have tried using normalize from torch.nn.functional (imported as f), without success:
>>> f.normalize(torch.from_numpy(a), p=2, dim=2)
tensor([[[0.0995, 0.9950, 0.0099],
         [0.0995, 0.9950, 0.0099]],

        [[0.0100, 0.9999, 0.0100],
         [0.4468, 0.0447, 0.8935]]], dtype=torch.float64)
and a simple function that I created, with a bit more success:

def normalize(data):
    return (data - data.mean()) / (data.max() - data.min())
where I pass each a[...] slice and then stack the results together again.
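Concretely, the loop-and-stack step looks roughly like this (a minimal sketch, assuming a[...] means iterating over the first axis, with normalize being the function above):

import numpy as np

# Normalize each 2D slice separately, then stack the results back
# into a single 3D array.
result = np.stack([normalize(s) for s in a])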
Is there a better way to properly normalize my data in the way I described?
Try this:
import pandas as pd

x = [[[9.9771e-01, 6.6219e-02, 8.6409e-03, 1.1918e-05, 2.3837e-05],
      [9.9771e-01, 6.6219e-02, 8.6409e-03, 1.1918e-05, 2.3837e-05]],
     [[9.9525e-01, 9.6969e-02, 7.5091e-03, 1.0301e-05, 3.0902e-05],
      [9.9802e-01, 6.2234e-02, 7.8646e-04, 2.0696e-05, 1.0348e-05]],
     [[9.7093e-01, 2.3617e-01, 3.2587e-02, 0.0000e+00, 0.0000e+00],
      [9.7418e-01, 2.2391e-01, 5.7788e-03, 6.0829e-05, 9.1244e-05]],
     [[9.9781e-01, 6.4524e-02, 1.8817e-03, 1.8268e-05, 0.0000e+00],
      [9.9153e-01, 1.2825e-01, 1.0527e-02, 0.0000e+00, 3.8630e-05]]]

for b in x:
    # Each slice becomes a DataFrame; transpose so the original rows
    # become DataFrame columns, then min-max scale each column.
    df = pd.DataFrame(b).transpose()
    normalized_df = (df - df.min()) / (df.max() - df.min())
    print(normalized_df)
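If you need the result back as a single array rather than printed DataFrames, the same min-max scaling can be done in one vectorized step without a loop. A sketch, assuming "column-wise in each slice" means reducing over axis 1, so every column of each slice is scaled to [0, 1] independently:

import numpy as np

a = np.array([[[10, 100, 1], [5, 50, .5]],
              [[10, 1000, 10], [10, 1, 20]]])

# Per-slice, per-column min and max (reduce over axis 1; keepdims so the
# results broadcast back against the original 3D shape).
col_min = a.min(axis=1, keepdims=True)
col_max = a.max(axis=1, keepdims=True)

# Guard against constant columns, whose range is 0, to avoid dividing by zero;
# such columns are mapped to 0.
rng = np.where(col_max == col_min, 1.0, col_max - col_min)

normalized = (a - col_min) / rng
print(normalized)

The same pattern should work directly on a torch tensor via t.amin(dim=1, keepdim=True) and t.amax(dim=1, keepdim=True). Note also that f.normalize(..., dim=2) in the question scales each row to unit L2 norm, which is why that attempt did not give column-wise results; dim=1 would normalize along the columns of each slice instead.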