To feed my generative neural net, I need to normalize some data between -1 and 1. I do it with MinMaxScaler from Sklearn and it works great. Now, my generator is going to output data between -1 and 1. How do I revert the MinMaxScaler to get the real data back?
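In short, the pattern is: fit the scaler on the real training data with feature_range=(-1, 1), then pass the generator's output through scaler.inverse_transform. A minimal sketch, where X_train and generated are hypothetical arrays standing in for your training data and generator output:

import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Hypothetical data, for illustration only
X_train = np.array([[10.0, 200.0], [12.0, 250.0], [15.0, 300.0]])   # real data
generated = np.array([[-0.5, 0.25], [1.0, -1.0]])                   # generator output in [-1, 1]

scaler = MinMaxScaler(feature_range=(-1, 1))
X_scaled = scaler.fit_transform(X_train)      # real data scaled to [-1, 1] for training

# Map the generator's [-1, 1] output back to the original units
X_real = scaler.inverse_transform(generated)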
In this post, I will explain another widely used normalization method, Min-Max scaling, with scikit-learn (class name: MinMaxScaler). It is an alternative to standardization, which rescales features so that they have μ = 0 and σ = 1.
sklearn.preprocessing.MinMaxScaler(feature_range=(0, 1), copy=True) transforms features by scaling each feature to a given range. The estimator scales and translates each feature individually so that, on the training set, it lies within the given range, by default between zero and one.
MinMaxScaler is the scikit-learn object used for normalizing data. Normalization is a feature scaling technique that puts the variable values inside a defined range (such as 0 to 1) so that they all share the same scale.
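For intuition, min-max scaling maps each feature value x to (x - x_min) / (x_max - x_min) and then stretches the result into the requested feature_range. A quick sketch with a made-up one-column array (the variable names are only for illustration), comparing the manual formula with MinMaxScaler:

import numpy as np
from sklearn.preprocessing import MinMaxScaler

x = np.array([[2.0], [1.02], [0.5]])                     # one feature, three samples

# Manual min-max scaling to [0, 1]
manual = (x - x.min()) / (x.max() - x.min())

# Same result with scikit-learn
sklearn_scaled = MinMaxScaler(feature_range=(0, 1)).fit_transform(x)

print(np.allclose(manual, sklearn_scaled))               # True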
How do you reverse the data scaling applied to a variable with scikit-learn in Python? A solution is to use inverse_transform(). For example, to normalize the data you can also use the scikit-learn preprocessing module with StandardScaler, and reverse it the same way:
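A short sketch with a made-up array (names are illustrative only); StandardScaler exposes the same fit_transform / inverse_transform pair:

import numpy as np
from sklearn.preprocessing import StandardScaler

x = np.array([[2.0], [1.02], [0.5]])

scaler = StandardScaler()
standardized = scaler.fit_transform(x)          # mean 0, standard deviation 1
recovered = scaler.inverse_transform(standardized)

print(np.allclose(recovered, x))                # True: original values recovered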
Let us start by defining a pandas DataFrame:

import numpy as np
import pandas as pd
from sklearn import preprocessing

cols = ['A', 'B']
data = pd.DataFrame(np.array([[2, 3], [1.02, 1.2], [0.5, 0.3]]), columns=cols)

Then we scale the data using the MinMaxScaler:
scaler = preprocessing.MinMaxScaler(feature_range = (0,1))
scaled_data = scaler.fit_transform(data[cols])
Now, to invert the transformation, call inverse_transform on the scaler:
scaler.inverse_transform(scaled_data)
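As a quick check, the inverse transform recovers the original values (continuing the example above):

restored = scaler.inverse_transform(scaled_data)
print(np.allclose(restored, data[cols].values))  # True: the original DataFrame values come back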