I'm trying out scikit-learn's LinearRegression model on a simple dataset (it comes from Andrew Ng's Coursera course; it doesn't really matter, see the plot for reference).
This is my script:
import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression

# Load the dataset (comma-separated: population, profit)
dataset = np.loadtxt('../mlclass-ex1-008/mlclass-ex1/ex1data1.txt', delimiter=',')
X = dataset[:, 0]
Y = dataset[:, 1]

# Scatter plot of the raw data
plt.figure()
plt.ylabel('Profit in $10,000s')
plt.xlabel('Population of City in 10,000s')
plt.grid()
plt.plot(X, Y, 'rx')

# fit() expects a 2-D feature array, hence X[:, np.newaxis]
model = LinearRegression()
model.fit(X[:, np.newaxis], Y)

# Plot the fitted regression line over the data
plt.plot(X, model.predict(X[:, np.newaxis]), color='blue', linewidth=3)

print('Coefficients: \n', model.coef_)
plt.show()
My question is: I expect two coefficients for this linear model, the intercept term and the x coefficient. How come I only get one?
OOPS
I didn't notice that the intercept is a separate attribute of the model!
print('Intercept: \n', model.intercept_)
See the documentation here:
intercept_ : array
Independent term in the linear model.
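To make the split between the two attributes concrete, here is a minimal sketch on hypothetical synthetic data (y = 2x + 1, so the slope and intercept are known in advance): `coef_` holds only the slope(s), while the intercept lives in `intercept_`.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical data generated from y = 2*x + 1,
# so we expect coef_ ~ [2.0] and intercept_ ~ 1.0
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0

model = LinearRegression()
model.fit(x[:, np.newaxis], y)  # reshape 1-D x to the (n_samples, 1) shape fit() expects

print('Coefficients:', model.coef_)   # slope(s) only
print('Intercept:', model.intercept_) # intercept is stored separately
```

If you instead want the intercept folded into `coef_`, one option is to pass `fit_intercept=False` and append a column of ones to the feature matrix yourself.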