 

Linear regression with matplotlib / numpy


Is Polyfit linear regression?

Using polyfit, you can fit second-, third-, etc. degree polynomials to your dataset, too. (That's not called linear regression anymore, but polynomial regression.)
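As a minimal sketch of that, with made-up sample points, the only change from the straight-line fits below is the degree argument passed to polyfit:

import numpy as np

x = [0, 1, 2, 3, 4]
y = [0.1, 1.1, 3.9, 9.2, 15.8]   # made-up points that roughly follow y = x**2

coefs = np.polyfit(x, y, 2)      # degree 2 -> quadratic (polynomial) regression
quad = np.poly1d(coefs)          # callable polynomial built from the coefficients
print(quad(5))                   # estimate for a new x value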


arange generates lists (well, numpy arrays); type help(np.arange) for the details. You don't need to call it on existing lists.
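For reference, this is what np.arange produces on its own:

>>> import numpy as np
>>> np.arange(4)
array([0, 1, 2, 3])
>>> np.arange(1, 5)
array([1, 2, 3, 4])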

>>> x = [1,2,3,4]
>>> y = [3,5,7,9] 
>>> 
>>> m,b = np.polyfit(x, y, 1)
>>> m
2.0000000000000009
>>> b
0.99999999999999833

I should add that I tend to use poly1d here rather than write out "m*x+b" and the higher-order equivalents, so my version of your code would look something like this:

import numpy as np
import matplotlib.pyplot as plt

x = [1,2,3,4]
y = [3,5,7,10] # 10, not 9, so the fit isn't perfect

coef = np.polyfit(x,y,1)
poly1d_fn = np.poly1d(coef) 
# poly1d_fn is now a function which takes in x and returns an estimate for y

plt.plot(x,y, 'yo', x, poly1d_fn(x), '--k') #'--k'=black dashed line, 'yo' = yellow circle marker

plt.xlim(0, 5)
plt.ylim(0, 12)
plt.show()

(Resulting plot: the data points as yellow circles with the black dashed fitted line.)
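If you want numbers rather than just the picture, poly1d_fn can also be evaluated directly; for this sample data the fitted slope and intercept come out near 2.3 and 0.5:

print(coef)                      # approximately [2.3, 0.5] for the data above
print(poly1d_fn(4))              # roughly 9.7, the fitted value at x = 4
print(poly1d_fn([1, 2, 3, 4]))   # the fitted y values drawn as the dashed line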


This code:

from scipy.stats import linregress

linregress(x,y) #x and y are arrays or lists.

returns a result object with the following fields:

slope (float): slope of the regression line
intercept (float): intercept of the regression line
rvalue (float): correlation coefficient
pvalue (float): two-sided p-value for a hypothesis test whose null hypothesis is that the slope is zero
stderr (float): standard error of the estimate

Source: the scipy.stats.linregress documentation.
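In recent SciPy versions the return value behaves like a named tuple, so you can either unpack it or access the fields by name; a minimal sketch:

from scipy.stats import linregress

x = [1, 2, 3, 4]
y = [3, 5, 7, 10]

result = linregress(x, y)
print(result.slope, result.intercept)   # parameters of the fitted line
print(result.rvalue ** 2)               # R-squared of the fit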


import numpy as np
import matplotlib.pyplot as plt 
from scipy import stats

x = np.array([1.5,2,2.5,3,3.5,4,4.5,5,5.5,6])
y = np.array([10.35,12.3,13,14.0,16,17,18.2,20,20.7,22.5])
gradient, intercept, r_value, p_value, std_err = stats.linregress(x,y)
mn=np.min(x)
mx=np.max(x)
x1=np.linspace(mn,mx,500)
y1=gradient*x1+intercept
plt.plot(x,y,'ob')
plt.plot(x1,y1,'-r')
plt.show()

Use this.


from pylab import *

import numpy as np

x1 = [1, 2, 3, 4]        # example data as plain Python lists
y1 = [3, 5, 7, 10]
x = np.array(x1)         # convert the lists into numpy arrays
y = np.array(y1)
m, b = polyfit(x, y, 1)  # fit a first-degree polynomial (a straight line)

plot(x, y, 'yo', x, m*x + b, '--k')
show()

Another quick and dirty answer is that you can just convert your list to an array using:

import numpy as np
arr = np.asarray(listname)

Linear regression is a good starting point for artificial intelligence.

Here is a good example of a machine learning algorithm, multiple linear regression, in Python:

##### Predicting House Prices Using Multiple Linear Regression - @Y_T_Akademi
    
#### In this project we will see how a machine learning algorithm can help us predict house prices. Linear regression models new, future data by using the correlation in the existing data: machine learning identifies the relationship between the feature data and the output, so we can predict future values.

import pandas as pd

##### We use the sklearn library for many machine learning calculations.

from sklearn import linear_model

##### We import our dataset: housepricesdataset.csv

df = pd.read_csv("housepricesdataset.csv",sep = ";")

##### Our feature set is area, roomcount and buildingage; the output (result) column is price.
##### We define and fit a linear regression model here:

reg = linear_model.LinearRegression()
reg.fit(df[['area', 'roomcount', 'buildingage']], df['price'])

# Since our model is ready, we can make predictions now:
# Let's predict a house with 230 square meters, 4 rooms and a 10-year-old building:

reg.predict([[230,4,10]])

# Now let's predict a house with 230 square meters, 6 rooms and a 0-year-old building (a brand-new one):
reg.predict([[230,6,0]])

# Now let's predict a house with 355 square meters, 3 rooms and a 20-year-old building:
reg.predict([[355,3,20]])

# You can make as many predictions as you want, even in a single call:
reg.predict([[230,4,10], [230,6,0], [355,3,20], [275, 5, 17]])
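If you also want to inspect the fitted model itself rather than only its predictions, the scikit-learn estimator exposes the learned weights and intercept after fit():

# One weight per feature, in the order [area, roomcount, buildingage], plus the intercept:
print(reg.coef_)
print(reg.intercept_)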

My dataset, housepricesdataset.csv, was attached as a screenshot; it contains the columns area, roomcount, buildingage and price.
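Since the screenshot is not reproduced here, the file layout implied by the read_csv call (sep=";") and the column names used in the code would look something like this, with the data rows omitted:

area;roomcount;buildingage;price
...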