Implementing Linear Regression and Polynomial Regression
Building Linear Regression
```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Generate sample data (seeded for reproducibility)
np.random.seed(42)
X = np.random.rand(100, 1) * 10        # Features
y = 2.5 * X + np.random.randn(100, 1)  # Targets with noise

# Split the data into training and test sets
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Train the model
model = LinearRegression()
model.fit(X_train, y_train)

# Predict on the held-out test set
y_pred = model.predict(X_test)

# Plot the test data and the fitted line
plt.scatter(X_test, y_test, color='blue', label='Actual')
plt.plot(X_test, y_pred, color='red', label='Predicted')
plt.title('Linear Regression')
plt.xlabel('X')
plt.ylabel('y')
plt.legend()
plt.show()
```
Output:
In this example, we generate random data points and fit a linear regression model to capture the relationship between the feature X and the target y. Because y was generated as 2.5 · X plus Gaussian noise, the learned slope should come out close to 2.5. The plot shows the test points (in blue) and the fitted line (in red).
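Since the synthetic data was generated with a known slope, we can check how well the fit recovers it. The sketch below refits the same kind of data (this standalone version uses NumPy's newer `default_rng` generator, an assumption not in the snippet above) and prints the learned coefficients and the R² score:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Reproduce the synthetic data: true slope 2.5, unit-variance noise
rng = np.random.default_rng(42)
X = rng.random((100, 1)) * 10
y = 2.5 * X + rng.standard_normal((100, 1))

model = LinearRegression()
model.fit(X, y)

# The learned slope should land close to the true value of 2.5,
# and R^2 should be near 1 because the noise is small relative to the signal
print("slope:", model.coef_[0][0])
print("intercept:", model.intercept_[0])
print("R^2:", model.score(X, y))
```

Running this is a quick sanity check: a slope far from 2.5 or a low R² would signal a bug in the data generation or the fit.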
Building Polynomial Regression
```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.model_selection import train_test_split

# Generate sample data (seeded for reproducibility)
np.random.seed(42)
X = np.random.rand(100, 1) * 10                      # Features
y = 2.5 * X**2 - 1.5 * X + np.random.randn(100, 1)   # Quadratic targets with noise

# Split the data into training and test sets
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Expand features to [1, x, x^2] so a linear model can fit the curve
poly = PolynomialFeatures(degree=2)
X_train_poly = poly.fit_transform(X_train)
X_test_poly = poly.transform(X_test)

# Train the model on the polynomial features
model = LinearRegression()
model.fit(X_train_poly, y_train)

# Predictions
y_pred = model.predict(X_test_poly)

# Sort test points by X so the fitted curve plots smoothly
order = X_test.ravel().argsort()
plt.scatter(X_test, y_test, color='blue', label='Actual')
plt.plot(X_test[order], y_pred[order], color='red', label='Predicted')
plt.title('Polynomial Regression')
plt.xlabel('X')
plt.ylabel('y')
plt.legend()
plt.show()
```
Output:
In this example, we generate random data points with a quadratic relationship and fit a degree-2 polynomial regression model. The plot shows the test points (in blue) and the model's predictions (in red); because the expanded features include an x² term, the fitted curve can bend to follow the data.
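To see why the polynomial features matter, it helps to fit both a plain linear model and a degree-2 model to the same quadratic data and compare their errors. This is a minimal sketch (the seed and the use of mean squared error as the comparison metric are my choices, not from the original snippet):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.metrics import mean_squared_error

# Same kind of quadratic data as above
rng = np.random.default_rng(42)
X = rng.random((100, 1)) * 10
y = 2.5 * X**2 - 1.5 * X + rng.standard_normal((100, 1))

# A straight line underfits the quadratic relationship
linear = LinearRegression().fit(X, y)
mse_linear = mean_squared_error(y, linear.predict(X))

# With [1, x, x^2] features, the residual error drops to roughly the noise level
poly = PolynomialFeatures(degree=2)
X_poly = poly.fit_transform(X)
quadratic = LinearRegression().fit(X_poly, y)
mse_quadratic = mean_squared_error(y, quadratic.predict(X_poly))

print(f"linear MSE:    {mse_linear:.2f}")
print(f"quadratic MSE: {mse_quadratic:.2f}")
```

The gap between the two MSE values is the cost of forcing a straight line onto curved data; the quadratic model's error should be close to the variance of the injected noise.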
Linear vs. Polynomial Regression: Understanding the Differences
Regression analysis is a cornerstone technique in data science and machine learning, used to model the relationship between a dependent variable and one or more independent variables. Among the various types of regression, Linear Regression and Polynomial Regression are two fundamental approaches.
This article delves into the differences between these two methods, their applications, advantages, and limitations.
Table of Contents
- What is Linear Regression?
- What is Polynomial Regression?
- Key Differences Between Linear and Polynomial Regression
- Understanding Practical Examples for Linear and Polynomial Regression
- When to Use Linear Regression vs. Polynomial Regression
- Implementing Linear Regression and Polynomial Regression
- Advantages and Disadvantages of Regression Models