Linear Regression
Linear Regression is usually the first algorithm learned in Machine Learning because:
· It is simple and easy to understand.
· It serves as a foundational algorithm for more advanced techniques.
We are studying three types of Linear Regression, namely:
1. Simple Linear Regression
2. Multiple Linear Regression
3. Polynomial Linear Regression
· Simple Linear Regression explains the relationship between one independent variable and one dependent variable.
· Simple Linear Regression establishes this relationship by fitting a "best-fit" straight line to the data points.
· The straight line is defined using the Ordinary Least Squares (OLS) method.
Equation / Formula of Simple Linear Regression

y = mx + c

where:
- y = dependent variable (response)
- x = independent variable (predictor)
- m = slope of the line
- c = y-intercept
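As a small sketch of how this equation is used for prediction (the function and variable names here are illustrative, not from the original):

```python
def predict(x, m, c):
    """Predict y for a given x using the line y = m*x + c."""
    return m * x + c

# Example: a line with slope m = 2 and y-intercept c = 1
print(predict(3, 2, 1))  # 2*3 + 1 = 7
```

Once m and c are estimated from data, this same formula yields a prediction for any new x.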
To find the values of m and c in the linear regression equation, we use the Ordinary Least Squares (OLS) method.
Ordinary Least Squares (OLS)
- Ordinary Least Squares (OLS) method is a statistical technique used to estimate unknown parameters (such as slope and intercept) in a regression model.
- It works by fitting a line (or model) to the given data in such a way that the sum of the squared errors is minimized.
- The error is the difference between the actual value and the predicted value for each data point.
By squaring these errors and adding them together, OLS ensures that both positive and negative errors are treated equally and that larger errors are penalized more.
As a result, the OLS method produces the best-fit line that most accurately represents the relationship between the independent and dependent variables.
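The procedure above can be sketched with the standard closed-form OLS formulas for simple linear regression (the function name `fit_ols` and the sample data are illustrative assumptions):

```python
def fit_ols(xs, ys):
    """Estimate slope m and intercept c by Ordinary Least Squares.

    Closed-form solution that minimizes the sum of squared errors:
        m = sum((x - x_mean) * (y - y_mean)) / sum((x - x_mean)^2)
        c = y_mean - m * x_mean
    """
    n = len(xs)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    # Numerator: covariance of x and y; denominator: variance of x
    num = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
    den = sum((x - x_mean) ** 2 for x in xs)
    m = num / den
    c = y_mean - m * x_mean
    return m, c

# Perfectly linear data y = 2x, so OLS recovers m = 2 and c = 0
m, c = fit_ols([1, 2, 3, 4], [2, 4, 6, 8])
print(m, c)  # 2.0 0.0
```

Because the errors are squared before summing, this fit penalizes large deviations more heavily, which is exactly the behavior described above.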