Regression is the backbone of predictive modeling in machine learning. In this blog, we will dive deep into regression techniques: simple linear, multi-linear, and logistic regression, and learn how to create our own predictive models from scratch.
What you will learn:

- What regression means and why it matters
- Simple linear regression, derived step by step
- A preview of multi-linear and logistic regression
Let's start with Linear Regression!
Before we dive into calculations, let's understand what regression really means. In simple terms, regression helps machines learn from given data so that they can predict future values based on past trends.
Linear regression is a technique that predicts continuous values using a straight-line equation.

Mathematical equation of a straight line:

y = mx + c

Where:

- y is the dependent variable (the value we want to predict)
- x is the independent variable (the input)
- m is the slope of the line
- c is the y-intercept

Let's break it down with an example.
| Years of Experience (X) | Salary (Y) (in $1000) |
|---|---|
| 1 | 35 |
| 2 | 40 |
| 3 | 45 |
| 4 | 50 |
| 5 | 55 |
Our goal: Find a pattern so that given a new input (years of experience), we can predict the salary.
We assume the relationship follows:

y = mx + c

where y represents salary, x represents years of experience, m is the slope, and c is the intercept.
To generalize for multiple data points, we sum up the equations for all n data points.

Summing the original equation over all data points:

Σy = mΣx + nc

Multiplying each equation by x and summing:

Σxy = mΣx² + cΣx
Now, these two equations contain summations that can be computed for any dataset.
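Using the salary table above, these summations can be computed in a few lines of plain Python (a minimal sketch; the variable names are my own):

```python
# Salary dataset from the table above:
# years of experience (x) vs. salary in $1000 (y)
x = [1, 2, 3, 4, 5]
y = [35, 40, 45, 50, 55]

n = len(x)                                      # number of data points
sum_x = sum(x)                                  # Σx
sum_y = sum(y)                                  # Σy
sum_xy = sum(xi * yi for xi, yi in zip(x, y))   # Σxy
sum_x2 = sum(xi ** 2 for xi in x)               # Σx²

print(n, sum_x, sum_y, sum_xy, sum_x2)  # → 5 15 225 725 55
```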
Rearranging the equations to solve for the two unknowns:

m = (nΣxy − ΣxΣy) / (nΣx² − (Σx)²)

c = (Σy − mΣx) / n

With these formulas, we can compute m and c for any dataset, whether small or large!
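Putting the summations and the rearranged formulas together gives a complete from-scratch fit in plain Python (a sketch with no external libraries; the function name `fit_line` is my own):

```python
def fit_line(x, y):
    """Fit y = m*x + c by solving the two summation equations."""
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))
    sum_x2 = sum(xi ** 2 for xi in x)
    # m = (n·Σxy − Σx·Σy) / (n·Σx² − (Σx)²)
    m = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
    # c = (Σy − m·Σx) / n
    c = (sum_y - m * sum_x) / n
    return m, c

m, c = fit_line([1, 2, 3, 4, 5], [35, 40, 45, 50, 55])
print(m, c)  # the salary data is perfectly linear, so m = 5.0 and c = 30.0
```

Because the table's salaries rise by exactly $5k per year of experience, the fitted slope and intercept come out exact.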
Once we have m and c, we can use the equation y = mx + c to predict the salary for new experience values.
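Prediction is then a single multiply-and-add. A tiny sketch (the `predict` helper is my own name, and m = 5, c = 30 are the values fitted from the salary table, which is perfectly linear):

```python
def predict(x, m, c):
    """Predict y for a new input x using the fitted line y = m*x + c."""
    return m * x + c

# m = 5, c = 30 from the salary dataset above
print(predict(6, 5, 30))  # → 60, i.e. a predicted salary of $60k at 6 years
```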
Now that we've built an understanding of simple linear regression, we will explore multi-linear regression and logistic regression in the next sections. Stay tuned!