Monday, 29 February 2016
NEAREST NEIGHBOURS
Hello, welcome to my blog. In my previous posts I have talked extensively about linear regression and how it can be implemented in Python. Now, I want to talk about another popular technique in Machine Learning – Nearest Neighbours.
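As a quick preview of the idea (the function and toy data below are my own illustration, not code from the post): a k-nearest-neighbours classifier labels a new point by a majority vote among the k closest training points.

```python
import numpy as np

def knn_predict(X_train, y_train, x_new, k=3):
    """Label x_new by majority vote among its k nearest training points."""
    # Euclidean distance from x_new to every training point
    distances = np.sqrt(((X_train - x_new) ** 2).sum(axis=1))
    # Indices of the k closest training points
    nearest = np.argsort(distances)[:k]
    # Majority vote among their labels
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy data: two clusters, labelled 0 and 1
X_train = np.array([[1.0, 1.0], [1.2, 0.8], [5.0, 5.0], [5.2, 4.8]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([1.1, 0.9])))  # -> 0
```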
Tuesday, 23 February 2016
POLYNOMIAL REGRESSION
Hello, welcome to my blog. I introduced the concept of linear regression in my previous posts by giving the basic intuition behind it and showing how it can be implemented in Python. In the last post, I gave a precaution to observe when applying linear regression to a problem – make sure the relationship between the dependent and independent variables is LINEAR, i.e. it can be fitted with a straight line.
So, what do we do if a straight line cannot define the
relationship between the two variables we are working with? Polynomial regression helps to solve
this problem.
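For a concrete taste of what that looks like, here is a minimal sketch with made-up data; NumPy's polyfit stands in for whatever the full post uses:

```python
import numpy as np

# Made-up data following a quadratic trend: y = 2 + 3x + 0.5x^2
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = 2 + 3 * x + 0.5 * x ** 2

# Fit a degree-2 polynomial; polyfit returns coefficients highest degree first
coeffs = np.polyfit(x, y, deg=2)
model = np.poly1d(coeffs)

print(model(6.0))  # prediction at x = 6, close to 2 + 18 + 18 = 38
```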
Sunday, 14 February 2016
LINEAR REGRESSION ROUNDUP
Hello, welcome to my blog. In my previous posts, I have been talking about linear regression, which is a technique used to find the relationship between one or more explanatory variables (also called independent variables) and a response variable (also called the dependent variable) using a straight line. Furthermore, I said that when we have more than one explanatory variable it is called multiple linear regression. Finally, I also implemented both types of regression using Python.
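To make that recap concrete, here is a minimal single-variable fit using the closed-form least-squares formulas (the toy numbers are mine, purely for illustration):

```python
import numpy as np

# Toy data lying roughly on the line y = 1 + 2x
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 7.0, 9.1])

# Ordinary least squares for one explanatory variable:
# slope b = cov(x, y) / var(x), intercept a = mean(y) - b * mean(x)
b = np.cov(x, y, bias=True)[0, 1] / np.var(x)
a = y.mean() - b * x.mean()
print(a, b)  # roughly 1 and 2
```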
As a roundup I will just mention some precautions that should
be taken when applying linear regression. Here are some tips to remember:
Monday, 8 February 2016
IMPLEMENTING MULTIPLE LINEAR REGRESSION USING PYTHON
Hello, welcome to my blog. In this post I will introduce the concept of multiple linear regression. First, let me do a brief recap. In the last two posts, I introduced the concept of regression, which is basically a machine learning tool used to find the relationship between an explanatory (also called predictor or independent) variable and a response (or dependent) variable by modelling the relationship with the equation of a line, i.e.
y = a + bx
where a is the intercept, b is the slope, x is the explanatory variable and y is our prediction.
Up until now we have used only one explanatory variable to predict the response variable. On its own this is often not very accurate: if (for example) you are trying to predict the price of a house, its square footage is not the only feature that determines the price. Other attributes like the number of bedrooms and bathrooms, the location and many other features will contribute to the final price of the house.
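With several features, the line simply grows extra slopes, y = a + b1x1 + b2x2 + ... + bnxn, one per feature. Here is a minimal sketch of fitting such a model with NumPy's least-squares solver (the house data is invented purely for illustration):

```python
import numpy as np

# Invented training data: [square footage, bedrooms, bathrooms] -> price
X = np.array([[1400, 3, 2],
              [1600, 3, 2],
              [1700, 4, 3],
              [1875, 4, 3],
              [2350, 5, 4]], dtype=float)
y = np.array([245000, 312000, 279000, 308000, 405000], dtype=float)

# Prepend a column of ones so the intercept a is learned alongside the slopes
X1 = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)

a, b = coef[0], coef[1:]
prediction = a + b @ np.array([2000, 4, 3])  # price for a 2000 sq ft, 4 bed, 3 bath house
print(prediction)
```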