Polynomial regression is a form of linear regression in which the relationship between the independent variable x and the dependent variable y is modeled as an nth-degree polynomial. It is a common tool in Python for data science, fitting a non-linear relationship between the value of x and the conditional mean of y, denoted E(y|x).

**Why do we use polynomial regression?**

- Some relationships that a researcher hypothesizes are curvilinear; in such cases, we include a polynomial term in the model.
- It also helps with the inspection of residuals. If we fit a linear model to curved data, a scatter plot of residuals (y-axis) against the predictor (x-axis) will show patches of positive residuals in the middle, indicating that a linear model is not appropriate in such situations.
- In multiple linear regression analysis, we assume that all independent variables are independent of one another. A polynomial regression model does not satisfy this assumption, since its higher-order terms are functions of the same variable.

**Uses of Polynomial regression:**

Polynomial regression is used to define or describe non-linear phenomena such as:

- Growth rate of tissues
- Progression of disease epidemics
- Distribution of carbon isotopes in lake sediments

The primary goal of regression analysis is to model the expected value of the dependent variable y in terms of the value of an independent variable x. In simple linear regression we use the equation:

y = a + bx + e

Here, y is the dependent variable, a is the y-intercept, b is the slope, and e is the error term.

This linear model does not work well in many cases. In such cases, we use a quadratic model instead:

y = a + b₁x + b₂x² + e

In general, the model can be extended to an nth-degree polynomial:

y = a + b₁x + b₂x² + ... + bₙxⁿ + e

Since the regression function is linear in the unknown parameters, these models are linear from the point of view of estimation.

So, we use the least-squares technique to estimate the coefficients and compute the fitted response values y.
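As a quick sketch of the least-squares idea (the sample points below are made up for illustration), NumPy can fit a polynomial of any degree directly:

```python
import numpy as np

# Hypothetical sample data: y roughly follows a quadratic in x.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.9, 10.2, 17.1, 26.0])

# np.polyfit solves the least-squares problem for an nth-degree polynomial;
# deg=2 returns the coefficients [b2, b1, a], highest power first.
coeffs = np.polyfit(x, y, deg=2)

# Evaluate the fitted polynomial at the observed x values.
fitted = np.polyval(coeffs, x)
```

The same least-squares machinery underlies the scikit-learn workflow in the steps below.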

**Polynomial regression in Python:**

**Step 1:** Importing the libraries and the dataset
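The original code is not shown here, so the following is a minimal sketch. The Position/Level/Salary columns are assumptions, and the data is synthesized inline so the snippet runs on its own:

```python
import pandas as pd

# In the original, the dataset would be loaded from disk, e.g.:
#   datas = pd.read_csv("data.csv")   # the file name is an assumption
# We build an equivalent frame inline so the example is self-contained.
datas = pd.DataFrame({
    "Position": ["Analyst", "Consultant", "Manager", "Director", "Partner", "CEO"],
    "Level": [1, 2, 3, 4, 5, 6],
    "Salary": [45000, 50000, 60000, 80000, 110000, 150000],
})
print(datas)
```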


**Step 2:** Dividing the dataset into two components

We will divide the dataset into two components, X and y. X will contain the column between index 1 and 2 (the feature), and y will contain the column at index 2 (the target).
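A sketch of this step, reusing the hypothetical dataset from Step 1 (the column names are assumptions):

```python
import pandas as pd

# Hypothetical stand-in for the tutorial's dataset.
datas = pd.DataFrame({
    "Position": ["Analyst", "Consultant", "Manager", "Director", "Partner", "CEO"],
    "Level": [1, 2, 3, 4, 5, 6],
    "Salary": [45000, 50000, 60000, 80000, 110000, 150000],
})

# X: columns from index 1 up to (but not including) 2 -> the "Level" feature,
# sliced this way so it stays 2-D, as scikit-learn estimators require.
X = datas.iloc[:, 1:2].values

# y: the column at index 2 -> the "Salary" target, a 1-D array.
y = datas.iloc[:, 2].values
```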

**Step 3:** Fitting linear regression to the dataset
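A minimal sketch of this step, assuming scikit-learn and the hypothetical X and y arrays from Step 2:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical feature/target arrays standing in for X and y from Step 2.
X = np.array([[1], [2], [3], [4], [5], [6]])
y = np.array([45000, 50000, 60000, 80000, 110000, 150000])

# Ordinary least-squares fit of the straight line y = a + b*x.
lin = LinearRegression()
lin.fit(X, y)
```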

**Step 4:** Fitting of polynomial regression to the dataset

Here we fit the polynomial regression model to the two components X and y.
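A sketch of this step with scikit-learn's `PolynomialFeatures`; the degree of 4 and the data values are assumptions:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical feature/target arrays standing in for X and y from Step 2.
X = np.array([[1], [2], [3], [4], [5], [6]])
y = np.array([45000, 50000, 60000, 80000, 110000, 150000])

# Expand each x into the columns [1, x, x^2, x^3, x^4].
poly = PolynomialFeatures(degree=4)
X_poly = poly.fit_transform(X)

# A linear regression on the expanded features is a polynomial regression in x.
lin2 = LinearRegression()
lin2.fit(X_poly, y)
```

Note the point made earlier: the model is still linear in its coefficients, so ordinary least squares applies unchanged.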

**Step 5:** Visualizing the linear regression results using a scatter plot.
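A sketch of the visualization, assuming matplotlib and the hypothetical data from the earlier steps (the Agg backend is used here so the script runs headless; in a notebook you would call `plt.show()` instead):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for headless execution
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression

# Hypothetical data standing in for X and y from Step 2.
X = np.array([[1], [2], [3], [4], [5], [6]])
y = np.array([45000, 50000, 60000, 80000, 110000, 150000])
lin = LinearRegression().fit(X, y)

plt.scatter(X, y, color="blue")            # observed points
plt.plot(X, lin.predict(X), color="red")   # fitted straight line
plt.title("Linear Regression")
plt.xlabel("Level")
plt.ylabel("Salary")
plt.savefig("linear_regression.png")       # plt.show() in an interactive session
```

On curved data like this, the straight line visibly under- and over-shoots, which motivates the polynomial fit in the next step.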


**Step 6:** Visualizing the polynomial regression results using a scatter plot.
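A sketch of this step under the same assumptions as Step 5; a dense grid of x values is used so the fitted polynomial plots as a smooth curve:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for headless execution
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical data standing in for X and y from Step 2.
X = np.array([[1], [2], [3], [4], [5], [6]])
y = np.array([45000, 50000, 60000, 80000, 110000, 150000])

poly = PolynomialFeatures(degree=4)        # degree is an assumption
lin2 = LinearRegression().fit(poly.fit_transform(X), y)

# Dense grid so the curve looks smooth rather than piecewise.
X_grid = np.arange(X.min(), X.max() + 0.1, 0.1).reshape(-1, 1)

plt.scatter(X, y, color="blue")
plt.plot(X_grid, lin2.predict(poly.transform(X_grid)), color="red")
plt.title("Polynomial Regression")
plt.xlabel("Level")
plt.ylabel("Salary")
plt.savefig("polynomial_regression.png")   # plt.show() in an interactive session
```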


**Step 7:** Predicting new results with both linear and polynomial regression.
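A sketch of this step under the same assumptions; note that scikit-learn's `predict` expects a 2-D array, and the new input value 5.5 is hypothetical:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Hypothetical data standing in for X and y from Step 2.
X = np.array([[1], [2], [3], [4], [5], [6]])
y = np.array([45000, 50000, 60000, 80000, 110000, 150000])

lin = LinearRegression().fit(X, y)
poly = PolynomialFeatures(degree=4)
lin2 = LinearRegression().fit(poly.fit_transform(X), y)

new_x = np.array([[5.5]])  # hypothetical new observation, kept 2-D
pred_linear = lin.predict(new_x)
pred_poly = lin2.predict(poly.transform(new_x))
print(pred_linear, pred_poly)
```

Comparing the two predictions on curved data typically shows the polynomial model tracking the trend more closely than the straight line.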

**Advantages of using Polynomial regression:**

- It can fit a broad range of functions.
- Polynomials can fit a wide range of curvature.
- Polynomials can provide a good approximation of the relationship between the dependent and independent variables.

**Disadvantages of using Polynomial regression:**

- Polynomial regression is very sensitive to outliers.
- The presence of one or two outliers in the data can seriously affect the results of a non-linear analysis.
- There are fewer model-validation tools for detecting outliers in non-linear regression than there are for linear regression.
