Linear Regression Using TensorFlow:

We discussed linear regression in the previous article, and now we will discuss TensorFlow in this article.

TensorFlow:

TensorFlow is an open-source computation library developed by Google. It is a popular choice for building applications that need high-end numerical computation, and it can use the Graphics Processing Unit (GPU) for its computations, which is one of the main reasons TensorFlow is among the most popular choices for machine learning and deep learning. It also provides high-level APIs such as Estimator that offer a higher level of abstraction when building machine-learning applications. Here we will build a linear regression model using low-level TensorFlow in lazy (graph) execution mode. In this mode, TensorFlow creates a Directed Acyclic Graph (DAG) that keeps track of all computations, and those computations are executed inside a TensorFlow session.

Implementation:

To start, we have to import the required libraries. We use NumPy together with TensorFlow for computation and Matplotlib for plotting.
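
As a minimal sketch, assuming the TensorFlow 1.x API (in TensorFlow 2.x the same calls are available through tf.compat.v1), the imports might look like this:

# Assumes the TensorFlow 1.x API; in TensorFlow 2.x you could instead use
# `import tensorflow.compat.v1 as tf` followed by `tf.disable_v2_behavior()`.
import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt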

To make the random numbers reproducible, we define fixed seeds for NumPy and TensorFlow.
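
For example (the seed value used here is an arbitrary, illustrative choice):

# Fix the seeds so that the generated random numbers are reproducible.
# The value 101 is only an example; any fixed integer works.
np.random.seed(101)
tf.set_random_seed(101)  # tf.compat.v1.set_random_seed in TensorFlow 2.x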

Now we will generate some random data on which to train the linear regression model.
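
One possible sketch, sampling points along a line and perturbing them with uniform noise (the number of points and the noise range are illustrative assumptions):

# Generate 50 roughly linear (x, y) points and add uniform noise.
x = np.linspace(0, 50, 50)
y = np.linspace(0, 50, 50)
x += np.random.uniform(-4, 4, 50)
y += np.random.uniform(-4, 4, 50)
n = len(x)  # number of training examples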

Now we will visualize the training data.
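
A simple Matplotlib scatter plot is enough for this, for example:

# Show the raw training data as a scatter plot.
plt.scatter(x, y)
plt.xlabel('x')
plt.ylabel('y')
plt.title('Training Data')
plt.show()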

OUTPUT:

(A scatter plot of the randomly generated training data.)

We will now start creating our model by defining placeholders X and Y, so that we can feed our training examples x and y into the optimizer during the training process.
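
With the TensorFlow 1.x API, this could be written as:

# Placeholders for the inputs and targets; actual values are fed in at run time.
X = tf.placeholder("float")
Y = tf.placeholder("float")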

Now, we will declare two trainable TensorFlow variables for the weight and the bias, initializing them randomly using np.random.randn().
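
For instance:

# Trainable weight and bias, each initialized with a random scalar from NumPy.
W = tf.Variable(np.random.randn(), name="W")
b = tf.Variable(np.random.randn(), name="b")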

We will now define the hyperparameters of the model: the learning rate and the number of epochs.
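
The specific values below are illustrative; in practice they would be tuned for the problem at hand:

# Hyperparameters (example values).
learning_rate = 0.01
training_epochs = 1000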

We will now build the hypothesis, the cost function, and the optimizer. We do not implement the gradient descent optimizer manually, as it is built into TensorFlow; after that, we initialize the variables.
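
A sketch using TensorFlow's built-in gradient descent optimizer:

# Hypothesis: y_pred = W * X + b
y_pred = tf.add(tf.multiply(X, W), b)

# Cost: mean squared error over the n training examples (halved by convention).
cost = tf.reduce_sum(tf.pow(y_pred - Y, 2)) / (2 * n)

# Built-in gradient descent optimizer that minimizes the cost.
optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(cost)

# Op that initializes all global variables.
init = tf.global_variables_initializer()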

Now, we will begin the training process inside a TensorFlow session.
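
One way to write the training loop, feeding one example at a time and reporting the cost every 50 epochs (the reporting interval is an arbitrary choice):

with tf.Session() as sess:
    # Run the variable initializer first.
    sess.run(init)

    for epoch in range(training_epochs):
        # Feed each training example through the optimizer.
        for (_x, _y) in zip(x, y):
            sess.run(optimizer, feed_dict={X: _x, Y: _y})

        # Report progress every 50 epochs.
        if (epoch + 1) % 50 == 0:
            c = sess.run(cost, feed_dict={X: x, Y: y})
            print("Epoch", epoch + 1, ": cost =", c,
                  "W =", sess.run(W), "b =", sess.run(b))

    # Save the learned values before the session closes.
    training_cost = sess.run(cost, feed_dict={X: x, Y: y})
    weight = sess.run(W)
    bias = sess.run(b)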

Now we will look at the returned result.
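
For example, using the weight, bias, and training_cost values saved in the session sketch above:

# Compute predictions from the learned parameters and print the results.
predictions = weight * x + bias
print("Training cost =", training_cost, "Weight =", weight, "bias =", bias)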

Here both the weight and the bias are scalars, because there is only one independent variable in our training data. If the training dataset had n independent variables, the weight would be an n-dimensional vector while the bias would still be a scalar.

When we plot the result we get:
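
A sketch of the plot, reusing the predictions computed above:

# Plot the original points together with the fitted regression line.
plt.plot(x, y, 'ro', label='Original data')
plt.plot(x, predictions, label='Fitted line')
plt.title('Linear Regression Result')
plt.legend()
plt.show()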

OUTPUT:

(A plot of the original training points together with the fitted regression line.)
