Neural networks are the core component of deep learning and have many practical uses. In Python for data science, they power tasks such as image classification, object detection, and speech recognition.

A single neuron transforms a given input into an output. The inputs, together with the weight assigned to each input, determine whether the neuron fires. Suppose the neuron has 3 input connections and one output.
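A minimal sketch of such a neuron (the function name `neuron_forward` and the sample numbers are illustrative, not from the original article) is a weighted sum of the inputs plus a bias, passed through an activation function:

```python
import math

def neuron_forward(inputs, weights, bias):
    # Weighted sum of the 3 inputs plus a bias term,
    # squashed into (-1, 1) by the tanh activation.
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return math.tanh(z)

# Example: a neuron with 3 input connections and one output
output = neuron_forward([0.5, -0.2, 0.1], [0.4, 0.7, -0.3], 0.1)
```

Because tanh squashes the weighted sum, the output always lies strictly between -1 and 1.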

We will be using the **tanh** activation function in the example below.

The final goal is to find the optimal set of weights so that the neuron produces the correct results. We can do this by training the neuron on different training examples: at each step we calculate the error in the neuron's output and backpropagate the gradient. Calculating the neuron's output is known as **forward propagation**, and calculating the gradient is known as **back propagation**.

Let us look at the implementation.
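The original code listing is not reproduced here, so what follows is a sketch of how such a neuron could be trained, assuming plain Python with fixed starting weights and a made-up toy dataset (the function `train`, the data `X`/`y`, and the hyperparameters are all illustrative):

```python
import math

def train(inputs_list, targets, epochs=10000, lr=0.1):
    # Fixed starting weights and bias so the run is reproducible
    weights = [0.1, 0.1, 0.1]
    bias = 0.0
    for _ in range(epochs):
        for x, t in zip(inputs_list, targets):
            # Forward propagation: weighted sum + tanh activation
            z = sum(w * xi for w, xi in zip(weights, x)) + bias
            out = math.tanh(z)
            # Back propagation: gradient of the error through tanh,
            # using d/dz tanh(z) = 1 - tanh(z)**2
            error = t - out
            grad = error * (1 - out * out)
            weights = [w + lr * grad * xi for w, xi in zip(weights, x)]
            bias += lr * grad
    return weights, bias

# Toy training set: the target simply follows the first input,
# so the neuron should learn a large first weight
X = [[0, 0, 1], [1, 1, 1], [1, 0, 1], [0, 1, 1]]
y = [0, 1, 1, 0]
weights, bias = train(X, y)
```

After training, running forward propagation on each example should give outputs close to the targets, with the first weight dominating the other two.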


To learn more about neural networks in Python for data science, you can explore further tutorials on the topic.