How does it work?
In this part, we'll write our own backpropagation algorithm to train 2 different neural networks on 2 different datasets.
The video I followed while making this part is here: https://youtu.be/ma6hWrU-LaI?si=cgahIMbJrmiUl505
We'll consider the following dataset and train a model on it by writing our own algorithm.
CGPA | Resume score | Package |
---|---|---|
8 | 8 | 4 |
7 | 9 | 5 |
6 | 10 | 6 |
5 | 12 | 7 |
Below is the simple neural network we'll be training.
for i in range(epochs):
    for j in range(X.shape[0]):
        # select a random row
        # predict using forward propagation
        # calculate the loss (the loss function is MSE)
        # update weights and biases using gradient descent
    # calculate the average loss for the epoch
Now, in the last part, we derived the formulas for updating each weight and bias.
For the neural network above, here are all the formulas:
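Here's a sketch of what those formulas express, assuming the network from the video: two inputs $x_1$ (CGPA) and $x_2$ (resume score), a hidden layer of two linear neurons with outputs $o_1, o_2$, output-layer weights $v_1, v_2$, and learning rate $\eta$. The notation here is mine, not necessarily the original's. The forward pass is

$$
o_1 = w_{11}x_1 + w_{12}x_2 + b_1, \qquad
o_2 = w_{21}x_1 + w_{22}x_2 + b_2, \qquad
\hat{y} = v_1 o_1 + v_2 o_2 + b_3
$$

and with $L = (y - \hat{y})^2$, the chain rule gives

$$
\frac{\partial L}{\partial v_1} = -2(y - \hat{y})\,o_1, \qquad
\frac{\partial L}{\partial b_3} = -2(y - \hat{y}), \qquad
\frac{\partial L}{\partial w_{11}} = -2(y - \hat{y})\,v_1 x_1, \qquad
\frac{\partial L}{\partial b_1} = -2(y - \hat{y})\,v_1
$$

(and analogously with $v_2$ for the second hidden neuron). Every parameter is then updated as $\theta \leftarrow \theta - \eta\,\partial L / \partial \theta$.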
Ufff... there were quite a few formulas there, but it's important to understand how they were formed. If you believe you already know how they were derived, feel free to skip ahead. You don't need to memorize them!
Now we'll implement this in a Jupyter notebook. I have done this in Google Colab, and here's the link:
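If you just want to see the shape of the code, here's a minimal sketch of that training loop, assuming the 2-2-1 network with linear activations and MSE loss described above. The variable names and hyperparameters are my own choices, not necessarily the notebook's:

```python
import numpy as np

# The dataset from the table above: [CGPA, resume score] -> package
X = np.array([[8, 8], [7, 9], [6, 10], [5, 12]], dtype=float)
y = np.array([4, 5, 6, 7], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(2, 2))  # input -> hidden weights
b1 = np.zeros(2)                         # hidden biases
W2 = rng.normal(scale=0.1, size=2)       # hidden -> output weights
b2 = 0.0                                 # output bias

lr, epochs = 0.001, 100
for epoch in range(epochs):
    losses = []
    for j in range(X.shape[0]):
        x, t = X[j], y[j]
        # forward propagation (linear activations throughout)
        o = W1 @ x + b1                  # hidden outputs o1, o2
        y_hat = W2 @ o + b2              # predicted package
        losses.append((t - y_hat) ** 2)  # MSE loss for this row
        # backpropagation: every gradient shares the -2(t - y_hat) factor
        grad = -2.0 * (t - y_hat)
        dW2, db2 = grad * o, grad
        dW1, db1 = grad * np.outer(W2, x), grad * W2
        # gradient descent updates
        W2 -= lr * dW2; b2 -= lr * db2
        W1 -= lr * dW1; b1 -= lr * db1
    if (epoch + 1) % 20 == 0:
        print(f"epoch {epoch + 1}: avg loss = {np.mean(losses):.4f}")
```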
As you can see, we have implemented the training algorithm for a neural network in plain Python. Let's move on to the classification problem.
For the classification problem, we're going to consider a similar dataset:

CGPA | Resume Score | Is placed? |
---|---|---|
But this time, there are two differences:

- The activation function we're going to use is the Sigmoid function.
- Rather than using the MSE loss function, this time we're going to use Binary Cross Entropy (BCE); both are shown right below.
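For reference, these are the standard definitions of both, for a label $y \in \{0, 1\}$ and a prediction $\hat{y} = \sigma(z)$:

$$
\sigma(z) = \frac{1}{1 + e^{-z}}, \qquad
L = -\big[\, y \log \hat{y} + (1 - y) \log(1 - \hat{y}) \,\big]
$$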
The backpropagation algorithm for the classification problem is the same as the one for the regression problem.
Now, for the regression problem, we already derived its formulas in the What is back propagation? part, but we didn't do that for the classification problem.
However, you can follow a similar approach to derive all the formulas for the classification problem using the Binary Cross Entropy loss function.
If you want to see how these are derived, I strongly suggest checking out the video.
For now, I'll simply note them down:
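Here's a sketch of the key result, assuming the same network layout as before but with sigmoid activations (the notation, again, is mine). With a sigmoid output and BCE loss, the error term at the output collapses neatly, so for an output neuron with incoming activations $a_i$:

$$
\frac{\partial L}{\partial z} = \hat{y} - y
\quad \Longrightarrow \quad
\frac{\partial L}{\partial w_i} = (\hat{y} - y)\, a_i, \qquad
\frac{\partial L}{\partial b} = \hat{y} - y
$$

Hidden-layer gradients follow by the same chain rule, with each sigmoid contributing an extra $\sigma'(z) = \sigma(z)\big(1 - \sigma(z)\big)$ factor.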
I have done the implementation in another Google Colab. Here's the link:
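And again, in case the notebook isn't to hand, here's a minimal sketch of the classification version, assuming the same 2-2-1 layout but with sigmoid activations and BCE loss. The 0/1 labels below are made up purely for illustration; they are not the original dataset:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# [CGPA, resume score] -> placed? (these 0/1 labels are hypothetical)
X = np.array([[8, 8], [7, 9], [6, 10], [5, 12]], dtype=float)
y = np.array([1, 1, 0, 0], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.1, size=(2, 2)); b1 = np.zeros(2)
W2 = rng.normal(scale=0.1, size=2);      b2 = 0.0

lr, epochs = 0.1, 500
for epoch in range(epochs):
    losses = []
    for j in range(X.shape[0]):
        x, t = X[j], y[j]
        # forward propagation: sigmoid at the hidden and output layers
        a1 = sigmoid(W1 @ x + b1)
        y_hat = sigmoid(W2 @ a1 + b2)
        losses.append(-(t * np.log(y_hat) + (1 - t) * np.log(1 - y_hat)))
        # with sigmoid + BCE, the output error term is simply (y_hat - t)
        dz2 = y_hat - t
        dW2, db2 = dz2 * a1, dz2
        # hidden layer picks up the extra sigmoid' = a1 * (1 - a1) factor
        dz1 = dz2 * W2 * a1 * (1 - a1)
        dW1, db1 = np.outer(dz1, x), dz1
        W2 -= lr * dW2; b2 -= lr * db2
        W1 -= lr * dW1; b1 -= lr * db1
    if (epoch + 1) % 100 == 0:
        print(f"epoch {epoch + 1}: avg BCE = {np.mean(losses):.4f}")
```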
Even though we wrote everything ourselves, we got results similar to Keras. Therefore, we can conclude that we have successfully implemented backpropagation for a classification problem with Binary Cross Entropy as our loss function.
That's it for the backpropagation algorithm. We have seen backpropagation in detail. If you want to develop a better intuitive understanding of backpropagation, I recommend watching CampusX's "The Why of Back propagation" video.
Otherwise, if you feel confident in this topic, feel free to move on to the next one. Thanks a lot for reading! I'd really appreciate any feedback.
Byee folks