NPTEL Journey For Deep Learning Course Certification

Hey Beginner!

I recently completed the NPTEL course “Deep Learning” by IIT Ropar and I am thrilled to share my experience with you all. This 12-week journey has been incredibly rewarding and has deepened my understanding of one of the most transformative technologies in the modern world. I am proud to say that I cleared the final exam and earned an overall score of 65, securing an Elite Certificate from NPTEL.

Preparation Phase:

I enrolled in the course in mid-January 2024, shortly after classes began that month. To get a head start, I previewed the course content already available on YouTube, which comprised around 118 videos.

Starting in January 2024, NPTEL unlocked each week’s lectures gradually. I maintained a notebook from the start, diligently taking notes for future reference. Understanding the importance of the weekly assignments, I completed them thoroughly, knowing that the best 8 assignment scores would significantly impact my final score. Additionally, I used online resources like w3wiki to supplement my learning and clarify complex concepts, and I attended NPTEL’s online live doubt-clearing sessions every Saturday.

Weekly Course Layout & Learning Journey:

The course was meticulously structured into weekly modules, each focusing on a different aspect of Deep Learning. Here’s a brief overview of what I learned each week:

  • Week 1: I began with a fascinating overview of the history of deep learning, exploring its success stories and fundamental concepts like the McCulloch-Pitts Neuron and the Perceptron Learning Algorithm. This introduction set a solid foundation for the weeks to come. I practised coding simple neural networks to solidify these concepts.
  • Week 2: This week, I delved into Multilayer Perceptrons (MLPs), understanding their representation power and learning about sigmoid neurons and gradient descent. The practical exercises on feedforward neural networks were particularly enlightening. I implemented MLPs in Python and experimented with different activation functions.
  • Week 3: This week’s focus was on Feedforward Neural Networks and Backpropagation. I spent a lot of time practising backpropagation algorithms, which are crucial for training deep networks effectively. I wrote custom backpropagation code to gain a deeper understanding of the learning process.
  • Week 4: I explored various optimization techniques, including Gradient Descent (GD), Momentum GD, Nesterov Accelerated GD, Stochastic GD, AdaGrad, RMSProp, and Adam. Additionally, learning about eigenvalues, eigenvectors, and eigenvalue decomposition was quite enriching. I compared the performance of different optimization algorithms on a set of neural network models.
  • Week 5: This week, I covered Principal Component Analysis (PCA) and Singular Value Decomposition (SVD). The interpretation of PCA and its applications in reducing dimensionality were particularly useful. I applied PCA to a dataset to visualize the reduction in dimensions and its impact on model performance.
  • Week 6: I learned about different types of autoencoders and their relation to PCA. The hands-on sessions with regularization techniques in autoencoders, such as denoising and sparse autoencoders, were very practical. I built and trained autoencoders on image datasets to observe their reconstruction capabilities.
  • Week 7: Regularization techniques were the focus this week. I learned about the Bias-Variance Tradeoff, L2 regularization, early stopping, dataset augmentation, and dropout. Implementing these techniques helped me understand how to improve model generalization. I applied dropout and data augmentation to my existing models to see the improvements in generalization.
  • Week 8: This week I covered advanced topics such as Greedy Layerwise Pre-training, better activation functions, improved weight initialization methods, and batch normalization. These topics are essential for building deeper and more efficient neural networks. I experimented with different activation functions and initialization methods to enhance model performance.
  • Week 9: Learning vectorial representations of words was the highlight for me this week, and it was pivotal in understanding how deep learning models handle natural language processing tasks. I implemented word embeddings and used them in simple text classification tasks to understand their impact.
  • Week 10: I dove into Convolutional Neural Networks (CNNs), studying architectures like LeNet, AlexNet, ZF-Net, VGGNet, GoogLeNet, and ResNet. The practical insights into visualizing CNNs through guided backpropagation, Deep Dream, and Deep Art were particularly fascinating. I built and trained my own CNN models on image classification tasks, and I combined a CNN with YOLOv8 to build a vehicle license plate recognition system that achieved 80% accuracy.
  • Week 11: This week, I focused on Recurrent Neural Networks (RNNs) and backpropagation through time (BPTT). Understanding GRUs and LSTMs helped me grasp how deep-learning models process sequential data. I created RNN and LSTM models to analyze time-series data and text sequences.
  • Week 12: The final week covered Encoder-Decoder Models and the Attention Mechanism, including its application over images. These concepts are fundamental for advanced tasks in machine translation and image captioning. I implemented a simple attention mechanism in a sequence-to-sequence model to see how it improves translation accuracy.
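To make the early-week material concrete, here is a minimal sketch of the Perceptron Learning Algorithm from Week 1, trained on the linearly separable AND function. The toy data and the update loop are my own illustration, not course code:

```python
import numpy as np

# Perceptron Learning Algorithm on the AND gate (linearly separable).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])             # AND targets

w = np.zeros(2)                        # weights
b = 0.0                                # bias

for epoch in range(10):                # a few passes suffice here
    errors = 0
    for xi, target in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        if pred != target:             # update only on mistakes
            w += (target - pred) * xi
            b += (target - pred)
            errors += 1
    if errors == 0:                    # converged: every point classified
        break

preds = [(1 if xi @ w + b > 0 else 0) for xi in X]
print(w, b, preds)                     # preds reproduces AND: [0, 0, 0, 1]
```

Because the data is linearly separable, the mistake-driven update rule is guaranteed to converge to a separating hyperplane in a finite number of passes.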
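Similarly, the Week 5 idea of dimensionality reduction with PCA via SVD can be sketched in a few lines; the toy 2-D dataset below is an assumption for illustration:

```python
import numpy as np

# PCA via SVD: project 2-D points lying near a line onto their
# first principal component.
rng = np.random.default_rng(42)
t = rng.normal(size=(200, 1))
X = np.hstack([t, 2 * t]) + 0.05 * rng.normal(size=(200, 2))

Xc = X - X.mean(axis=0)                 # centre the data first
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pc1 = Vt[0]                             # first principal direction
Z = Xc @ pc1[:, None]                   # 1-D projection (reduced data)

explained = S[0] ** 2 / np.sum(S ** 2)  # variance captured by PC1
print(f"PC1 explains {explained:.1%} of the variance")
```

Since the points cluster tightly around a line, the first component captures almost all of the variance, which is exactly why PCA is useful for compressing such data.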

The Exam Day & Exam Experience:

28th April 2024 was the D-day marked on my calendar with a mix of anticipation and anxiety. The admit card was in hand, and the examination venue was selected with care. The three-hour exam consisted of 50 questions, blending assignment-based problems and conceptual queries, with each question carrying 2 marks. The strict invigilation at the centre ensured a fair testing environment. Despite facing some challenging questions, I approached each with determination, drawing on the solid foundation built through weekly assignments and intensive preparation. The best part was that there was no negative marking, so you could make educated guesses on questions you were unsure about. Completing the exam was a moment of triumph, knowing that regardless of the results, I had gained invaluable knowledge and skills.

My 12-Week Course Experience:

Throughout this course, I was continually challenged and engaged. The weekly assignments were rigorous, ensuring that I applied the theoretical concepts practically. Participating in the discussion forums and attending NPTEL Live Doubt sessions was immensely beneficial, as it allowed me to clarify doubts and learn from peers. Lastly, the final certification exam tested my comprehensive knowledge of the entire course.

One of the most exciting aspects of this course was the hands-on experience. I built various deep-learning models, experimented with different architectures, and used optimization algorithms to enhance model performance. This practical exposure was invaluable and significantly boosted my confidence in implementing deep-learning solutions.

Criteria to get an NPTEL Course Completion Certificate:

  • Average assignment score = 25% of the average of the best 8 out of the 12 assignments given in the course

  • Exam score = 75% of the proctored certification exam score out of 100
  • Final score = Average assignment score + Exam score

YOU WILL BE ELIGIBLE FOR A CERTIFICATE ONLY IF THE AVERAGE ASSIGNMENT SCORE >= 10/25 AND THE EXAM SCORE >= 30/75. If either of the two criteria is not met, you will not get the certificate, even if the final score is >= 40/100.
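As a quick sanity check of this arithmetic, here is a small helper I sketched myself (the sample scores passed in are illustrative, not anyone's actual marks):

```python
# Check NPTEL certification eligibility from raw percentages.
def nptel_result(best8_avg_pct, exam_pct):
    """best8_avg_pct: average of the best 8 assignments, out of 100;
       exam_pct: proctored exam score, out of 100."""
    assignment_score = 0.25 * best8_avg_pct   # scaled down to 25
    exam_score = 0.75 * exam_pct              # scaled down to 75
    final = assignment_score + exam_score
    eligible = (assignment_score >= 10 and exam_score >= 30
                and final >= 40)
    return final, eligible

print(nptel_result(90, 60))   # -> (67.5, True)
print(nptel_result(90, 20))   # exam score 15 < 30/75 -> (37.5, False)
```

This makes the two-threshold rule explicit: a strong assignment average cannot compensate for a weak exam score, and vice versa.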

Conclusion:

Reflecting on my journey through the NPTEL “Deep Learning” course by IIT Ropar, I am incredibly grateful for the experience. It has equipped me with the knowledge and skills to tackle complex problems in Computer Vision and Natural Language Processing. The Elite Certificate I earned is a testament to the hard work and dedication I put into this course.

I highly recommend this course to anyone interested in deep learning. The structured approach, comprehensive content, and practical assignments make it an excellent learning experience. Enroll in this course if you can; I am sure you won’t regret it!

Happy Learning!