- 📚 Godoy, Daniel. Deep Learning with PyTorch Step-by-Step. [Link]
- 📚 Tam, Adrian. Deep Learning with PyTorch. [Link]
- 📚 Huyen, Chip. Designing Machine Learning Systems. [Link]
- Detailed breakdown of the course structure and content, covering the main topics and applications of Machine Learning.
- Motivation, syllabus, and other course logistics.
- 🎉 GitHub Education Benefits
- GitHub Education Pro: Get access to the GitHub Education Pro pack by visiting GitHub Education
- 📖 Learning Resources
- GitHub Learning Game: Check out the interactive Git learning game at GitHub Learning Game
- Basic Python: Enhance your Python skills through the Kaggle Python course.
- AI Python for Beginners: Learn Python programming fundamentals and how to integrate AI tools for data manipulation, analysis, and visualization. Taught by Andrew Ng.
- Reading: Machines of Loving Grace
Week 02: Fundamentals and First Steps
- Nature of Human Intelligence versus Artificial Intelligence
- Types of Machine Learning
- 🎥 ML Fundamentals and Decision Trees
- Study the material for weeks 2, 3, and 4.
Week 03: Visualizing Gradient Descent
- This week's lesson focuses on understanding and visualizing the five core steps of the Gradient Descent algorithm: (1) initializing parameters randomly, (2) performing the forward pass to compute predictions, (3) computing the loss, (4) computing gradients with respect to each parameter, and (5) updating the parameters using the gradients and a predefined learning rate. We implement these steps both manually and with PyTorch's autograd and optimizer tools, analyzing how different configurations affect convergence.
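A minimal sketch of these five steps on a synthetic linear regression problem, using PyTorch's autograd and the SGD optimizer; the data, initial values, number of epochs, and learning rate below are illustrative assumptions, not the exact configuration used in the lesson.

```python
import torch

# Synthetic data for y = 2x + 1 plus noise (illustrative)
torch.manual_seed(42)
x = torch.rand(100, 1)
y = 2 * x + 1 + 0.1 * torch.randn(100, 1)

# Step 1: initialize parameters randomly
b = torch.randn(1, requires_grad=True)
w = torch.randn(1, requires_grad=True)

lr = 0.1  # predefined learning rate
optimizer = torch.optim.SGD([b, w], lr=lr)
loss_fn = torch.nn.MSELoss()

for epoch in range(1000):
    # Step 2: forward pass - compute predictions
    yhat = b + w * x
    # Step 3: compute the loss
    loss = loss_fn(yhat, y)
    # Step 4: compute gradients with respect to b and w
    loss.backward()
    # Step 5: update the parameters, then clear the gradients
    optimizer.step()
    optimizer.zero_grad()

print(b.item(), w.item())  # should approach roughly 1 and 2
```

Writing the manual version (computing the gradients by hand) alongside this autograd version is a useful check that both produce the same parameter updates.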
Week 04: Rethinking the Training Loop (Part I)
- Rethinking the Training Loop
- Implement the training function: Create a dedicated function to execute training steps, implement a custom dataset class, and use data loaders to generate mini-batches.
- Perform mini-batch gradient descent: Develop a routine for mini-batch gradient descent and evaluate your model’s performance.
- Ensure model persistence: Save and checkpoint your model to disk, enabling you to resume training later or deploy the model (see the sketch after this list).
- Going Classy
- Create a training class: Define a robust class to handle model training.
- Implement the constructor: Properly set up the class constructor.
- Handle method accessibility: Understand and apply the differences between public, protected, and private methods.
- Integrate the code: Organize and merge the previously developed code into the class structure.
- Execute the pipeline: Instantiate the class and use it to run an efficient training pipeline.
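A minimal sketch of those pieces, assuming a simple linear regression model; the dataset, the helper name make_train_step, and the checkpoint filename are illustrative, not the course's exact code.

```python
import torch
from torch.utils.data import Dataset, DataLoader

class CustomDataset(Dataset):
    """A minimal custom dataset wrapping feature and label tensors."""
    def __init__(self, x, y):
        self.x, self.y = x, y
    def __len__(self):
        return len(self.x)
    def __getitem__(self, idx):
        return self.x[idx], self.y[idx]

def make_train_step(model, loss_fn, optimizer):
    """Builds a function that performs one training step on a mini-batch."""
    def train_step(x, y):
        model.train()
        yhat = model(x)
        loss = loss_fn(yhat, y)
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()
        return loss.item()
    return train_step

# Illustrative setup: a linear model on synthetic data
x = torch.rand(100, 1)
y = 2 * x + 1 + 0.1 * torch.randn(100, 1)
loader = DataLoader(CustomDataset(x, y), batch_size=16, shuffle=True)

model = torch.nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
train_step = make_train_step(model, torch.nn.MSELoss(), optimizer)

for epoch in range(50):
    for x_batch, y_batch in loader:   # mini-batch gradient descent
        loss = train_step(x_batch, y_batch)

# Model persistence: save a checkpoint so training can resume later
torch.save({"model_state_dict": model.state_dict(),
            "optimizer_state_dict": optimizer.state_dict(),
            "epoch": 50}, "checkpoint.pth")
```

In the "Going Classy" step, these same pieces would typically be gathered into a class whose constructor stores the model, loss function, and optimizer, with protected helpers (e.g., a _make_train_step method) and a public train method that runs the mini-batch loop.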
Week 05: Rethinking the Training Loop (Part II)
A simple classification problem:
- build a model for binary classification
- understand the concept of logits and how they relate to probabilities
- use binary cross-entropy loss to train a model
- use the loss function's pos_weight argument to handle imbalanced datasets (see the sketch after this list)
- understand the concepts of decision boundary and separability
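A minimal sketch, assuming a small synthetic imbalanced dataset: the model outputs logits, BCEWithLogitsLoss applies the sigmoid internally, and its pos_weight argument upweights the minority positive class. The data and hyperparameters are illustrative.

```python
import torch

# Illustrative imbalanced data: 10 positives, 90 negatives, 2 features
torch.manual_seed(13)
x = torch.randn(100, 2)
y = torch.zeros(100, 1)
y[:10] = 1.0

model = torch.nn.Linear(2, 1)        # outputs logits, not probabilities
pos_weight = torch.tensor([9.0])     # roughly (# negatives) / (# positives)
loss_fn = torch.nn.BCEWithLogitsLoss(pos_weight=pos_weight)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(100):
    logits = model(x)
    loss = loss_fn(logits, y)        # binary cross-entropy on logits
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

probs = torch.sigmoid(model(x))      # logits -> probabilities
preds = (probs >= 0.5).float()       # the 0.5 threshold defines the decision boundary
```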
Weeks 06 and 07: Guided Project - Binary Classification Problem
Weeks 08 and 09: Machine Learning and Computer Vision - Part I
From a shallow to a deep-ish classification model:
- data generation for image classification
- transformations using torchvision
- dataset preparation techniques
- building and training logistic regression and deep neural network models using PyTorch
- exploring activation functions such as Sigmoid, Tanh, and ReLU (see the sketch after this list)
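A minimal sketch of that pipeline, assuming single-channel images and a small fully connected network; the image size, layer widths, normalization statistics, and the choice of ReLU are illustrative assumptions.

```python
import torch
import torch.nn as nn
from torchvision import transforms

# Typical torchvision transformation pipeline (illustrative values); it would be
# applied per image, e.g. inside a Dataset's __getitem__
composer = transforms.Compose([
    transforms.ToTensor(),                          # PIL image / ndarray -> tensor in [0, 1]
    transforms.Normalize(mean=(0.5,), std=(0.5,)),
])

# A "deep-ish" classifier on flattened single-channel 10x10 images
model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(10 * 10, 25),
    nn.ReLU(),                                      # could be nn.Sigmoid() or nn.Tanh()
    nn.Linear(25, 1),                               # single logit for binary classification
)

# Forward pass on a dummy batch of 16 images
dummy = torch.rand(16, 1, 10, 10)
logits = model(dummy)
print(logits.shape)  # torch.Size([16, 1])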
Week 10: Machine Learning and Computer Vision - Part II
Kernels and convolutions:
- In this lesson, we’ve introduced convolutions and related concepts and built a convolutional neural network to tackle a multiclass classification problem.
- Activation functions, pooling layers, flattening, LeNet-5
- Softmax, cross-entropy
- Visualizing the convolutional filters, feature maps, and classifier layers
- Hooks in PyTorch (see the sketch after this list)
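A minimal sketch of a LeNet-5-style convolutional classifier together with a PyTorch forward hook used to capture feature maps for visualization; the layer sizes, input shape, and number of classes are illustrative assumptions rather than the exact lesson model.

```python
import torch
import torch.nn as nn

# A small LeNet-5-style CNN for 1x28x28 inputs and 10 classes (illustrative sizes)
model = nn.Sequential(
    nn.Conv2d(1, 6, kernel_size=5, padding=2),   # convolution: 1 -> 6 filters
    nn.ReLU(),                                   # activation function
    nn.MaxPool2d(2),                             # pooling layer
    nn.Conv2d(6, 16, kernel_size=5),
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),                                # flattening before the classifier
    nn.Linear(16 * 5 * 5, 120),
    nn.ReLU(),
    nn.Linear(120, 10),                          # 10 logits; softmax is handled
)                                                # implicitly by nn.CrossEntropyLoss

# Forward hook: capture the feature maps produced by the first conv layer
feature_maps = {}
def hook_fn(module, inputs, output):
    feature_maps["conv1"] = output.detach()

handle = model[0].register_forward_hook(hook_fn)

dummy = torch.rand(8, 1, 28, 28)
logits = model(dummy)
loss = nn.CrossEntropyLoss()(logits, torch.randint(0, 10, (8,)))  # cross-entropy on logits

print(feature_maps["conv1"].shape)  # torch.Size([8, 6, 28, 28])
handle.remove()                     # remove the hook when done
```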
