
Repository for DCA0305, an undergraduate course about Machine Learning Workflows and Pipelines

<center><img width="800" src="images/ctec.jpeg"></center>

Federal University of Rio Grande do Norte

Technology Center

Department of Computer Engineering and Automation

Repository for the Machine Learning Based Systems Design course, offered as an elective in the Computer Engineering undergraduate program at UFRN.

📚 References

| Title & Authors | Date | Link |
|-----------------|------|------|
| Muhammad Asad and Iqbal Khan<br>NLP with Hugging Face Transformers: Practical Applications using Language Models | May, 2025 | :books: Link |
| Chip Huyen<br>AI Engineering: Building Applications with Foundation Models | Jan, 2025 | :books: Link |
| Paul Lusztin and Maxime Labonne<br>LLM Engineer's Handbook | Oct, 2024 | :books: Link |
| Jay Alammar and Maarten Grootendorst<br>Hands-On Large Language Models: Language Understanding and Generation | Sep, 2024 | :books: Link |
| Chip Huyen<br>Designing Machine Learning Systems: An Iterative Process for Production-Ready Applications | May, 2022 | :books: Link |
| Daniel Voigt Godoy<br>Deep Learning with PyTorch Step-by-Step: A Beginner’s Guide | Feb, 2022 | :books: Link |


🔗 Useful Links

| Resource | Description |
|----------|-------------|
| Hugging Face Docs | Model hub and documentation |
| MLflow | Experiment tracking |
| Weights & Biases | Machine learning experiment monitoring |

Lessons

Week 01

  • Open in PDF Course Outline
    • GitHub Education Pro: Get access to the GitHub Education Pro pack by visiting GitHub Education
    • 📖 Learning Resources
    • Michael A. Lones. How to avoid machine learning pitfalls: a guide for academic researchers. arXiv
  • Open in PDF Visualizing Gradient Descent
    • Understanding and visualizing the five core steps of the Gradient Descent algorithm:
      1. initializing parameters randomly
      2. performing the forward pass to compute predictions
      3. calculating the loss
      4. computing gradients with respect to each parameter
      5. updating the parameters using the gradients and a predefined learning rate.
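The five steps above can be sketched in plain NumPy for a univariate linear regression on synthetic data (the numbers here are illustrative, not from the lesson's notebook):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic data for a univariate linear model: y = 2x + 1 + noise
x = rng.uniform(size=(100, 1))
y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal((100, 1))

# Step 1: initialize parameters randomly
b, w = rng.standard_normal(2)

lr = 0.1  # predefined learning rate
for epoch in range(1000):
    # Step 2: forward pass to compute predictions
    yhat = b + w * x
    # Step 3: compute the loss (mean squared error)
    error = yhat - y
    loss = (error ** 2).mean()
    # Step 4: gradients of the loss w.r.t. each parameter
    b_grad = 2.0 * error.mean()
    w_grad = 2.0 * (x * error).mean()
    # Step 5: update parameters using gradients and the learning rate
    b -= lr * b_grad
    w -= lr * w_grad

# After enough epochs, (w, b) should approach the generating values (2.0, 1.0)
```

The same loop is what `loss.backward()` and `optimizer.step()` automate once the course moves to PyTorch.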

Week 02

  • Open in PDF Rethinking the Training Loop (Part I)
    • Jupyter From data generation to making predictions
      • Implement a clear train() function with custom dataset and DataLoader.
      • Apply mini-batch gradient descent and track performance.
      • Add persistence: save checkpoints and enable training resumption/deployment.
    • Jupyter Going Classy
      • Build a dedicated training class with a well-structured constructor.
      • Use proper method scoping (public/protected/private).
      • Consolidate earlier code into the class.
      • Run the full pipeline through the class interface.
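The two notebooks above can be condensed into one sketch: a small training class (method names here are illustrative assumptions, not the course's actual API) that wraps a DataLoader-driven mini-batch loop and adds checkpointing for resumption:

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

class Trainer:
    """Illustrative training class; names are assumptions, not the course API."""
    def __init__(self, model, loss_fn, optimizer):
        self.model, self.loss_fn, self.optimizer = model, loss_fn, optimizer
        self.losses = []

    def _train_step(self, x, y):
        # "Protected" helper (leading underscore), used only inside the class
        self.optimizer.zero_grad()
        loss = self.loss_fn(self.model(x), y)
        loss.backward()
        self.optimizer.step()
        return loss.item()

    def fit(self, loader, n_epochs):
        # Public interface: mini-batch gradient descent over the DataLoader
        for epoch in range(n_epochs):
            for xb, yb in loader:
                self.losses.append(self._train_step(xb, yb))

    def save_checkpoint(self, path):
        # Persist both model and optimizer state to enable resumption
        torch.save({"model": self.model.state_dict(),
                    "optimizer": self.optimizer.state_dict()}, path)

    def load_checkpoint(self, path):
        state = torch.load(path)
        self.model.load_state_dict(state["model"])
        self.optimizer.load_state_dict(state["optimizer"])

# Usage on synthetic linear data
torch.manual_seed(13)
x = torch.rand(100, 1)
y = 2.0 * x + 1.0 + 0.1 * torch.randn(100, 1)
loader = DataLoader(TensorDataset(x, y), batch_size=16, shuffle=True)

model = torch.nn.Linear(1, 1)
trainer = Trainer(model, torch.nn.MSELoss(),
                  torch.optim.SGD(model.parameters(), lr=0.1))
trainer.fit(loader, n_epochs=50)
trainer.save_checkpoint("checkpoint.pth")  # enables later resumption/deployment
```

Consolidating the loop behind `fit()` is what lets later lessons swap datasets and models without rewriting the training code.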

Week 03

  • Jupyter Inside AirBnB Case Study: Multivariate Regression Problem.
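A minimal sketch of the multivariate setup, using synthetic stand-in features (the real case study loads the Inside Airbnb tabular dataset instead; the feature count and coefficients below are invented for illustration):

```python
import torch

torch.manual_seed(42)

# Stand-in for tabular listing features (e.g. rooms, distance, rating);
# the actual notebook reads these from the Inside Airbnb data.
n, d = 200, 3
X = torch.rand(n, d)
true_w = torch.tensor([[3.0], [1.5], [-2.0]])
y = X @ true_w + 0.5 + 0.05 * torch.randn(n, 1)

model = torch.nn.Linear(d, 1)  # multivariate: d input features, 1 target
optimizer = torch.optim.SGD(model.parameters(), lr=0.2)
loss_fn = torch.nn.MSELoss()

for epoch in range(2000):
    optimizer.zero_grad()
    loss_fn(model(X), y).backward()
    optimizer.step()

learned_w = model.weight.detach().squeeze()  # should approach true_w
```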

Week 04

  • Open in PDF Rethinking the Training Loop (Part II)
    • Jupyter A simple classification problem:
      • build a model for binary classification
      • understand the concept of logits and how they relate to probabilities
      • use binary cross-entropy loss to train a model
      • use the loss function to handle imbalanced datasets
      • understand the concepts of decision boundary and separability
    • Jupyter Challenge (bonus: 2.5 points)
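The classification ideas above fit in a short sketch: a linear model emits logits, `BCEWithLogitsLoss` applies the sigmoid internally, and its `pos_weight` argument up-weights the rare class on an imbalanced dataset (the toy data and threshold below are assumptions for illustration):

```python
import torch

torch.manual_seed(13)

# Toy imbalanced binary problem: positives are rare (~10% of samples)
x = torch.randn(200, 2)
y = (x[:, :1] > 1.3).float()  # label depends on the first feature only

model = torch.nn.Linear(2, 1)  # outputs logits (raw scores), not probabilities

# pos_weight = (#negatives / #positives) rebalances the loss
n_pos = y.sum()
pos_weight = (len(y) - n_pos) / n_pos
loss_fn = torch.nn.BCEWithLogitsLoss(pos_weight=pos_weight)

optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
for epoch in range(500):
    optimizer.zero_grad()
    logits = model(x)              # unbounded values in (-inf, inf)
    loss_fn(logits, y).backward()
    optimizer.step()

probs = torch.sigmoid(model(x))    # sigmoid maps logits into (0, 1)
```

The decision boundary is the set of points where the logit is zero, i.e. where the predicted probability crosses 0.5.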

Week 05

  • Open in PDF Machine Learning and Computer Vision - Part I
    • Jupyter From a shallow to a deep-ish classification model:
      • data generation for image classification
      • transformations using torchvision
      • dataset preparation techniques
      • building and training logistic regression and deep neural network models using PyTorch
      • focusing on various activation functions like Sigmoid, Tanh, and ReLU
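A condensed sketch of the two ingredients above: the three activation functions side by side, and a deep-ish classifier over (fake) image tensors; the layer sizes are illustrative, not the notebook's:

```python
import torch

z = torch.linspace(-3, 3, steps=7)

# The three activations compared in the lesson
sigmoid_out = torch.sigmoid(z)  # squashes into (0, 1); saturates at both ends
tanh_out = torch.tanh(z)        # squashes into (-1, 1); zero-centered
relu_out = torch.relu(z)        # max(0, z); no saturation for positive inputs

# Deep-ish classifier: Flatten turns images into feature vectors
model = torch.nn.Sequential(
    torch.nn.Flatten(),          # (N, 1, 28, 28) -> (N, 784)
    torch.nn.Linear(784, 50),
    torch.nn.ReLU(),
    torch.nn.Linear(50, 10),     # 10-class logits
)
images = torch.rand(4, 1, 28, 28)  # fake grayscale image batch
logits = model(images)
```

Dropping the hidden layer and the ReLU reduces this to the logistic-regression baseline the notebook starts from.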

Week 06

  • Open in PDF Machine Learning and Computer Vision - Part II
    • Jupyter Kernel
    • Jupyter Convolutions
    • In this lesson, we’ve introduced convolutions and related concepts and built a convolutional neural network to tackle a multiclass classification problem.
      • Activation functions, pooling layers, flattening, LeNet-5
      • Softmax, cross-entropy
      • Visualizing convolutional filters, feature maps, and classifier layers
      • Hooks in PyTorch
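A minimal sketch tying these pieces together: a small LeNet-style block, softmax over the class logits, and a forward hook that captures the convolutional feature maps (the architecture below is a toy stand-in, not the lesson's exact network):

```python
import torch

torch.manual_seed(13)

# A tiny LeNet-style convolutional classifier
model = torch.nn.Sequential(
    torch.nn.Conv2d(1, 6, kernel_size=5),  # 28x28 -> 24x24, 6 filters
    torch.nn.ReLU(),
    torch.nn.MaxPool2d(2),                 # 24x24 -> 12x12
    torch.nn.Flatten(),
    torch.nn.Linear(6 * 12 * 12, 10),      # 10-class logits
)

captured = {}

def hook(module, inputs, output):
    # A forward hook receives the module, its inputs, and its output;
    # here we stash the conv layer's feature maps for visualization.
    captured["feature_maps"] = output.detach()

handle = model[0].register_forward_hook(hook)

images = torch.rand(4, 1, 28, 28)
logits = model(images)
probs = torch.softmax(logits, dim=1)  # softmax turns logits into class probabilities

handle.remove()  # detach the hook once the activations are captured
```

During training, `torch.nn.CrossEntropyLoss` would be applied to the raw logits directly, since it fuses the softmax internally.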

Week 07

  • Open in PDF Machine Learning and Computer Vision - Part III
  • Jupyter Rock, Paper and Scissors:
    • Standardize an image dataset
    • Train a model to predict rock, paper, scissors poses from hand images
    • Use dropout layers to regularize the model
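A sketch of the dropout idea (the layer sizes are illustrative; the notebook's network differs): dropout randomly zeroes activations in `train()` mode to regularize, and is disabled in `eval()` mode, which is why evaluation must switch modes explicitly:

```python
import torch

torch.manual_seed(13)

model = torch.nn.Sequential(
    torch.nn.Flatten(),
    torch.nn.Linear(28 * 28, 100),
    torch.nn.ReLU(),
    torch.nn.Dropout(p=0.5),   # randomly zeroes 50% of activations in training
    torch.nn.Linear(100, 3),   # three classes: rock, paper, scissors
)

x = torch.rand(2, 1, 28, 28)

model.train()                  # dropout active: two passes differ
out_a = model(x)
out_b = model(x)

model.eval()                   # dropout disabled: passes are deterministic
with torch.no_grad():
    out_c = model(x)
    out_d = model(x)
```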

Week 08

  • Open in PDF Machine Learning and Computer Vision - Part III Cont.
  • Jupyter Rock, Paper and Scissors:
    • Learn how to find a learning rate to train the model
    • Understand the use of adaptive learning rates
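The adaptive-learning-rate part can be sketched with a PyTorch scheduler (the lesson also covers how to *find* a good starting rate, e.g. via an LR range test, which is omitted here; `StepLR` and its hyperparameters below are one illustrative choice):

```python
import torch

torch.manual_seed(13)

model = torch.nn.Linear(1, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Adaptive schedule: halve the learning rate every 10 epochs
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

x = torch.rand(8, 1)
y = 2.0 * x + 1.0
loss_fn = torch.nn.MSELoss()

lrs = []
for epoch in range(30):
    optimizer.zero_grad()
    loss_fn(model(x), y).backward()
    optimizer.step()
    scheduler.step()                      # advance the schedule once per epoch
    lrs.append(scheduler.get_last_lr()[0])
```

After 30 epochs the rate has been halved three times: 0.1 → 0.05 → 0.025 → 0.0125.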
