Machine Learning Explained
What to Learn for Deep Learning? | 3 Roadmaps for Beginners
If you are a total beginner in deep learning and you are motivated, check out this video to get started with an efficient learning framework!
# Table of Content
- Introduction: 0:00
- Usual Roadmap: 0:56
- Issue with Said Roadmap: 1:43
- Best Roadmap: 3:31
- Set a Goal: 4:40
- Step 1: 5:40
- Step 2: 6:58
- Step 3: 8:22
- Machine Learning Engineer Roadmap: 10:11
- Researcher Roadmap: 12:48
- AI Product Roadmap: 15:52
- Conclusion: 18:34
Cool notebooks from the creator of Fast.AI you should check out:
📌 Image Recognition of Birds: www.kaggle.com/code/jhoward/is-it-a-bird-creating-a-model-from-your-own-data
Do check out this playlist to get started with the learning framework:
📌 ua-cam.com/play/PLKy51kKbOLlz2xeTAmR2-wyFR5TQhFw7W.html
----
Join the Discord for general discussion: discord.gg/QpkxRbQBpf
----
Follow Me Online Here:
Twitter: yacineaxya
GitHub: github.com/yacineMahdid
LinkedIn: www.linkedin.com/in/yacinemahdid/
___
Have a great week! 👋
Views: 354

Videos

What are Pooling Layers in Deep Neural Networks?
Views: 4.9K · 17 hours ago
You might be wondering why we even need pooling in the first place for CNNs. In this tutorial, we'll answer this question by exploring the three ways in which pooling is helpful for training deep vision models. # Table of Content - Introduction: 0:00 - 3 reasons to use pooling layers: 0:29 - What is pooling: 0:50 - pooling for dimensionality reduction: 3:48 - pooling for translational inv...
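A minimal PyTorch sketch (my own illustration, not code from the video) of the basic pooling operation: a 2x2 max-pool keeps the strongest activation in each window and halves the spatial resolution.

```python
import torch
import torch.nn as nn

# 2x2 max-pool with stride 2: halves the height and width of the feature map.
pool = nn.MaxPool2d(kernel_size=2, stride=2)

x = torch.randn(1, 16, 32, 32)  # (batch, channels, height, width)
y = pool(x)
print(y.shape)  # torch.Size([1, 16, 16, 16])
```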
Network in Network Deep Neural Network Explained with Pytorch
Views: 433 · 1 day ago
Network in Network (NiN) is a very influential paper that introduced two massively useful concepts to the next generation of deep neural network architectures. Namely, 1x1 convolution throughout the network and global average pooling for the classification layer. # Table of Content - Introduction: 0:00 - 1x1 convolution: 0:49 - Global Average Pooling: 2:33 - Datasets: 3:22 - Results: 3:43 - Code...
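As a rough sketch of the two NiN ideas (illustrative only, not the paper's exact architecture), a classifier head can mix channels with a 1x1 convolution and then replace the fully connected layer with global average pooling:

```python
import torch
import torch.nn as nn

head = nn.Sequential(
    nn.Conv2d(192, 10, kernel_size=1),  # 1x1 conv: 192 channels -> 10 class maps
    nn.AdaptiveAvgPool2d(1),            # global average pooling: one value per class map
    nn.Flatten(),                       # (batch, 10, 1, 1) -> (batch, 10)
)

x = torch.randn(8, 192, 8, 8)
print(head(x).shape)  # torch.Size([8, 10])
```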
How to Set Up your Deep Learning Environment?
Views: 139 · 1 day ago
Getting started with deep learning doesn't have to be complicated. The best way to set yourself up is to go on Colab or Kaggle and run your analysis from a notebook. # Table of Content - Use Online Notebooks: 0:00 - Main Benefit of Notebooks: 1:14 - Notebook Anatomy: 1:53 - Pros of Notebooks: 5:52 - Cons of Notebooks: 6:09 - Conclusion Use Notebooks: 7:29 Honestly, getting started is the one s...
Highway Networks - Deep Neural Network Explained
Views: 231 · 14 days ago
Highway Networks are a type of network inspired by LSTMs that make use of learnable information highways to let inputs flow unimpeded to subsequent layers. They have a lot of similarities with residual neural networks and offer deep insight into how to make training deeper neural networks possible. # Table of Content - Introduction: 0:00 - Degradation Problem: 0:38 - Idea Behind Highway Networks:...
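A minimal highway layer sketch, assuming the formulation y = T(x) * H(x) + (1 - T(x)) * x from the paper (the layer sizes here are arbitrary):

```python
import torch
import torch.nn as nn

class HighwayLayer(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.H = nn.Linear(dim, dim)  # candidate transform
        self.T = nn.Linear(dim, dim)  # transform gate
        # A negative gate bias makes the layer start close to the identity,
        # which is what lets very deep stacks train.
        nn.init.constant_(self.T.bias, -2.0)

    def forward(self, x):
        t = torch.sigmoid(self.T(x))
        # Gate decides, per unit, how much transformed signal vs raw input passes.
        return t * torch.relu(self.H(x)) + (1 - t) * x

x = torch.randn(4, 64)
print(HighwayLayer(64)(x).shape)  # torch.Size([4, 64])
```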
What are 1x1 Convolutions in Deep Learning?
Views: 2.4K · 14 days ago
You might have come across 1x1 convolution in deep learning architecture and wondered why they were there. In this tutorial, I'll walk you through their usefulness and how they compare to other dimensionality reduction techniques. # Table of Content - Introduction: 0:00 - 1x1 in networks: 0:11 - Convolutions: 1:40 - How to reduce dimensionality: 2:48 - What is 1x1 convolution doing?: 4:25 - Poo...
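For a concrete feel (a sketch with arbitrary sizes, not code from the video): a 1x1 convolution acts only on the channel dimension, so it can compress 256 channels down to 64 while leaving height and width untouched.

```python
import torch
import torch.nn as nn

reduce = nn.Conv2d(256, 64, kernel_size=1)  # channel mixing only

x = torch.randn(1, 256, 28, 28)
print(reduce(x).shape)  # torch.Size([1, 64, 28, 28])

# Parameter count: 256 * 64 weights + 64 biases = 16,448
print(sum(p.numel() for p in reduce.parameters()))  # 16448
```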
Inception Net [V1] Deep Neural Network - Explained with Pytorch
Views: 348 · 21 days ago
GoogLeNet or Inception V1 is an interesting 22-layer deep neural network that made heavy use of 1x1 convolution in order to construct a competitive deep network in 2014. In this tutorial, we'll take a look at its usage in Pytorch with CIFAR-10, a step-by-step explanation of the paper "Going Deeper with Convolutions" as well as a walkthrough of the Pytorch implementation. # Table of Content - Int...
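A single Inception-style block, sketched with made-up channel sizes (GoogLeNet's real blocks also include a pooling branch): parallel 1x1, 3x3, and 5x5 branches are concatenated along the channel dimension, with 1x1 convolutions keeping the larger branches cheap.

```python
import torch
import torch.nn as nn

class InceptionBlock(nn.Module):
    def __init__(self, in_ch):
        super().__init__()
        self.b1 = nn.Conv2d(in_ch, 32, kernel_size=1)
        # 1x1 convs first, to shrink channels before the expensive 3x3/5x5.
        self.b3 = nn.Sequential(nn.Conv2d(in_ch, 16, 1), nn.Conv2d(16, 32, 3, padding=1))
        self.b5 = nn.Sequential(nn.Conv2d(in_ch, 8, 1), nn.Conv2d(8, 16, 5, padding=2))

    def forward(self, x):
        return torch.cat([self.b1(x), self.b3(x), self.b5(x)], dim=1)

x = torch.randn(1, 64, 32, 32)
print(InceptionBlock(64)(x).shape)  # torch.Size([1, 80, 32, 32])
```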
FractalNet Deep Neural Network Explained
Views: 513 · 28 days ago
FractalNet is an alternative to residual neural networks that was built to showcase that it wasn't the residual aspect of resnet that made it so powerful. Indeed, the FractalNet paper was able to show that it was the training of sub-paths of different lengths that made the network much more performant than VGG. In this tutorial, we'll go through the paper and a Pytorch implementation of Fractal...
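The core idea, reduced to a toy sketch (my simplification, not the paper's full recursive expansion): two parallel paths of different depths are joined by element-wise averaging, so the network trains sub-paths of varying length.

```python
import torch
import torch.nn as nn

class FractalBlock(nn.Module):
    def __init__(self, ch):
        super().__init__()
        self.short = nn.Conv2d(ch, ch, 3, padding=1)                # depth-1 path
        self.long = nn.Sequential(nn.Conv2d(ch, ch, 3, padding=1),
                                  nn.ReLU(),
                                  nn.Conv2d(ch, ch, 3, padding=1))  # depth-2 path

    def forward(self, x):
        return (self.short(x) + self.long(x)) / 2  # join paths by mean

x = torch.randn(1, 16, 8, 8)
print(FractalBlock(16)(x).shape)  # torch.Size([1, 16, 8, 8])
```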
What is VGG in Deep Learning?
Views: 2.7K · 1 month ago
Depth in neural networks is a very important parameter. Before ResNet, the VGG network was able to prove its importance by scaling up AlexNet to 16-19 layers. In this tutorial, we'll take a look at the theory behind the architecture as well as a Pytorch implementation from the official documentation. # Table of Content - The Importance of Depth in Neural Networks: 0:00 - VGG Network Architectur...
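If you just want to poke at the architecture, recent torchvision ships it directly (a quick sketch; pass pretrained weights instead of None for transfer learning):

```python
import torch
from torchvision.models import vgg16

model = vgg16(weights=None)        # randomly initialized 16-layer VGG

x = torch.randn(1, 3, 224, 224)    # VGG expects 224x224 RGB inputs
print(model(x).shape)              # torch.Size([1, 1000]) - ImageNet classes
```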
Stochastic Depth for Neural Networks Explained
Views: 1.1K · 1 month ago
Stochastic depth is a powerful regularization method applied to residual neural networks that speeds up training and enhances test performance. In this tutorial, we'll go through the methodology and a Pytorch implementation. # Table of Content - Introduction: 0:00 - Background and Context: 0:30 - Questions: 2:46 - Architecture Changes: 3:00 - Why use the resnet architecture for stochastic depth?...
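A bare-bones sketch of the mechanism (illustrative sizes, not the paper's exact ResNet): during training each residual branch is dropped entirely with some probability; at test time the branch output is scaled by its survival probability instead.

```python
import torch
import torch.nn as nn

class StochasticDepthBlock(nn.Module):
    def __init__(self, ch, p_survive=0.8):
        super().__init__()
        self.branch = nn.Sequential(nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(),
                                    nn.Conv2d(ch, ch, 3, padding=1))
        self.p = p_survive

    def forward(self, x):
        if self.training:
            if torch.rand(1).item() < self.p:
                return x + self.branch(x)
            return x                          # branch dropped: identity only
        return x + self.p * self.branch(x)    # expectation-scaled at test time

x = torch.randn(2, 16, 8, 8)
print(StochasticDepthBlock(16)(x).shape)  # torch.Size([2, 16, 8, 8])
```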
DenseNet Deep Neural Network Architecture Explained
Views: 1.7K · 1 month ago
DenseNets are a variation on ResNets that swap the identity addition for concatenation operations. This has many benefits, mainly better performance for smaller parameter sizes. In this video, we'll discuss this architecture and do a full Pytorch implementation walkthrough. Table of Content - Introduction: 0:00 - Background and Context: 0:26 - Architecture: 3:25 - Data Set: 7:29 - Main Results:...
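The swap in one line (a sketch with an arbitrary growth rate): where a ResNet block computes x + f(x), a dense layer concatenates, so every later layer sees all earlier feature maps and the channel count grows at each step.

```python
import torch
import torch.nn as nn

class DenseLayer(nn.Module):
    def __init__(self, in_ch, growth_rate=12):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, growth_rate, 3, padding=1)

    def forward(self, x):
        # Concatenate new features onto the input instead of adding them.
        return torch.cat([x, torch.relu(self.conv(x))], dim=1)

x = torch.randn(1, 24, 16, 16)
print(DenseLayer(24)(x).shape)  # torch.Size([1, 36, 16, 16]) - 24 + 12 channels
```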
How to Run ML models on the Browser?
Views: 159 · 2 months ago
Running ML models in the browser is now pretty easy with Transformers.js. In this tutorial, I walk through a very simple JS app that does object recognition client-side! Code: github.com/yacineMahdid/transformer-js-test-app # Table of Content - Introduction: 0:00 - Example App: 0:32 - index.html: 1:44 - style.css: 2:42 - index.js: 3:20 - file upload event listener: 5:07 - detect function: 6:02 - ...
ResNet Deep Neural Network Architecture Explained
Views: 2.1K · 2 months ago
ResNets are a very useful family of deep neural network architectures that excel at computer vision. Here we will discuss the core insight from the architecture and look at a Pytorch implementation! Table of Content - Introduction: 0:00 - Background and Context: 1:18 - Data Set: 5:28 - Architecture: 6:12 - Main Results: 10:41 - Limitation: 12:58 - Pytorch Walkthrough: 13:58 - High-Level Pytorch API: 15:12...
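The core insight in miniature (a sketch, not the video's exact code): the layers learn a residual F(x), and the identity shortcut adds x back, so the block outputs F(x) + x and gradients flow through the skip path.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    def __init__(self, ch):
        super().__init__()
        self.f = nn.Sequential(nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(),
                               nn.Conv2d(ch, ch, 3, padding=1))

    def forward(self, x):
        return torch.relu(self.f(x) + x)  # identity shortcut

x = torch.randn(1, 32, 8, 8)
print(ResidualBlock(32)(x).shape)  # torch.Size([1, 32, 8, 8])
```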
Img2Vec and UMAP to Visualize High Dimensional Data
Views: 415 · 2 months ago
A very cool visualization trick that I've used recently for vision classification: image → embedding → UMAP projection → dynamic visualization 💡 Notebook for this project: 📌 www.kaggle.com/code/yacine0101/aerospace-engine-defect-detection # Table of Content - Introduction: 0:00 - Project Overview: 0:30 - Data Overview: 2:45 - Getting Embeddings with Img2Vec: 4:12 - Reducing Dimensionality with UM...
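The pipeline in a few lines, assuming the img2vec_pytorch and umap-learn packages (pip install img2vec_pytorch umap-learn); the file names are hypothetical and the exact calls may differ from the notebook linked above:

```python
from img2vec_pytorch import Img2Vec
from PIL import Image
import umap

img2vec = Img2Vec()  # ResNet-18 features by default: 512-dim vectors
paths = ["img_0.png", "img_1.png"]  # hypothetical image paths
images = [Image.open(p).convert("RGB") for p in paths]
vectors = img2vec.get_vec(images)   # shape: (n_images, 512)

# Project the high-dimensional embeddings down to 2D for plotting.
coords = umap.UMAP(n_components=2).fit_transform(vectors)
print(coords.shape)  # (n_images, 2)
```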
Cosine Similarity for Data Science Tutorial
Views: 419 · 3 months ago
Cosine similarity is a measure of how alike two multi-dimensional data points are. We'll explore the metric along with a Python example in this tutorial. # Table of Content - Introduction: 0:00 - Context: 0:24 - Example of Cosine Similarity Usage - Graph: 1:27 - Example of Cosine Similarity Usage - NLP: 2:44 - Theory: 3:45 - Formula: 5:15 - Kaggle Notebook Overview: 6:34 - Python Implementation...
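The formula is cos(theta) = (a · b) / (||a|| ||b||); here is a minimal NumPy version (my sketch, which may differ from the notebook's code):

```python
import numpy as np

def cosine_similarity(a, b):
    # 1 = same direction, 0 = orthogonal, -1 = opposite direction.
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

a = np.array([1.0, 2.0, 3.0])
b = np.array([2.0, 4.0, 6.0])
print(cosine_similarity(a, b))  # 1.0 - parallel vectors
```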
LLMs Learn Tools at 775M Parameters!
Views: 724 · 5 months ago
Solving Boggle using AI : Dynamic Programming + Trie in Python
Views: 642 · 11 months ago
What is Data Science? - Introduction to Data Science with Python Workshop
Views: 372 · 1 year ago
How to Solve Programming Problem | 3 Steps Framework
Views: 369 · 1 year ago
How to Start a Data Science Project?
Views: 608 · 2 years ago
How to Ask for Help in your Data Science Project!
Views: 247 · 2 years ago
Shannon Entropy from Theory to Python
Views: 5K · 2 years ago
Handling Negative Result in Data Science Project!
Views: 340 · 2 years ago
Fact Check and Document All Data Science Assumptions ✅
Views: 133 · 2 years ago
Kolmogorov Complexity With DNA Data in Python
Views: 899 · 2 years ago
Why your Data Science Project Exist?
Views: 566 · 2 years ago
Distance Metrics Explained and Visualized in Python
Views: 2.4K · 3 years ago
Why you shouldn't use K-Fold Cross Validation.
Views: 446 · 3 years ago
K-Nearest Neighbor from Scratch in Python
Views: 2.3K · 3 years ago
How to Structure Data Science Project
Views: 6K · 3 years ago

COMMENTS

  • @Param3021 · 6 days ago

    Such an underrated video! Thank you so much; now my vision is clear and I have set my goal: an ML Engineer job. I'll comment back here when I get a job or internship!

    • @machinelearningexplained · 6 days ago

      Cool stuff, do you have a target company in mind? I could make you a more detailed roadmap.

  • @thouys9069 · 6 days ago

    I still don't get why it's supposed to help translational invariance. As you say, the convolution is already capable of that. If I move the image contents 50 pixels to the right, the features should also move 50 pixels, given stride 1 and padding. Exactly the same is true for traditional Sobel edge detectors: the edges don't change whether I convolve the image with the edge detection filters translated or not.

    • @machinelearningexplained · 6 days ago

      No, convolution helps with translational equivariance, but not with translational invariance. The idea of adding pooling is that you are forcing exact nearby pixel/feature information to be "lost". This means that the network is forced to learn more generalizable components of the picture you are showing it.
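A tiny PyTorch demonstration of this distinction (my illustration, not from the video): the convolution output shifts along with the input (equivariance), while a pooling window can absorb a small shift entirely (invariance).

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
conv = nn.Conv2d(1, 1, 3, padding=1, bias=False)

x = torch.zeros(1, 1, 8, 8)
x[0, 0, 1, 1] = 1.0                        # single bright pixel
x_shift = torch.roll(x, shifts=1, dims=3)  # shift it 1 pixel to the right

y, y_shift = conv(x), conv(x_shift)
# Equivariance: the feature map shifts exactly like the input did.
print(torch.allclose(torch.roll(y, shifts=1, dims=3), y_shift, atol=1e-6))  # True

# Invariance after pooling: the 1-pixel shift stays inside the 4x4 window,
# so the pooled outputs are identical.
pool = nn.MaxPool2d(4)
print(torch.allclose(pool(y), pool(y_shift), atol=1e-6))  # True
```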

  • @simonvutov7575 · 7 days ago

    Hey, great video!! Do you go to university in Montreal? I'm from Ottawa and will be going to Waterloo for computer engineering.

    • @machinelearningexplained · 7 days ago

      Hey there, I was going to McGill, yes! Cool stuff, best of luck in your computer engineering journey :)

  • @luisluiscunha · 9 days ago

    You made this concept so easy to understand: thank you!

  • @rafa_br34 · 11 days ago

    Very useful indeed, I just think the 3D representation could be a bit better (I'm used to seeing the filter rectangle behind the first layer and not on the side, but that's probably just me)

    • @machinelearningexplained · 11 days ago

      True, it's tilted by about 90 degrees. Otherwise the 1x1 convolution wouldn't fit well in the image, I believe.

  • @himadrilabana8233 · 11 days ago

    Does this code work only for images of the same size? Because it was giving an error when I used it on my images.

    • @machinelearningexplained · 11 days ago

      Hey there, it's been a while since I've checked this code. What was your image size?

  • @pachecotaboadaandrejoaquin6727 · 16 days ago

    Thank you for breaking it down so well! Keep up the excellent work!

    • @machinelearningexplained · 16 days ago

      Hey thanks for the kind words! Will do, I have a few more videos ready for next week :) Do let me know if there is a specific topic or question you would like covered!

  • @HeyySujal · 18 days ago

    I like your explanations, but I'm watching them randomly. As I'm a beginner, where should I start?

    • @machinelearningexplained · 18 days ago

      Glad you liked it. It's the second time this week I've had this request for where to start in deep learning; I'm setting up a video on that topic and will publish it soon!

  • @JuliusSmith · 19 days ago

    Shouldn't we call it a 1 x 1 x Cin convolution?

    • @machinelearningexplained · 19 days ago

      That would indeed be a less confusing name for sure. That thing already has like 7 different names though haha

  • @VigneshBhaskar · 21 days ago

    Thanks a lot for the content. One small request. Can you pls reduce the background music volume next time?

    • @machinelearningexplained · 20 days ago

      Hey there, Yes sorry for the inconvenience, I've improved the sound in my new videos :)!

  • @dgs1977 · 23 days ago

    Very interesting! Thank you so much! :)

  • @qasimjan5258 · 26 days ago

    I am totally new to machine learning. From where do I start?

  • @AbhishekSaini03 · 1 month ago

    Thanks! How can we use VGG for a 1D signal? Is it possible to use VGG for regression instead of classification, and how?

    • @machinelearningexplained · 1 month ago

      Hmmm depends, what’s the 1D signal about? Is it visual?

    • @AbhishekSaini03 · 1 month ago

      It’s acoustic signal.

    • @machinelearningexplained · 1 month ago

      Ah then no, VGG shouldn't be your pick here. It was expressly designed for image classification. Take a look at the various models in PyTorch made specifically for audio signals: 📌 pytorch.org/audio/stable/models.html

    • @AbhishekSaini03 · 1 month ago

      Can't we change the output layer and activation function to do regression?

    • @machinelearningexplained · 1 month ago

      Yes you can, but the internals of the model are tailor-built for images. If you can express your 1D signal input as an image, I would say it might be worth a try. However, there are other models made specifically for audio.

  • @kukfitta2258 · 1 month ago

    very cool thank you for the knowledge

  • @victorisrael6191 · 1 month ago

    Glorious😳

  • @machinelearningexplained · 1 month ago

    Hey, FYI I had to reshoot some of the sections in this video because I couldn't stop saying Drop Path (from FractalNet) instead of Stochastic Depth. There is still 1 wrong mention of drop path in there that I wasn't able to fix haha. That's what you get from reading two papers simultaneously! 😅

  • @HasanRoknabady · 1 month ago

    Thank you very much for your nice work. Can I have your slides?

    • @machinelearningexplained · 1 month ago

      Thanks, for sure! You can shoot me an email at mail@yacinemahdid.com and I'll send them to you.

  • @sandeepbhatti3124 · 1 month ago

    Thank you! Exactly what I needed.

  • @JamesColeman1 · 1 month ago

    Nice work, but why was the first file the larger file to start with? Character quantity?

    • @mprone · 1 month ago

      Yes, but this shouldn't have happened. The first file contains 2M characters while the second only 1M, thus |file_1| = 2 * |file_2|. The author of the video wanted a first file with 500'000 "ab" (thus 1M chars), but that's not what he did.
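The point is easy to check with a general-purpose compressor, which gives a rough upper bound on Kolmogorov complexity (my sketch, not the video's code):

```python
import zlib

file_1 = "ab" * 1_000_000  # 2M characters
file_2 = "ab" * 500_000    # 1M characters

for name, s in [("file_1", file_1), ("file_2", file_2)]:
    compressed = zlib.compress(s.encode())
    print(name, len(s), "->", len(compressed))

# Both compress to a few kilobytes: the raw sizes differ by 2x, but the
# descriptive complexity of the two strings is nearly identical.
```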

  • @camelendezl · 3 months ago

    Amazing video! Thanks!

  • @machinelearningexplained · 3 months ago

    Who the heck is Tanimoto?

  • @DistortedV12 · 4 months ago

    This is SUPER helpful. I was looking online for a good example using sklearn to no avail. Even asked ChatGPT and was led astray.

  • @JeffSzuc · 4 months ago

    Thank you! This was exactly the explanation I needed.

  • @Miami_adana09346 · 5 months ago

    Will you please tell me how to do Nadam optimization in MATLAB for deep learning?

    • @machinelearningexplained · 5 months ago

      Hey there! For sure, do you already have some code that I can take a look at? Also, why are you using MATLAB?

  • @Yashchaudhary-be2bu · 6 months ago

    Thank you so much, it helped me a lot in my project!

  • @akhtarbiqadri1 · 7 months ago

    Can you give me the link to the exact Jupyter notebook? I can't find the exact same notebook from the link that you provide in the description.

  • @ufukthegreat0 · 8 months ago

    Appreciate it man thank you. This is golden.

  • @ghinwamasri5537 · 8 months ago

    Thank you for your concise and clear explanation!

  • @Diekartoffel1 · 9 months ago

    If I'm not mistaken you only return the entropy for the last checked nucleotide:

        entropy = 0
        for nucleotide in {'A', 'T', 'G', 'C', 'N'}:
            rel_freq = dna_sequence.count(nucleotide) / len(dna_sequence)
            if rel_freq > 0:
                entropy = entropy - (rel_freq * math.log(rel_freq, 2))
        return entropy

  • @MalichiMyers · 10 months ago

    hi

    • @machinelearningexplained · 10 months ago

      Hello 👋 Let me know if you have questions!

    • @MalichiMyers · 10 months ago

      ​@@machinelearningexplained Do you know a way to use the MNIST dataset without using sklearn, pytorch, or tensorflow? If not, what are some datasets that you recommend?

  • @yacine997 · 11 months ago

    This is a great project for beginners, thanks for this valuable information!

    • @machinelearningexplained · 11 months ago

      Glad it was interesting! I remember the first time I attempted to make a Boggle solver, my DFS-based algorithm was extremely inefficient. I had to build it in C with parallelization in place to barely hit the 2-minute mark for the game. Once you use dynamic programming and a search-optimized trie structure, it's a world of difference.
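A minimal trie sketch of the kind described (illustrative only): storing the word list in a trie lets the board search abandon any path that is not a prefix of a real word, which is where most of the speedup over plain DFS comes from.

```python
class TrieNode:
    def __init__(self):
        self.children = {}
        self.is_word = False

def insert(root, word):
    node = root
    for ch in word:
        node = node.children.setdefault(ch, TrieNode())
    node.is_word = True

def has_prefix(root, prefix):
    node = root
    for ch in prefix:
        if ch not in node.children:
            return False
        node = node.children[ch]
    return True

root = TrieNode()
for w in ["cat", "cats", "car"]:
    insert(root, w)

print(has_prefix(root, "ca"))  # True  - keep exploring this board path
print(has_prefix(root, "cx"))  # False - prune the search immediately
```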

  • @haoduong6565 · 1 year ago

    Hi, can you share an example of Fine and Gray modeling? Also, where can I get the code? Thanks!

    • @machinelearningexplained · 11 months ago

      Hey there, sorry for the wait, this comment slipped through! What do you mean by Fine and Gray modeling?

  • @mitchellshields9904 · 1 year ago

    promo sm

  • @yacine997 · 1 year ago

    Love this video!

  • @tuberclebacilli9417 · 1 year ago

    Where are you from?

    • @machinelearningexplained · 1 year ago

      I'm originally from Algeria, but I've lived in Canada for most of my life! BTW, you can join the Discord for general casual chat: discord.gg/QpkxRbQBpf. Easier for me to follow up!

  • @tuberclebacilli9417 · 1 year ago

    Keep going 💪💖

  • @tuberclebacilli9417 · 1 year ago

    Welcome back

  • @ishanmistry8479 · 1 year ago

    You're back again! That's amazing ✨

  • @sohailraza2005 · 1 year ago

    Can you please help me with a rainfall problem using DeepXDE and physics-informed neural networks?

    • @machinelearningexplained · 1 year ago

      Yes sir, for sure! Shoot me a message on LinkedIn, I have some time tomorrow for a video call: 📌 www.linkedin.com/in/yacinemahdid/

    • @sohailraza2005 · 1 year ago

      @machinelearningexplained please accept my request, and thank you sooooooooo much for replying to me!

  • @AJ-et3vf · 1 year ago

    So for Nesterov gradient descent, the learning rate would have to be set constant and not found automatically using line search at each iteration?

  • @AJ-et3vf · 1 year ago

    Great video. Thank you

  • @AJ-et3vf · 1 year ago

    Great video. Thank you

  • @kevon217 · 1 year ago

    Very helpful. Thanks!

  • @Anastasiyofworld · 1 year ago

    Nice tutorial, thanks a lot. But! If you explain something, just explain. Watching how you were moving the pieces of code around was so annoying!

    • @machinelearningexplained · 1 year ago

      Hi there, Thanks for the feedback! Will improve that part in the next tutorial for sure :)

  • @Detective_Jones · 1 year ago

    I'm so frustrated that I don't know how. *Hope I can learn to code math formulas in the future.* If anyone has some guidance, please share.

    • @machinelearningexplained · 1 year ago

      Hey there Jonas, What do you mean? Is there a math formula in particular you are struggling with?

  • @Lorenzo8690 · 1 year ago

    Thank you very much for the tutorials and code! But I don't quite understand why both AdaGrad and AdaDelta perform poorly on these examples?

    • @machinelearningexplained · 1 year ago

      Hey there, glad it was useful! The example uses a very, very basic formula that AdaGrad/AdaDelta are overkill for. It was more to illustrate that we can code these formulas and that they work in practice. To make them perform well I would have had to tweak the hyperparameters some more. In a neural network, though, they are good optimizers!
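For reference, a bare-bones AdaGrad step in NumPy (my sketch, not the tutorial's code): the accumulated squared gradients shrink the effective per-parameter learning rate over time, which is why it can stall on a trivial objective without tuning.

```python
import numpy as np

def adagrad_step(w, grad, cache, lr=0.1, eps=1e-8):
    cache += grad ** 2                       # running sum of squared gradients
    w -= lr * grad / (np.sqrt(cache) + eps)  # per-parameter decaying step
    return w, cache

w, cache = np.array([5.0]), np.zeros(1)
for _ in range(100):
    grad = 2 * w                             # gradient of f(w) = w^2
    w, cache = adagrad_step(w, grad, cache)
print(w)  # approaches 0, but slowly, as the step size decays
```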

    • @Lorenzo8690 · 1 year ago

      @machinelearningexplained thank you very much, I used your work to create a Colab file in which I animated the different algorithms presented :)

    • @machinelearningexplained · 1 year ago

      @Lorenzo8690 wow cool, shoot me a link and I'll check it out!