A series of articles dedicated to deep learning. All code and exercises for this section are hosted on GitHub in a dedicated repository:
Rosenblatt’s Perceptron: An introduction to the basic building block of deep learning.
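As a taste of what that article covers, here is a minimal sketch of the Rosenblatt learning rule in plain Python. The unit-step activation, learning rate, and the logical AND toy problem are illustrative choices, not taken from the article itself:

```python
# Minimal Rosenblatt perceptron: learns a linear decision boundary
# with the classic error-driven weight update (illustrative sketch).

def train_perceptron(X, y, lr=0.1, epochs=20):
    n = len(X[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            # unit-step activation: fire if the weighted sum exceeds the threshold
            pred = 1 if sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 else 0
            err = target - pred
            # Rosenblatt update rule: w <- w + lr * (y - y_hat) * x
            w = [wj + lr * err * xj for wj, xj in zip(w, xi)]
            b += lr * err
    return w, b

def predict(w, b, xi):
    return 1 if sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 else 0

# toy data: logical AND, which is linearly separable
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 0, 1]
w, b = train_perceptron(X, y)
print([predict(w, b, xi) for xi in X])  # -> [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop finds a separating boundary in a finite number of updates.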
Multilayer Perceptron (MLP): The MLP, or Artificial Neural Network, is a widely used algorithm in Deep Learning. What is it? How does it learn?
Full introduction to Neural Nets: A full introduction to Neural Nets from the Deep Learning Course in PyTorch by Facebook (Udacity).
How do Neural Networks learn?: A dive into the feedforward process and back-propagation.
Activation functions in DL: An overview of the different activation functions in Deep Learning, how to implement them in Python, and their advantages and disadvantages.
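To give a flavor of that overview, here is a small sketch of a few common activation functions in plain Python (the selection and comments are illustrative, not the article's full list):

```python
import math

# A few common activation functions (illustrative sketch).

def sigmoid(x):
    # squashes input to (0, 1); saturates for large |x|, which can
    # cause vanishing gradients in deep networks
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # zero-centered cousin of the sigmoid, output in (-1, 1)
    return math.tanh(x)

def relu(x):
    # cheap and non-saturating for x > 0; units can "die" for x < 0
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # keeps a small slope for x < 0 to avoid dead units
    return x if x > 0 else alpha * x

print(sigmoid(0.0))           # -> 0.5
print(relu(-2.0), relu(3.0))  # -> 0.0 3.0
```

In practice you would use a framework's built-in versions, but the definitions really are this simple.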
Prevent Overfitting of Neural Networks: Your model overfits? One of these techniques should help! We’ll cover class imbalance, data augmentation, regularization, early stopping, reducing the learning rate…
A guide to Inception Architectures in Keras: Inception is a deep convolutional neural network architecture first introduced in 2014. It won the ImageNet Large-Scale Visual Recognition Challenge (ILSVRC14).
Xception and the Depthwise Separable Convolutions: Xception is a deep convolutional neural network architecture that involves Depthwise Separable Convolutions. It was developed by Google researchers.
Create an Auto-Encoder using Keras functional API: An autoencoder is a type of neural network widely used for unsupervised dimension reduction. So, how does it work? What can it be used for? And how do we implement it in Python?
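The article builds the model with Keras; as a framework-free illustration of the underlying idea, here is a tiny linear autoencoder in NumPy. Everything here (data, sizes, learning rate) is an illustrative assumption: it just shows the encode-to-bottleneck, decode, minimize-reconstruction-error loop:

```python
import numpy as np

# A linear autoencoder compresses inputs to a low-dimensional "code"
# and reconstructs them, trained by gradient descent on the mean
# squared reconstruction error (illustrative sketch; a real model
# would add nonlinearities and use a framework such as Keras).

rng = np.random.default_rng(0)

# toy data: 200 points in 5-D that actually live on a 2-D subspace
latent = rng.normal(size=(200, 2))
basis = rng.normal(size=(2, 5))
X = latent @ basis

# encoder (5 -> 2) and decoder (2 -> 5) weights, small random init
W_enc = rng.normal(scale=0.1, size=(5, 2))
W_dec = rng.normal(scale=0.1, size=(2, 5))

lr = 0.01
for _ in range(1000):
    code = X @ W_enc       # encode: project onto the 2-D bottleneck
    X_hat = code @ W_dec   # decode: reconstruct the 5-D input
    err = X_hat - X        # reconstruction error
    # gradients of the squared error w.r.t. each weight matrix
    grad_dec = code.T @ err / len(X)
    grad_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

mse = float(np.mean((X @ W_enc @ W_dec - X) ** 2))
print(f"final reconstruction MSE: {mse:.4f}")
```

Because the data is truly rank-2 and the bottleneck has 2 units, the reconstruction error can be driven close to zero; with a bottleneck smaller than the data's intrinsic dimension, the code would be a lossy compression, which is exactly the dimension-reduction use case the article discusses.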
Like it? Buy me a coffee