# Introduction to Deep Learning Theory (video-tutorial)

Hi everyone,

When I started playing around with neural networks, I had a lot of questions about the underlying theory. Many things were simply taken for granted, and I always wanted to dig into more detail on how neural networks actually work.

All this made me want to create an easy-to-follow tutorial on the theory behind Deep Learning and Neural Networks, one that is descriptive and clear enough for beginners to follow, but also a good recap for people already familiar with the theory.

**THE ENTIRE YOUTUBE TUTORIAL PLAYLIST CAN BE FOUND HERE**

The tutorial is still underway and new videos are added on a roughly weekly basis.

# 01 — Introduction

In this first video, you will be introduced to the different **types of neural networks**, the **types of machine learning**, various deep learning **implementations**, and generally all the primer you should have before getting involved in neural networks and deep learning.

Even though this first introductory video might look like something you would skip, I highly recommend watching it, as it will give you a strong foundation on the various types of neural networks and many of the architectures and frameworks available.

# 02 — Classification basics & Perceptrons

An easy-to-understand introduction to **Classification** and **Perceptron** theory. We start by defining the very basics in depth, such as weights, and then move on to the foundation of Deep Learning theory: the Perceptron.
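As a minimal sketch of the idea (this is not code from the video), a perceptron computes a weighted sum of its inputs plus a bias and passes it through a step function; the weights and bias below are hand-picked for illustration:

```python
def step(z):
    # Step activation: output 1 if the weighted sum is non-negative, else 0
    return 1 if z >= 0 else 0

def perceptron(inputs, weights, bias):
    # Weighted sum of the inputs plus the bias, fed into the step function
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return step(z)

# Example: weights and bias chosen by hand so the perceptron acts as logical AND
print(perceptron([1, 1], [1.0, 1.0], -1.5))  # 1
print(perceptron([1, 0], [1.0, 1.0], -1.5))  # 0
```

The weights determine the orientation of the separating line and the bias shifts it, which is exactly the geometric picture developed in the video.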

# 03 — Perceptron Algorithm, Error Functions, Sigmoid & Softmax Activation Functions

An early approach to finding a line for our **classifier**.

The **Perceptron Algorithm** describes this early technique for improving our classification line.
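A rough sketch of a single update step (an illustration, not the video's own code): when a point is misclassified, the weights and bias are nudged toward or away from that point, moving the line closer to classifying it correctly. The learning rate `lr` is an assumed parameter name:

```python
def perceptron_update(weights, bias, x, y_true, y_pred, lr=0.1):
    # If the prediction is wrong, nudge the line toward the misclassified point:
    # add a scaled copy of the point for a false negative, subtract it for a false positive
    if y_pred != y_true:
        direction = 1 if y_true == 1 else -1
        weights = [w + direction * lr * xi for w, xi in zip(weights, x)]
        bias = bias + direction * lr
    return weights, bias

# A positive point predicted as 0: the weights move toward the point
w, b = perceptron_update([1.0, 1.0], 0.0, [2.0, 0.0], y_true=1, y_pred=0)
print(w, b)  # [1.2, 1.0] 0.1
```

Repeating this update over all misclassified points is the Perceptron Algorithm described in the video.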

We come to understand that an **error function** is what we need to estimate how good or bad our line is, so the video also gives a first look at the Error Function.

We then inevitably introduce a new type of **Activation Function** to replace the **Step Function**: the **Sigmoid** activation function for binary classification and the **Softmax** activation function for multi-class classification.
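As a quick sketch of these two functions (not taken from the video), sigmoid squashes a single score into a probability, while softmax turns a list of scores into a probability distribution over classes:

```python
import math

def sigmoid(z):
    # Smooth replacement for the step function; outputs a probability in (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def softmax(scores):
    # Exponentiate each score (shifted by the max for numerical stability),
    # then normalize so the outputs sum to 1
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

print(sigmoid(0.0))              # 0.5
print(sum(softmax([2.0, 1.0, 0.1])))  # 1.0
```

With two classes, softmax reduces to the sigmoid of the score difference, which is why sigmoid is the binary special case.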

# 04 — Maximum Likelihood, Cross-Entropy, One-Hot Encoding

The viewer is introduced to the **Cross-Entropy error function** for **Classification problems** by way of **Maximum Likelihood**.

Another big chapter that confuses many people is **One-Hot Encoding**, which we make very clear in the last minutes of the video.
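A minimal sketch of how the two ideas fit together (illustrative only, not the video's code): one-hot encoding turns a class label into a vector with a single 1, and cross-entropy then measures the negative log-probability the model assigned to that true class:

```python
import math

def one_hot(label, num_classes):
    # Encode a class index as a vector of zeros with a single 1 at that index
    return [1.0 if i == label else 0.0 for i in range(num_classes)]

def cross_entropy(y_true, y_pred):
    # Negative log-likelihood of the true class under the predicted distribution;
    # only the entry where the one-hot vector is 1 contributes
    return -sum(t * math.log(p) for t, p in zip(y_true, y_pred))

y = one_hot(1, 3)                       # [0.0, 1.0, 0.0]
print(cross_entropy(y, [0.1, 0.8, 0.1]))  # -log(0.8)
```

Maximizing the likelihood of the data is equivalent to minimizing this cross-entropy, which is the link the video develops.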

This is one of the most important lessons to understand if you want to know how Deep Neural Networks work.

# 05 — Regression, MAE & MSE Error Functions

**Regression** and the **MAE (Mean of Absolute Errors)** and **MSE (Mean of Squared Errors)** error functions for Deep Learning are presented.

We compare **Regression vs Classification** and show how different error functions are used depending on the problem we are trying to solve.
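The two regression error functions can be sketched in a few lines (an illustration, not code from the video); the key difference is that squaring in MSE penalizes large errors much more heavily than MAE does:

```python
def mae(y_true, y_pred):
    # Mean of Absolute Errors: average absolute distance from the targets
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def mse(y_true, y_pred):
    # Mean of Squared Errors: squaring makes large errors dominate the average
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

targets = [3.0, 5.0, 2.0]
preds = [2.5, 5.0, 4.0]
print(mae(targets, preds))  # (0.5 + 0 + 2) / 3
print(mse(targets, preds))  # (0.25 + 0 + 4) / 3
```

Note the single error of 2 contributes 2 to the MAE sum but 4 to the MSE sum, which is why MSE is the more common choice when large deviations should be punished.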

This lesson is vital for understanding error functions as a whole.

**MORE VIDEOS TO BE ADDED AT A WEEKLY PACE**