Gradient descent is the backbone of many machine learning algorithms, deep learning included. It is used during training, and it is the most computationally expensive part of machine learning. Gradient descent is a complicated mathematical topic which I cannot explain in its entirety in a single blog post. However, I … Continue reading Mathematical Basics Of Gradient Descent
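Not code from the post itself, just a minimal sketch of the idea: repeatedly step against the gradient of a loss. The function, learning rate, and step count here are arbitrary illustrative choices.

```python
# Gradient descent sketch on f(w) = (w - 3)^2, whose minimum is at w = 3.
def gradient(w):
    return 2 * (w - 3)  # derivative of (w - 3)^2

w = 0.0                  # arbitrary starting point
learning_rate = 0.1      # arbitrary example value
for _ in range(100):
    w -= learning_rate * gradient(w)

print(round(w, 4))  # converges toward 3.0
```

Each update moves `w` a fraction of the way toward the minimum; with a too-large learning rate the iterates would instead diverge.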
The primary function of a feedforward neural network is to produce a prediction of some sort. The most popular task handled by a feedforward neural network is classification, or categorization. Classification is a task where a program is handed some sort of data and assigns that data to a category. One popular … Continue reading Neural Network Output Units
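As a rough sketch of a classification output unit (not the post's own code): a softmax layer turns a vector of raw scores into class probabilities. The logit values below are made up for illustration.

```python
import numpy as np

# Softmax: maps raw scores (logits) to probabilities that sum to 1.
def softmax(z):
    e = np.exp(z - np.max(z))  # subtract the max for numerical stability
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])  # hypothetical scores for 3 classes
probs = softmax(logits)
print(probs.sum())      # 1.0
print(probs.argmax())   # 0 - the class with the highest score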
In part one of this series, I covered two very basic probability distributions - Bernoulli and multinoulli. If you want to find out more about those, or if you wish to learn a bit about what probability distributions are and about discrete and continuous random variables, go here. In this post, we're covering two slightly more … Continue reading Probability Distributions Part II – Gaussian, Exponential
Probability distributions are used in statistics to describe how likely a random variable is to take on each of its possible states. Random variables can be discrete or continuous. A discrete random variable has a countable number of possible outcomes, whereas a continuous random variable can take on any value in a continuous range. The Bernoulli distribution … Continue reading Probability Distributions Part I – Bernoulli, Multinoulli
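To make the Bernoulli distribution concrete, here is a small sketch of its probability mass function (not taken from the post). The parameter value `phi = 0.7` is an arbitrary example.

```python
# Bernoulli PMF: a single binary outcome x in {0, 1} with P(x = 1) = phi.
def bernoulli_pmf(x, phi):
    return phi ** x * (1 - phi) ** (1 - x)

phi = 0.7  # hypothetical example parameter
print(bernoulli_pmf(1, phi))  # probability of x = 1
print(bernoulli_pmf(0, phi))  # probability of x = 0
```

The two probabilities sum to 1, as they must for a distribution over the two possible states.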
Linear algebra is a branch of mathematics which deals with systems of linear equations. It is widely used throughout science and engineering, and it is essential to understanding machine learning algorithms. Linear algebra defines three basic data structures - vectors, matrices and tensors - which are used constantly in machine and deep learning. In this … Continue reading Vectors, Matrices, Tensors – What’s The Difference?
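One quick way to see the difference between the three structures (a sketch, not the post's code) is by their number of axes; the shapes below are arbitrary examples.

```python
import numpy as np

# Vectors, matrices and tensors differ in their number of axes (rank).
vector = np.array([1.0, 2.0, 3.0])           # rank 1: a single axis
matrix = np.array([[1.0, 2.0], [3.0, 4.0]])  # rank 2: rows and columns
tensor = np.zeros((2, 3, 4))                 # rank 3 (or higher)

print(vector.ndim, matrix.ndim, tensor.ndim)  # 1 2 3
```

A matrix is just a rank-2 tensor, and a vector a rank-1 tensor; "tensor" is the general term for arrays of any rank.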
I love this kind of featured image. It makes people think that you are so smart and that you've figured out life. They're hilarious. It makes the topic of the post seem like it's really deep, pun intended. If you've worked with neural networks even a little bit, you've probably come across the term activation function. … Continue reading Activation functions explained
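For readers who haven't come across the term: an activation function is a nonlinearity applied to a unit's output. Here is a quick sketch (not from the post) of two common ones, with made-up input values.

```python
import numpy as np

# Two common activation functions applied elementwise to unit outputs.
def sigmoid(z):
    return 1 / (1 + np.exp(-z))  # squashes any input into (0, 1)

def relu(z):
    return np.maximum(0, z)      # zero for negative inputs, identity otherwise

print(sigmoid(0.0))            # 0.5
print(relu(-2.0), relu(3.0))   # 0.0 3.0
```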
Overfitting can be a serious problem in deep learning, and dropout is a technique developed to solve exactly this problem. It is one of the biggest advancements in deep learning to come out in the last few years. What is dropout? It is a technique for addressing overfitting: the idea is to randomly drop … Continue reading What Is Dropout? – Deep Learning
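A minimal sketch of the "randomly drop" idea (not the post's code), using the common inverted-dropout formulation; `keep_prob = 0.8` and the array size are arbitrary example values.

```python
import numpy as np

# Inverted dropout: during training, zero each activation with
# probability 1 - keep_prob, then scale survivors by 1/keep_prob
# so the expected activation stays unchanged.
rng = np.random.default_rng(0)

def dropout(activations, keep_prob=0.8):
    mask = rng.random(activations.shape) < keep_prob
    return activations * mask / keep_prob

a = np.ones(10_000)       # hypothetical layer activations
dropped = dropout(a)
print(round(dropped.mean(), 1))  # stays close to 1.0 on average
```

At test time no units are dropped, and thanks to the scaling during training, no extra correction is needed.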