This is a new format I am going to try out over the next couple of weeks – paper summaries. This is a paper by Xiang Zhang and…

I’ve been using Anki for a while now, and I have to say that it’s been a game changer. In this blog post, I am going to outline how I…

Gradient descent is the backbone of many machine learning algorithms, deep learning included. It is used only during training, and it is the most computationally expensive part…
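The excerpt above can be illustrated with a minimal sketch of gradient descent on a toy one-parameter loss. The function f(w) = (w − 3)² and the learning rate are illustrative choices, not from the post:

```python
# Minimal gradient descent sketch on f(w) = (w - 3)^2 (toy example).
def grad(w):
    # Derivative of (w - 3)^2 with respect to w.
    return 2 * (w - 3)

w = 0.0       # initial guess
lr = 0.1      # learning rate (illustrative value)
for _ in range(100):
    w -= lr * grad(w)   # step against the gradient

# After enough steps, w approaches the minimizer w = 3.
```

Each iteration moves the parameter a small step in the direction that decreases the loss, which is exactly the repeated, per-batch work that makes training the expensive phase.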

The primary function of a feedforward neural network is to produce a prediction of some sort. The most popular task handled by a feedforward neural network is classification…
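As a rough sketch of what "prediction" means here, the forward pass of a tiny one-hidden-layer classifier can be written in plain Python. All weights and the `forward` helper below are hypothetical, purely for illustration:

```python
import math

# Hypothetical forward pass: ReLU hidden layer, then softmax over the logits.
def forward(x, W1, b1, W2, b2):
    # Hidden layer: weighted sum + bias, passed through ReLU.
    h = [max(0.0, sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    # Output layer: raw logits, one per class.
    z = [sum(w * hi for w, hi in zip(row, h)) + b
         for row, b in zip(W2, b2)]
    # Softmax turns logits into class probabilities (shifted by max for stability).
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

# Example call with made-up weights for a 2-input, 2-class network.
probs = forward([1.0, 2.0],
                W1=[[0.5, -0.2], [0.1, 0.3]], b1=[0.0, 0.1],
                W2=[[1.0, -1.0], [0.5, 0.5]], b2=[0.0, 0.0])
```

The output is a probability per class; for classification, the predicted label is simply the index of the largest probability.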

In part one of this series, I covered two very basic probability distributions – Bernoulli and multinoulli. If you want to find out more about those, or if you wish…

Probability distributions are used in statistics to describe how likely a random variable is to take on each of its possible states. Random variables can be discrete or continuous. A…
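For a discrete random variable, this "likelihood per state" is a probability mass function. A minimal sketch for the Bernoulli case (the helper name and the value p = 0.7 are illustrative):

```python
# Illustrative PMF of a Bernoulli(p) random variable:
# P(X = 1) = p, P(X = 0) = 1 - p.
def bernoulli_pmf(x, p):
    if x not in (0, 1):
        return 0.0          # a Bernoulli variable has only two states
    return p if x == 1 else 1 - p

# Example: a biased coin with p = 0.7.
p_heads = bernoulli_pmf(1, 0.7)   # probability of state 1
p_tails = bernoulli_pmf(0, 0.7)   # probability of state 0
```

The probabilities over all states sum to one, which is the defining property of any probability distribution.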

Linear algebra is a branch of mathematics that deals with solving systems of linear equations. It is widely used throughout science and engineering, and it is essential to understanding…
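To make "solving a system of linear equations" concrete, here is a small sketch using NumPy; the particular 2×2 system is made up for illustration:

```python
import numpy as np

# Solve the illustrative system:
#   2x + y  = 3
#    x + 3y = 5
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)   # solves A @ x = b directly
```

`np.linalg.solve` factorizes `A` rather than explicitly inverting it, which is the standard numerically sound way to solve such systems.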

Pretrained networks are very useful. A pretrained network is a deep learning model that has already been trained on some data and whose weights have been made…

A few days ago, I published a blog post on writing a Python program which transfers style onto a content image using Keras, which you can find here. The reason…

This has got to be one of the coolest applications of machine learning. If you don’t know what neural style transfer is, it’s basically taking a content image, like a…