Mathematical Basics Of Gradient Descent

Gradient descent is the backbone of many machine learning algorithms, deep learning included. It is used only during training, which is typically the most computationally expensive part of machine learning. Gradient descent is a complicated mathematical topic which I cannot explain in its entirety in a single blog post. However, I … Continue reading Mathematical Basics Of Gradient Descent
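
As a rough illustration of the update rule the post builds towards, here is a minimal sketch of gradient descent on a one-dimensional quadratic loss; the loss function, learning rate, and step count are assumptions made for this example, not values from the post.

```python
# A minimal sketch of gradient descent, assuming the quadratic loss
# f(w) = (w - 3)^2 with gradient f'(w) = 2 * (w - 3); the learning rate
# and iteration count are illustrative choices.
def gradient_descent(learning_rate=0.1, steps=50):
    w = 0.0                        # arbitrary starting point
    for _ in range(steps):
        grad = 2 * (w - 3)         # gradient of (w - 3)^2 at the current w
        w -= learning_rate * grad  # step against the gradient
    return w

print(gradient_descent())          # converges towards the minimum at w = 3
```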

Probability Distributions Part II – Gaussian, Exponential

In part one of this series, I covered two very basic probability distributions - Bernoulli and multinoulli. If you want to find out more about those, or if you wish to learn a bit about what probability distributions, discrete random variables, and continuous random variables are, go here. In this post, we're covering two slightly more … Continue reading Probability Distributions Part II – Gaussian, Exponential
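
For reference, here is a hedged sketch of the two densities the post covers, written out as plain NumPy functions; the parameters mu, sigma, and lam are example values, not values taken from the post.

```python
import numpy as np

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    # N(x; mu, sigma^2) = exp(-(x - mu)^2 / (2 * sigma^2)) / sqrt(2 * pi * sigma^2)
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / np.sqrt(2 * np.pi * sigma ** 2)

def exponential_pdf(x, lam=1.0):
    # Exp(x; lambda) = lambda * exp(-lambda * x) for x >= 0, and 0 otherwise
    return np.where(x >= 0, lam * np.exp(-lam * x), 0.0)

xs = np.linspace(-3, 3, 7)
print(gaussian_pdf(xs))      # peaks at the mean mu = 0
print(exponential_pdf(xs))   # zero for negative x, decays for positive x
```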

Probability Distributions Part I – Bernoulli, Multinoulli

Probability distributions are used in statistics to describe how likely a random variable is to take on each of its possible states. Random variables can be discrete or continuous. A discrete random variable has a countable set of possible outcomes, whereas a continuous random variable can take on uncountably many values. The Bernoulli distribution … Continue reading Probability Distributions Part I – Bernoulli, Multinoulli
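
As a quick illustration of the two distributions named in the title, here is a minimal NumPy sampling sketch; the probabilities used are example values, not values from the post.

```python
import numpy as np

rng = np.random.default_rng(0)

# Bernoulli: a single trial with success probability p (p = 0.3 is an example value).
p = 0.3
bernoulli_samples = rng.binomial(n=1, p=p, size=10)    # array of 0s and 1s

# Multinoulli (categorical): one of k states, here k = 3 with example probabilities.
probs = [0.2, 0.5, 0.3]
multinoulli_samples = rng.choice(len(probs), size=10, p=probs)

print(bernoulli_samples)
print(multinoulli_samples)
```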

Vectors, Matrices, Tensors – What’s The Difference?

Linear algebra is a branch of mathematics which deals with solving systems of linear equations. It is widely used throughout science and engineering, and it is essential to understanding machine learning algorithms. Linear algebra defines three basic data structures - vectors, matrices, and tensors - which are constantly used in machine and deep learning. In this … Continue reading Vectors, Matrices, Tensors – What’s The Difference?
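
A quick sketch of the three data structures in NumPy terms, where the difference shows up as the number of dimensions (the rank); the shapes below are arbitrary examples.

```python
import numpy as np

vector = np.array([1.0, 2.0, 3.0])             # rank 1: shape (3,)
matrix = np.array([[1.0, 2.0], [3.0, 4.0]])    # rank 2: shape (2, 2)
tensor = np.zeros((2, 3, 4))                   # rank 3: shape (2, 3, 4)

print(vector.ndim, matrix.ndim, tensor.ndim)   # 1 2 3
```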

Where Do I Get My Pretrained Networks?

Pretrained networks are very useful. A pretrained network is a deep learning model which has already been trained on some data, and whose weights have been made publicly available for free use. A famous example of a pretrained network is the VGG series of networks. VGG stands for "Visual Geometry Group", which … Continue reading Where Do I Get My Pretrained Networks?
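
As one concrete way to get such a network, here is a hedged sketch of loading VGG16 with ImageNet weights through the Keras applications API; this is just one possible source, not necessarily the one the post recommends.

```python
# Loads the pretrained VGG16 model; the ImageNet weights are downloaded
# automatically on first use.
from tensorflow.keras.applications import VGG16

model = VGG16(weights="imagenet")
model.summary()
```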

Tensorflow Style Transfer That Actually Works

A few days ago, I published a blog post on writing a Python program which transfers style onto a content image using Keras, which you can find here. The reason I wrote it using Keras and not Tensorflow is that I've been trying to write a functioning Tensorflow style transfer program for two weeks … Continue reading Tensorflow Style Transfer That Actually Works

Implementing A Convolutional Neural Network Using Tensorflow

Image recognition is currently my favorite type of machine learning. I say currently because I find language translation and NLP quite interesting. Convolutional neural networks are, at the time of writing, the most efficient and accurate method for image recognition. While you could use a standard fully connected deep neural network with a … Continue reading Implementing A Convolutional Neural Network Using Tensorflow
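
To give a feel for what a convolutional classifier looks like, here is a minimal sketch using the Keras API; the layer sizes and the 28x28 grayscale input are assumptions for illustration, not the architecture the post implements with lower-level Tensorflow.

```python
import tensorflow as tf

# A small convolutional classifier: two conv/pool stages followed by a
# softmax output over 10 classes.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```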

Rewriting Siraj Raval’s Game of Thrones Word Vectors Using Tensorflow

Siraj Raval's video on how to make word vectors out of five A Song Of Ice And Fire books is a helpful demonstration of word embeddings, but not so helpful as a tutorial, because he uses a handful of smaller libraries that let you train the model in just a couple … Continue reading Rewriting Siraj Raval’s Game of Thrones Word Vectors Using Tensorflow
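
As a point of reference for what a word vector is, here is a minimal Tensorflow sketch of an embedding lookup; the vocabulary size, embedding dimension, and word indices are illustrative, and the post trains such weights itself rather than leaving them randomly initialized.

```python
import tensorflow as tf

# A word vector is a learned row in an embedding matrix, looked up by word index.
vocab_size, embedding_dim = 10_000, 128
embedding = tf.keras.layers.Embedding(vocab_size, embedding_dim)

word_ids = tf.constant([[42, 7, 1983]])   # indices of three words in the vocabulary
word_vectors = embedding(word_ids)        # shape (1, 3, 128)
print(word_vectors.shape)
```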