Mathematical Basics Of Gradient Descent

Gradient descent is the backbone of many machine learning algorithms, deep learning included. It is used only during training, and it is the most computationally expensive part of machine learning. Gradient descent is too broad a mathematical topic to explain in its entirety in a single blog post. However, I … Continue reading Mathematical Basics Of Gradient Descent
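The core of gradient descent is a single repeated update: step each parameter in the direction of the negative gradient. A minimal sketch in Python, minimizing the toy function f(w) = (w − 3)², where the learning rate, step count, and starting point are arbitrary choices for illustration:

```python
def gradient_descent(grad, w0, lr=0.1, steps=100):
    """Minimize a function given its gradient, starting from w0."""
    w = w0
    for _ in range(steps):
        w = w - lr * grad(w)  # update rule: w <- w - lr * df/dw
    return w

# f(w) = (w - 3)^2 has gradient 2 * (w - 3) and its minimum at w = 3.
w_min = gradient_descent(lambda w: 2 * (w - 3), w0=0.0)
print(round(w_min, 4))  # converges near 3.0
```

In real training the gradient is taken over the loss of a whole model (or a mini-batch of data) rather than a one-dimensional toy function, which is where the computational expense comes from.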

Rewriting Siraj Raval’s Game of Thrones Word Vectors Using Tensorflow

Siraj Raval's video on how to make word vectors out of the five A Song Of Ice And Fire books is a helpful demonstration of word embeddings, but not so helpful as a tutorial, because he uses a number of smaller libraries that let you train the model in just a couple … Continue reading Rewriting Siraj Raval’s Game of Thrones Word Vectors Using Tensorflow

Activation functions explained

I love this kind of featured image. It makes people think that you are so smart and that you've figured out life. They're hilarious, and they make the topic of the post seem really deep, pun intended. If you've worked with neural networks even a little bit, you've probably come across the term activation function. … Continue reading Activation functions explained
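An activation function is just a nonlinearity applied to a neuron's output. Two of the most common ones can be sketched in a few lines of plain Python (the function names are the standard ones, not anything specific to the linked post):

```python
import math

def sigmoid(x):
    """Squash any real input into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    """Rectified linear unit: pass positives through, zero out negatives."""
    return max(0.0, x)

print(sigmoid(0.0))  # 0.5
print(relu(-2.0))    # 0.0
print(relu(3.0))     # 3.0
```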

What Is Dropout? – Deep Learning

Overfitting can be a serious problem in deep learning, and dropout is a technique developed to solve this exact problem. It is one of the biggest advancements in deep learning to come out in the last few years. What is dropout? The idea is to randomly drop … Continue reading What Is Dropout? – Deep Learning
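The "randomly drop" idea can be sketched in a few lines. This is a minimal sketch of inverted dropout, assuming a drop probability of 0.5 and a fixed random seed for reproducibility; during training each activation is zeroed with probability p, and the survivors are scaled up so the expected value stays the same (at test time the layer is simply skipped):

```python
import numpy as np

def dropout(activations, p=0.5, rng=None):
    """Inverted dropout: zero each unit with probability p, scale the rest."""
    rng = rng if rng is not None else np.random.default_rng(0)
    mask = rng.random(activations.shape) >= p  # keep each unit with prob 1 - p
    return activations * mask / (1.0 - p)      # rescale so E[output] == input

a = np.ones(10)
print(dropout(a))  # mixture of 0.0 (dropped) and 2.0 (kept, scaled by 1/(1-p))
```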