Deriving LSTM Gradient for Backpropagation

Deriving a neural net's gradient by hand is a great exercise for understanding backpropagation and computational graphs better. In this post, we will walk through the process of deriving the LSTM network's gradient so that we can use it in backpropagation.
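As a refresher for that derivation, a single LSTM forward step can be sketched in NumPy as follows. This is a minimal sketch, not the post's implementation; the weight layout (one matrix mapping the concatenated `[h_prev, x]` to the four gate pre-activations) and all names are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1. / (1. + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    # One LSTM forward step. W: (H + D, 4H) maps the concatenated
    # [h_prev, x] to the four gate pre-activations (layout assumed here).
    z = np.concatenate([h_prev, x]) @ W + b
    H = h_prev.shape[0]
    i = sigmoid(z[:H])        # input gate
    f = sigmoid(z[H:2*H])     # forget gate
    o = sigmoid(z[2*H:3*H])   # output gate
    g = np.tanh(z[3*H:])      # candidate cell state
    c = f * c_prev + i * g    # new cell state
    h = o * np.tanh(c)        # new hidden state
    return h, c
```

Backpropagation then walks these same equations in reverse, accumulating gradients for each gate.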

Guide to Get Iranian Visa on Arrival

Step-by-step guide to getting an Iranian Visa on Arrival at Imam Khomeini Airport, Tehran.

How to Buy Bus Ticket to Yerevan from Tehran

Step-by-step guide to buying a bus ticket from Tehran, Iran to Yerevan, Armenia.

Central Hokkaido: Furano, Biei, Asahikawa

Sapporo, the main city of Hokkaido, is a perfect base to explore the central region of Hokkaido Island!

Convnet: Implementing Maxpool Layer with Numpy

Another important building block in a convnet is the pooling layer. Nowadays, the most widely used is the max pooling layer. Let's dissect its Numpy implementation!
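As a taste of what the post covers, max pooling can be sketched naively in NumPy like this. This is a minimal illustrative sketch (single channel, 2x2 window, stride 2, even input dimensions assumed), not the post's implementation:

```python
import numpy as np

def maxpool2x2(X):
    # Naive 2x2 max pooling with stride 2 on a single-channel input.
    # Assumes H and W are even.
    H, W = X.shape
    out = np.zeros((H // 2, W // 2))
    for i in range(0, H, 2):
        for j in range(0, W, 2):
            # Each output cell is the max over its 2x2 input window.
            out[i // 2, j // 2] = np.max(X[i:i+2, j:j+2])
    return out
```

The backward pass simply routes the upstream gradient to the position of each window's maximum.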

Convnet: Implementing Convolution Layer with Numpy

Convnet is dominating the world of computer vision right now. What makes it special is, of course, the convolution layer, hence the name. Let's study it further by implementing it from scratch using Numpy!
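The core operation can be sketched naively in NumPy as below. This is a minimal single-channel sketch with "valid" padding (technically cross-correlation, which is what deep learning libraries call convolution), not the post's implementation:

```python
import numpy as np

def conv2d(X, W):
    # Naive "valid" cross-correlation of a single-channel input X
    # with a filter W: slide W over X and take elementwise products.
    h, w = W.shape
    H, Wd = X.shape
    out = np.zeros((H - h + 1, Wd - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(X[i:i+h, j:j+w] * W)
    return out
```

Real implementations vectorize this loop (e.g. via im2col), but the sliding-window picture above is what the layer computes.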

Daytrip to Ise from Nagoya

Nagoya is not the most interesting sightseeing city in Japan. But it's an excellent base for day trips! Just an hour away from Nagoya, Ise is one of those perfect day trip destinations.

Philippines Photo Essay

Although, as in any other developing country, the big cities can be overwhelming, the smaller cities, the nature, and the beaches are the real gems!

Implementing BatchNorm in Neural Net

BatchNorm is a relatively new technique for training neural nets. It gives us a lot of freedom when initializing the network and accelerates training.
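The idea can be sketched as the training-time forward pass below. This is a minimal sketch, not the post's implementation, and it omits the running statistics needed at test time:

```python
import numpy as np

def batchnorm_forward(X, gamma, beta, eps=1e-8):
    # Training-time BatchNorm: normalize each feature over the
    # mini-batch to zero mean / unit variance, then scale and shift
    # with the learnable parameters gamma and beta.
    mu = X.mean(axis=0)
    var = X.var(axis=0)
    X_hat = (X - mu) / np.sqrt(var + eps)
    return gamma * X_hat + beta
```

Because the normalization keeps activations in a well-behaved range regardless of the weight scale, initialization becomes much less delicate.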

Implementing Dropout in Neural Net

Dropout is one simple way to regularize a neural net model. It is one of the recent advancements in Deep Learning that makes training deeper and deeper neural nets tractable.
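The mechanism can be sketched as the "inverted dropout" forward pass below. This is a minimal sketch under common conventions, not the post's implementation; the parameter name `p_keep` is illustrative:

```python
import numpy as np

def dropout_forward(X, p_keep=0.5, rng=np.random):
    # Inverted dropout: zero out each unit with probability (1 - p_keep)
    # and rescale the survivors by 1 / p_keep so the expected activation
    # is unchanged; at test time the layer is then simply the identity.
    mask = (rng.rand(*X.shape) < p_keep) / p_keep
    return X * mask, mask
```

The mask is kept around because the backward pass multiplies the upstream gradient by the same mask.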