Implementing BatchNorm in Neural Net

BatchNorm is a relatively new technique for training neural nets. It gives us a lot of leeway when initializing the network and accelerates training.
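The full derivation is in the post itself; as a minimal sketch of the idea, here is a training-mode BatchNorm forward pass in Numpy (the function name `batchnorm_forward` and its parameters are illustrative, not from the post):

```python
import numpy as np

def batchnorm_forward(X, gamma, beta, eps=1e-5):
    # Normalize each feature to zero mean and unit variance over the
    # minibatch, then scale and shift with learnable gamma and beta.
    mu = X.mean(axis=0)
    var = X.var(axis=0)
    X_hat = (X - mu) / np.sqrt(var + eps)
    return gamma * X_hat + beta

# Example: a minibatch of 64 examples with 10 badly-scaled features
X = np.random.randn(64, 10) * 5 + 3
out = batchnorm_forward(X, gamma=np.ones(10), beta=np.zeros(10))
```

After the transform, each feature column of `out` has (approximately) zero mean and unit variance, regardless of the input's scale, which is what eases initialization.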

Implementing Dropout in Neural Net

Dropout is a simple way to regularize a neural net model. It is one of the recent advancements in Deep Learning that make training deeper and deeper neural nets tractable.
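As a rough sketch of what the post implements, here is "inverted" dropout in Numpy (the helper name `dropout_forward` is illustrative, not taken from the post):

```python
import numpy as np

def dropout_forward(X, p_drop=0.5, train=True):
    # Inverted dropout: zero each unit with probability p_drop during
    # training, and rescale survivors by 1/(1 - p_drop) so the expected
    # activation is unchanged. At test time, do nothing.
    if not train:
        return X
    mask = (np.random.rand(*X.shape) >= p_drop) / (1.0 - p_drop)
    return X * mask
```

The rescaling is what lets the same forward pass be used at test time without any correction factor.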

Beyond SGD: Gradient Descent with Momentum and Adaptive Learning Rate

There have been many attempts to improve Gradient Descent: some add momentum, some add an adaptive learning rate. Let's see what's out there in the realm of neural net optimization.
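To give a flavor of the simplest of these variants, here is classical momentum on a toy quadratic (the function name and hyperparameters are illustrative choices, not from the post):

```python
import numpy as np

def sgd_momentum(grad_fn, w0, lr=0.1, gamma=0.9, n_steps=100):
    # Classical momentum: accumulate a velocity vector that smooths the
    # gradient direction across steps, damping oscillations.
    w = np.asarray(w0, dtype=float).copy()
    v = np.zeros_like(w)
    for _ in range(n_steps):
        v = gamma * v + lr * grad_fn(w)
        w = w - v
    return w

# Minimize f(w) = w^2, whose gradient is 2w; the iterate spirals in to 0
w_star = sgd_momentum(lambda w: 2 * w, w0=[5.0])
```

Adaptive-learning-rate methods like Adagrad or RMSprop follow the same loop shape but additionally keep a running statistic of squared gradients to scale `lr` per parameter.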

Implementing Minibatch Gradient Descent for Neural Networks

Let's use Python and Numpy to implement the Minibatch Gradient Descent algorithm for a simple three-layer Neural Network.
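The post works through a three-layer network; as a minimal sketch of just the minibatch loop itself, here it is on plain linear regression instead (function name and defaults are illustrative):

```python
import numpy as np

def minibatch_gd(X, y, lr=0.1, batch_size=32, n_epochs=50, seed=0):
    # Fit w for the linear model y ~ X @ w: each epoch, shuffle the data,
    # then take one gradient step per minibatch.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            grad = 2 * Xb.T @ (Xb @ w - yb) / len(batch)
            w -= lr * grad
    return w
```

Swapping the gradient computation for a neural net's backprop gives the algorithm in the post; the shuffling-and-slicing skeleton stays the same.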

Parallelizing Monte Carlo Simulation in Python

Monte Carlo simulation is all about quantity. It can take a long time to complete. Here's how to speed it up with the amazing Python multiprocessing module!
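As a small sketch of the pattern (not the post's exact code), here is a Monte Carlo estimate of pi whose samples are split across worker processes with `multiprocessing.Pool`:

```python
import multiprocessing as mp
import random

def sample_pi(n_samples):
    # Count uniform points in the unit square that land inside the
    # quarter circle of radius 1; each worker gets its own RNG.
    rng = random.Random()
    hits = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return hits

def estimate_pi(n_samples=1_000_000, n_workers=4):
    # Split the sample budget across worker processes, then combine counts.
    chunk = n_samples // n_workers
    with mp.Pool(n_workers) as pool:
        hits = sum(pool.map(sample_pi, [chunk] * n_workers))
    return 4.0 * hits / (chunk * n_workers)

if __name__ == "__main__":
    print(estimate_pi())  # roughly 3.14, within Monte Carlo error
```

Because the samples are independent, the work is embarrassingly parallel: `pool.map` hands each worker a chunk and the partial counts are simply summed.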

Japan Photo Essay: Kanto and Kansai

Too much to talk about, not enough time and space... So let's digest Japan bit by bit, starting with two of the most popular regions: Kanto and Kansai.

Scrapy as a Library in Long Running Process

Scrapy is a great web crawler framework, but it's tricky to make it run as a library in a long-running process. Here's how!

Hong Kong, a Photo Essay

High-rise skyscrapers and unspoiled nature; China, but not China.

Kota Kinabalu, a Photo Essay

Jungle, sea, small town, and seafood in 14 days.

SCUBA Diving in Kota Kinabalu

Kota Kinabalu, the capital of Sabah, Malaysia, offers some great diving spots!