
Blog

Weight initialization

Weight initialization is the procedure of setting the weights of a neural network to values that define the starting point for optimization. Generally, we follow some simple heuristics to initialize weights, such as… Read More »Weight initialization
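
As a quick taste of one such heuristic, here is a minimal NumPy sketch of Xavier (Glorot) uniform initialization; the 784 → 256 layer size is an arbitrary placeholder, and the full post may cover other schemes.

```python
import numpy as np

def xavier_init(fan_in, fan_out, rng=np.random.default_rng(0)):
    """Xavier/Glorot uniform initialization: scale the range by fan-in and
    fan-out to keep activation variance roughly constant across layers."""
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

W = xavier_init(784, 256)  # e.g. a 784 -> 256 dense layer
print(W.shape, W.std())
```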

Optimizers 2

Adagrad – Adaptive gradient algorithm Adagrad adapts the learning rate for each parameter individually by dividing the learning rate by the square root of the sum of the squares of the gradients for that parameter.… Read More »Optimizers 2
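
A minimal NumPy sketch of the Adagrad update described above; the learning rate and epsilon are illustrative defaults, not values from the post.

```python
import numpy as np

def adagrad_step(param, grad, grad_sq_sum, lr=0.01, eps=1e-8):
    """One Adagrad update: divide the learning rate by the square root of
    the accumulated sum of squared gradients for each parameter."""
    grad_sq_sum += grad ** 2
    param -= lr * grad / (np.sqrt(grad_sq_sum) + eps)
    return param, grad_sq_sum

w = np.array([1.0, -2.0])
g_acc = np.zeros_like(w)
for _ in range(3):
    grad = 2 * w                      # gradient of f(w) = ||w||^2
    w, g_acc = adagrad_step(w, grad, g_acc)
print(w)
```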

Optimizers 1

Optimizers in deep learning are algorithms that adjust the model parameters to minimize a loss function. Gradient descent is the most common optimizer, and you have probably heard of it. There are more optimizers… Read More »Optimizers 1
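
For reference, a minimal sketch of vanilla gradient descent on a toy quadratic loss; the loss function and learning rate are illustrative.

```python
import numpy as np

def gradient_descent(w, grad_fn, lr=0.1, steps=100):
    """Repeatedly move the parameters against the gradient of the loss."""
    for _ in range(steps):
        w = w - lr * grad_fn(w)
    return w

# Toy loss f(w) = ||w||^2, whose gradient is 2w; the minimum is at w = 0.
w_final = gradient_descent(np.array([3.0, -4.0]), lambda w: 2 * w)
print(w_final)  # close to [0, 0]
```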

Activation function

An activation function is a mathematical function that is applied to the output of a neuron in a neural network. The purpose of an activation function is to introduce non-linearity into the output of a… Read More »Activation function
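
As a quick sketch, two common activation functions in NumPy; ReLU and sigmoid are illustrative picks, not necessarily the ones the full post focuses on.

```python
import numpy as np

def relu(x):
    """ReLU: zero for negative inputs, identity for positive inputs."""
    return np.maximum(0.0, x)

def sigmoid(x):
    """Sigmoid: squashes any real input into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([-2.0, 0.0, 2.0])
print(relu(x), sigmoid(x))
```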

Loss functions

A loss function is a mathematical function that measures the difference between the predicted output of a model and the actual output. The goal of training a machine learning model is to minimize the value… Read More »Loss functions
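
A minimal sketch of one such loss, mean squared error; the sample predictions and targets are made up for illustration.

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error: the average squared difference between the
    model's predictions and the actual targets."""
    return np.mean((y_true - y_pred) ** 2)

print(mse(np.array([1.0, 2.0, 3.0]), np.array([1.1, 1.9, 3.2])))
```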

Properties of CNN

Now let us see some properties of CNNs that make them unique. Sparse interaction: by keeping the kernel size small, a CNN detects small meaningful features such as edges. This reduces… Read More »Properties of CNN
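
To make the sparse-interaction idea concrete, a minimal sketch of a valid 2D cross-correlation with a small 3×3 kernel; the Sobel-style kernel and the tiny image are illustrative.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D cross-correlation: each output value depends only on a
    small kernel-sized patch of the input (sparse interaction)."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.tile([0.0, 0.0, 1.0, 1.0], (4, 1))            # vertical edge
sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
print(conv2d(image, sobel_x))  # strong response along the edge
```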

Pooling layer

Now let us see one of the crucial layers used in convolutional neural networks, which helps with optimization/scaling when working with images. Suppose you have a huge image, something like 4000 × 3000.… Read More »Pooling layer
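
A minimal sketch of 2×2 max pooling with stride 2, which halves each spatial dimension; the window size and stride are the common defaults, not specifics from the post.

```python
import numpy as np

def max_pool2x2(x):
    """2x2 max pooling with stride 2: keep the largest value in each
    non-overlapping 2x2 window, halving height and width."""
    h, w = x.shape
    trimmed = x[:h - h % 2, :w - w % 2]  # drop odd edge rows/cols
    return trimmed.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

x = np.arange(16, dtype=float).reshape(4, 4)
print(max_pool2x2(x))  # [[ 5.  7.] [13. 15.]]
```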

Batch normalization

The distribution of the inputs to a deep network may change after each mini-batch when the weights are updated. This can cause the learning algorithm to chase a moving target forever. This change in the… Read More »Batch normalization
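
A minimal sketch of the batch-norm forward pass over one mini-batch; the learnable scale and shift (gamma, beta) are fixed here at their usual initial values of 1 and 0.

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize each feature over the mini-batch to zero mean and unit
    variance, then apply a learnable scale (gamma) and shift (beta)."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

batch = np.array([[1.0, 100.0], [2.0, 200.0], [3.0, 300.0]])
print(batch_norm(batch))  # each column now has mean ~0, std ~1
```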

Dense and Dropout layer

Now that we have covered the neural network basics and what a deep network is, let’s go through the layers of a deep network. Dense layer / Fully connected layer: Let’s start with the simplest… Read More »Dense and Dropout layer
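
A minimal sketch of a dense (fully connected) layer followed by inverted dropout at training time; the layer sizes and dropout rate are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, W, b):
    """Fully connected layer: every output is a weighted sum of all inputs."""
    return x @ W + b

def dropout(x, rate=0.5, training=True):
    """Inverted dropout: randomly zero activations during training and
    rescale the survivors so the expected output stays unchanged."""
    if not training:
        return x
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

x = rng.normal(size=(2, 4))                  # batch of 2, 4 features
W, b = rng.normal(size=(4, 3)), np.zeros(3)  # 4 -> 3 dense layer
print(dropout(dense(x, W, b)))
```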