
Logistic Regression

Logistic regression is a statistical method that models the probability of a certain outcome or event occurring based on a set of input variables. It is used mostly for classification tasks. Why, then, is it called regression? Because it does not predict a class directly; rather, it predicts a continuous value: the probability that a data point belongs to a class.

Hypothesis function

    \[\hat{y} = h_{\theta}(x) = g(\theta^{T}x)\]

where:

    \[g(z) = \frac{1}{1+e^{-z}}\]

This function is called the sigmoid function, or logistic function.

To better understand the hypothesis function of logistic regression, let's break it down. The term θ^Tx, just as in linear regression, is the weighted sum of the input variables. In logistic regression, however, the sigmoid function is applied to this sum to transform it into a probability.

The sigmoid function plays a crucial role in logistic regression by squashing the output to a range between 0 and 1, allowing it to represent a probability.
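The hypothesis can be sketched in a few lines of NumPy. This is a minimal illustration, not a production implementation; the function names `sigmoid` and `predict_proba` are my own.

```python
import numpy as np

def sigmoid(z):
    """Logistic function g(z) = 1 / (1 + e^(-z)): maps any real number into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def predict_proba(theta, x):
    """Hypothesis h_theta(x) = g(theta^T x): probability that x belongs to class 1."""
    return sigmoid(np.dot(theta, x))

# A score of 0 sits exactly at the midpoint; large scores saturate toward 0 or 1.
print(sigmoid(0.0))    # 0.5
print(predict_proba(np.array([2.0, -1.0]), np.array([3.0, 1.0])))  # sigmoid(5) ≈ 0.993
```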

Cost function

    \[BCE = -\frac{1}{n}\sum_{i=1}^{n}Cost(\hat{y_{i}}, y_{i}) = -\frac{1}{n}\sum_{i=1}^{n}\left[y_{i}\log(\hat{y_{i}}) + (1 - y_{i})\log(1 - \hat{y_{i}})\right]\]

The cost function in logistic regression, also known as the log loss or binary cross-entropy loss, is used to evaluate the model’s performance and determine how well it predicts the binary outcomes. The goal is to minimize the cost function to obtain the optimal set of model parameters.

Let us break down and understand the cost function.

    \[Cost(\hat{y_{i}}, y_{i}) =\begin{cases}-\log(\hat{y_{i}})& \text{ if } y_{i}= 1\\-\log(1 - \hat{y_{i}})& \text{ if } y_{i}= 0\end{cases}\]

Plotting this cost against the predicted probability shows:

When y = 1, the cost grows without bound as the predicted probability approaches 0, and shrinks toward 0 as it approaches 1.
When y = 0, the cost grows without bound as the predicted probability approaches 1, and shrinks toward 0 as it approaches 0.

Optimizers

The optimizer is the same as in linear regression, i.e. gradient descent is used for most of these algorithms. To learn more about optimizers, you can refer to:

Optimizers 1
Optimizers 2
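As a rough sketch of how gradient descent applies here: the gradient of the binary cross-entropy with respect to θ works out to (1/n)·Xᵀ(ŷ − y), which gives the update below. Function and variable names are illustrative.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent_step(theta, X, y, lr=0.1):
    """One gradient-descent update for logistic regression.

    The gradient of the BCE loss w.r.t. theta is (1/n) X^T (y_hat - y).
    """
    n = X.shape[0]
    y_hat = sigmoid(X @ theta)          # current predicted probabilities
    grad = X.T @ (y_hat - y) / n        # average gradient over the batch
    return theta - lr * grad
```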


Code example
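Putting the pieces together, here is a minimal from-scratch training loop on a toy 1-D dataset (the data, learning rate, and iteration count are illustrative choices, not from the original post):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy 1-D data: the first column of X is a constant 1 for the bias term.
# Points with feature value below ~2.5 are class 0, above are class 1.
X = np.array([[1.0, 0.5], [1.0, 1.5], [1.0, 3.5], [1.0, 4.5]])
y = np.array([0.0, 0.0, 1.0, 1.0])

theta = np.zeros(2)
lr = 0.5
for _ in range(5000):
    y_hat = sigmoid(X @ theta)          # hypothesis g(theta^T x)
    grad = X.T @ (y_hat - y) / len(y)   # gradient of binary cross-entropy
    theta -= lr * grad                  # gradient-descent update

probs = sigmoid(X @ theta)
preds = (probs >= 0.5).astype(int)
print(preds)  # [0 0 1 1]
```

In practice you would reach for a tested implementation such as scikit-learn's `LogisticRegression` rather than rolling your own loop, but the hand-written version maps directly onto the hypothesis, cost, and optimizer described above.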
