Gradient descent is an optimization algorithm that finds a local minimum of a function by taking steps proportional to the negative of the gradient at the current point. In a machine learning problem, the function is usually the cost function you want to...
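The update rule can be sketched in a few lines. This is a minimal illustration on the one-dimensional function f(x) = (x − 3)², whose gradient is 2(x − 3); the learning rate and step count are illustrative choices, not values from the post.

```python
# Minimal gradient descent sketch on f(x) = (x - 3)**2.
# Each step moves against the gradient, scaled by the learning rate.
def gradient_descent(grad, x0, lr=0.1, steps=100):
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)  # step proportional to the negative gradient
    return x

# Gradient of (x - 3)**2 is 2 * (x - 3); the minimum sits at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

Starting from x = 0, the iterates converge toward the minimum at x = 3.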

## Logistic Regression

posted by Bertrand G.

Logistic regression is a type of regression used to predict an outcome that comes in categorical form. It is widely used in biostatistics, where binary responses occur quite frequently – such as whether somebody has cancer or not. In order to keep the outcome between 0 and 1, we...
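The function that keeps the outcome between 0 and 1 is the logistic (sigmoid) function. A small sketch, with an invented weight and bias just for illustration:

```python
import math

# The sigmoid squashes any real number into the open interval (0, 1),
# so the output can be read as a probability of the positive class.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# A one-feature logistic model: linear part w * x + b, then sigmoid.
# w and b are made-up values for illustration.
def predict_proba(x, w=1.5, b=-0.5):
    return sigmoid(w * x + b)
```

Large negative inputs map near 0, large positive inputs near 1, and sigmoid(0) is exactly 0.5.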

## Shrinkage Methods fo...

posted by Bertrand G.

By discarding part of the predictors or inputs and keeping only a subset of the original predictors, you may obtain a model that is more interpretable. It may also achieve lower prediction error on new datasets by preventing overfitting of the training dataset. In all...
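One common shrinkage method is ridge regression, which pulls all coefficients toward zero rather than discarding predictors outright. A sketch using its closed-form solution; the data and penalty values are invented for illustration:

```python
import numpy as np

# Ridge regression closed form: (X^T X + lam * I)^{-1} X^T y.
# The penalty strength lam shrinks coefficients toward zero.
def ridge_coefficients(X, y, lam):
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ y)

# Synthetic data with known true coefficients (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + rng.normal(scale=0.1, size=50)

small_penalty = ridge_coefficients(X, y, lam=0.01)
large_penalty = ridge_coefficients(X, y, lam=100.0)
```

Increasing the penalty shrinks the coefficient magnitudes, trading a little bias for lower variance on new data.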

## Linear Regression

posted by Bertrand G.

A linear regression model assumes that the regression function is linear in the inputs. Linear models are quite simple and often provide an adequate description of how the inputs affect the output. To build a more complex model, one can first transform the inputs using, for example, log...
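A minimal sketch of this idea: an ordinary least squares fit done by hand, applied to log-transformed inputs so the linear model captures a nonlinear relationship. The data values are invented for illustration.

```python
import math

# Ordinary least squares for a single predictor: y = a + b * x.
def ols_fit(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# y depends linearly on log(x), not on x itself (here y = log2(x)),
# so fitting on the transformed input recovers a straight line.
raw_x = [1.0, 2.0, 4.0, 8.0, 16.0]
y = [0.0, 1.0, 2.0, 3.0, 4.0]
a, b = ols_fit([math.log(x) for x in raw_x], y)
```

After the transform the fit is exact: the intercept is 0 and the slope is 1/ln 2.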

## Statistical Learning...

posted by Bertrand G.

In a supervised learning scenario, we have an outcome, which can be quantitative (a real-estate price) or categorical (a spam email), that we wish to predict based on a set of features (size in square meters, number of rooms). The training dataset provides us with both a set of outcomes and features,...
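This setup can be made concrete with a toy predictor: one-nearest-neighbour over the features mentioned above. The feature values and prices are invented for illustration; real methods are the subject of the posts above.

```python
# Toy supervised learning: predict the outcome of the training example
# whose features are closest (squared Euclidean distance) to the query.
def nearest_neighbor_predict(train_X, train_y, query):
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    best = min(range(len(train_X)), key=lambda i: dist(train_X[i], query))
    return train_y[best]

# Features: (size in square meters, number of rooms); outcomes: prices.
train_X = [(30, 1), (55, 2), (90, 4)]
train_y = [100_000, 180_000, 320_000]
prediction = nearest_neighbor_predict(train_X, train_y, (60, 2))
```

A 60 m², 2-room query lands closest to the (55, 2) training example, so its price is returned.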