, we explore ensemble learning with voting, bagging, and Random Forest.
Voting itself is simply an aggregation mechanism. It doesn't create diversity; it combines predictions from models that are already different. Bagging, however, explicitly creates diversity by training...
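To make that contrast concrete, here is a minimal scikit-learn sketch (my own illustration, not taken from the article; the dataset and hyperparameters are arbitrary): voting only aggregates models that are already different, while bagging and Random Forest create diversity by resampling the training data.

```python
# Sketch: voting vs. bagging vs. Random Forest (assumed toy dataset).
from sklearn.datasets import make_classification
from sklearn.ensemble import (BaggingClassifier, RandomForestClassifier,
                              VotingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Voting: only aggregates predictions from models that are already different.
voting = VotingClassifier(estimators=[
    ("lr", LogisticRegression(max_iter=1000)),
    ("knn", KNeighborsClassifier()),
    ("tree", DecisionTreeClassifier(random_state=0)),
], voting="hard")

# Bagging: creates diversity by training one base model on bootstrap samples.
bagging = BaggingClassifier(DecisionTreeClassifier(random_state=0),
                            n_estimators=50, random_state=0)

# Random Forest: bagged trees plus random feature selection at each split.
forest = RandomForestClassifier(n_estimators=50, random_state=0)

for name, model in [("voting", voting), ("bagging", bagging), ("forest", forest)]:
    print(name, cross_val_score(model, X, y, cv=5).mean().round(3))
```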
In the previous article, we introduced the core mechanism of Gradient Boosting through Gradient Boosted Linear Regression.
That example was deliberately simple. Its goal was not performance, but understanding.
Using a linear model allowed us to make...
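As a rough code analogue of that example (a sketch under my own assumptions, not the article's Excel walkthrough), gradient boosting with squared loss starts from a constant prediction, fits each new linear learner to the current residuals, and adds a scaled correction:

```python
# Sketch: Gradient Boosted Linear Regression with squared loss (assumed toy data).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)   # assumed toy target

learning_rate, n_rounds = 0.1, 50
prediction = np.full_like(y, y.mean())    # round 0: predict the mean
models = []

for _ in range(n_rounds):
    residuals = y - prediction            # negative gradient of the squared loss
    weak = LinearRegression().fit(X, residuals)
    models.append(weak)
    prediction += learning_rate * weak.predict(X)

print("training MSE:", np.mean((y - prediction) ** 2).round(4))
```

Because every weak learner is linear, the boosted sum is itself just another line, which is exactly why this example favors understanding over performance.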
of this series, we'll talk about deep learning.
And when people talk about deep learning, we immediately picture those images of deep neural network architectures, with many layers, neurons, and parameters.
In practice, the actual...
For 18 days, we've explored many of the core machine learning models, organized into three major families: distance- and density-based models, tree- or rule-based models, and weight-based models.
So far, each article focused...
After the Neural Network Regressor, we now move to the classifier version.
From a mathematical viewpoint, the two models are very similar. In fact, they differ mainly in the interpretation of the output and the choice...
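A minimal sketch of that similarity (my own illustration; I'm assuming the truncated "choice" refers to the output activation and the loss, which the excerpt doesn't spell out): both networks can share the same forward pass and differ only in how the raw output is read and scored.

```python
# Sketch: same forward pass, two readings of the output (assumed architecture).
import numpy as np

def forward(X, W1, b1, W2, b2):
    """One hidden layer with ReLU; returns the raw output z."""
    h = np.maximum(0.0, X @ W1 + b1)
    return h @ W2 + b2

def regression_loss(z, y):
    # Regressor: z is the prediction itself, scored with mean squared error.
    return np.mean((z - y) ** 2)

def classification_loss(z, y):
    # Classifier: sigmoid(z) is read as P(y=1|x), scored with cross-entropy.
    p = 1.0 / (1.0 + np.exp(-z))
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
W1, b1 = rng.normal(size=(3, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

z = forward(X, W1, b1, W2, b2).ravel()                      # identical network
print(regression_loss(z, rng.normal(size=5)))               # read z as a value
print(classification_loss(z, rng.integers(0, 2, size=5)))   # read sigmoid(z) as a probability
```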
Neural networks are sometimes presented as black boxes.
Layers, activations, gradients, backpropagation… it can feel overwhelming, especially when everything is hidden behind model.fit().
We are going to build a neural network regressor from scratch using Excel...
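Excel is the medium in the article itself; as a rough code counterpart (a sketch with an assumed toy dataset and architecture, not the article's spreadsheet), the same ingredients (forward pass, loss, backpropagation, gradient step) fit in a few lines of NumPy:

```python
# Sketch: one-hidden-layer neural network regressor with manual backpropagation.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = X[:, 0] ** 2 + rng.normal(scale=0.05, size=200)   # assumed toy target

n_hidden, lr = 16, 0.1
W1 = rng.normal(scale=0.5, size=(1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1)); b2 = np.zeros(1)

for epoch in range(2000):
    # Forward pass
    h_pre = X @ W1 + b1
    h = np.maximum(0.0, h_pre)                 # ReLU activation
    y_hat = (h @ W2 + b2).ravel()

    # Mean squared error
    loss = np.mean((y_hat - y) ** 2)

    # Backpropagation (chain rule, layer by layer)
    d_yhat = (2.0 / len(y)) * (y_hat - y)[:, None]
    dW2, db2 = h.T @ d_yhat, d_yhat.sum(axis=0)
    d_hpre = (d_yhat @ W2.T) * (h_pre > 0)     # ReLU derivative
    dW1, db1 = X.T @ d_hpre, d_hpre.sum(axis=0)

    # Gradient descent step
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final training MSE:", round(loss, 4))
```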
After the article about SVM, the next natural step is Kernel SVM.
At first sight, it looks like a very different model. The training happens in the dual form, we stop talking about a slope and...
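To illustrate what training in the dual form means (a sketch with an assumed toy dataset and fixed hyperparameters, not the article's derivation), the fitted decision function can be rebuilt directly from the support vectors, their dual coefficients, and the kernel, without any explicit slope vector:

```python
# Sketch: kernel SVM decision function recomputed from the dual solution.
import numpy as np
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y = make_moons(n_samples=200, noise=0.15, random_state=0)
gamma = 1.0
svm = SVC(kernel="rbf", C=1.0, gamma=gamma).fit(X, y)

def decision_from_dual(x):
    """f(x) = sum_i alpha_i * y_i * K(x_i, x) + b, summed over support vectors."""
    sq_dist = np.sum((svm.support_vectors_ - x) ** 2, axis=1)
    k = np.exp(-gamma * sq_dist)                  # RBF kernel values K(x_i, x)
    return (svm.dual_coef_ @ k + svm.intercept_)[0]

x_test = X[0]
print("from the dual:", decision_from_dual(x_test))
print("scikit-learn: ", svm.decision_function([x_test])[0])   # the two should agree
```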
This is the model that motivated me, from the very beginning, to use Excel to better understand Machine Learning.
And today, you'll get a different explanation of SVM than the one you usually see,...