of my Machine Learning Advent Calendar.
Before closing this series, I would like to sincerely thank everyone who followed it, shared feedback, and supported it, especially the Towards Data Science team.
Ending this calendar...
were first introduced for images, and for images they are often easy to interpret.
A filter slides over pixels and detects edges, shapes, or textures. You can read the article I wrote earlier...
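To make the idea concrete, here is a minimal sketch (my own illustration, not from the article): a 3×3 vertical-edge filter slid over a tiny grayscale image with NumPy.

```python
import numpy as np

# Minimal 2D convolution (valid padding), written out as an explicit loop
# so the "filter slides over pixels" idea is visible. My own example.
def convolve2d(image, kernel):
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            # Element-wise product of the image patch and the kernel, then sum
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy image: dark left half, bright right half -> a vertical edge in the middle
image = np.zeros((6, 6))
image[:, 3:] = 1.0

# Sobel-like vertical-edge detector
kernel = np.array([[-1, 0, 1],
                   [-2, 0, 2],
                   [-1, 0, 1]])

print(convolve2d(image, kernel))  # large responses where the edge sits
```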
, we explore ensemble learning with voting, bagging, and Random Forest.
Voting itself is simply an aggregation mechanism. It doesn't create diversity, but combines predictions from already different models. Bagging, however, explicitly creates diversity by training...
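As a rough sketch of that distinction, assuming scikit-learn and a toy dataset (the models and parameters below are my own choices, not the article's): voting combines different model families, while bagging and Random Forest resample the training data to create diversity from a single family.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (BaggingClassifier, RandomForestClassifier,
                              VotingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)

# Voting: diversity comes from combining already-different model families
voting = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("knn", KNeighborsClassifier()),
                ("tree", DecisionTreeClassifier(random_state=0))],
    voting="hard",
)

# Bagging: diversity comes from bootstrap resampling of the training data
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,
                            random_state=0)

# Random Forest: bagging of trees plus random feature subsets at each split
forest = RandomForestClassifier(n_estimators=50, random_state=0)

for name, model in [("voting", voting), ("bagging", bagging), ("forest", forest)]:
    model.fit(X, y)
    print(name, model.score(X, y))
```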
previous article, we introduced the core mechanism of Gradient Boosting through Gradient Boosted Linear Regression.
That example was deliberately simple. Its goal was not performance, but understanding.
Using a linear model allowed us to make...
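Here is a minimal sketch of that mechanism (my own illustration, on made-up data): each new linear model is fit to the residuals of the current ensemble, and its prediction is added with a small learning rate.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Gradient boosting with linear weak learners: fit each stage to the
# residuals (the negative gradient of the squared loss) of the running sum.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

learning_rate = 0.1
prediction = np.zeros_like(y)
models = []

for _ in range(50):
    residuals = y - prediction                    # what is still unexplained
    stage = LinearRegression().fit(X, residuals)  # weak learner on residuals
    prediction += learning_rate * stage.predict(X)
    models.append(stage)

print("final training MSE:", np.mean((y - prediction) ** 2))
```

As in the article, the point is the mechanism, not the accuracy: a sum of linear models is still linear, so the gain here is pedagogical.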
of this series, we'll talk about deep learning.
And when people talk about deep learning, we immediately think of those images of deep neural network architectures, with many layers, neurons, and parameters.
In practice, the actual...
For 18 days, we have explored many of the core machine learning models, organized into three major families: distance- and density-based models, tree- or rule-based models, and weight-based models.
Up to this point, each article focused...
Neural Network Regressor, we now move to the classifier version.
From a mathematical viewpoint, the two models are very similar. In fact, they differ mainly in the interpretation of the output and the choice...
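A small sketch of that point, assuming a single output neuron (my own illustration, not the article's notation): the same raw output can feed a squared-error loss for regression, or be passed through a sigmoid and a cross-entropy loss for classification.

```python
import numpy as np

rng = np.random.default_rng(0)
w, b = rng.normal(size=3), 0.0
x = rng.normal(size=3)

z = x @ w + b                      # raw output of one neuron

# Regressor: output used as-is, trained with squared error
y_reg_true = 2.5
mse = (z - y_reg_true) ** 2

# Classifier: output squashed into a probability, trained with cross-entropy
p = 1.0 / (1.0 + np.exp(-z))       # sigmoid
y_cls_true = 1
bce = -(y_cls_true * np.log(p) + (1 - y_cls_true) * np.log(1 - p))

print(f"regression output {z:.3f}, squared error {mse:.3f}")
print(f"classification probability {p:.3f}, cross-entropy {bce:.3f}")
```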
are often presented as black boxes.
Layers, activations, gradients, backpropagation… it can feel overwhelming, especially when everything is hidden behind model.fit().
We will build a neural network regressor from scratch using Excel...
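For readers who prefer code to spreadsheets, here is a rough NumPy equivalent of the idea (my own sketch, not the article's Excel workbook): one hidden layer, a forward pass, a squared loss, and backpropagation written out by hand.

```python
import numpy as np

# Tiny neural network regressor: 1 input, 8 hidden tanh units, 1 output.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(100, 1))
y = X ** 2                                     # toy regression target

W1, b1 = rng.normal(size=(1, 8)), np.zeros(8)  # hidden layer
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)  # output layer
lr = 0.1

for _ in range(2000):
    # forward pass
    h = np.tanh(X @ W1 + b1)
    y_hat = h @ W2 + b2
    loss = np.mean((y_hat - y) ** 2)

    # backward pass (chain rule written out by hand)
    d_yhat = 2 * (y_hat - y) / len(X)
    dW2, db2 = h.T @ d_yhat, d_yhat.sum(axis=0)
    d_h = d_yhat @ W2.T * (1 - h ** 2)          # tanh derivative
    dW1, db1 = X.T @ d_h, d_h.sum(axis=0)

    # gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final training MSE:", loss)
```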