Over the first 5 days of this Machine Learning “Advent Calendar”, we explored 5 models (or algorithms), all based on distances (local Euclidean distance, or global Mahalanobis distance).
So it's time to change the approach,...
In the previous article, we explored distance-based clustering with K-Means.
further: to refine how the distance is measured, we add variance, which leads to the Mahalanobis distance.
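To make that concrete, here is a minimal sketch (the dataset and values below are invented for illustration) of how bringing in the covariance turns a plain Euclidean distance into a Mahalanobis distance:

```python
import numpy as np

# Hypothetical 2-D dataset; the numbers are illustrative only.
X = np.array([[1.0, 2.0],
              [2.0, 1.5],
              [3.0, 4.0],
              [4.0, 3.5],
              [5.0, 5.0]])

point = np.array([3.0, 3.0])
mean = X.mean(axis=0)

# Euclidean distance ignores how the data is spread...
euclidean = np.linalg.norm(point - mean)

# ...while Mahalanobis rescales directions by the covariance of the data.
cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
diff = point - mean
mahalanobis = np.sqrt(diff @ cov_inv @ diff)

print(euclidean, mahalanobis)
```

Because the two features co-vary strongly in this toy set, a point that sits close to the mean in Euclidean terms can still be relatively far in Mahalanobis terms.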
So, if k-Means is the...
Welcome to Day 4 of the Machine Learning Advent Calendar.
Over the first three days, we explored distance-based models for supervised learning:
In all these models, the idea was the same: we measure distances, and we decide the...
it’s possible to totally master every topic in data science?
With data science covering such a broad range of areas — statistics, programming, optimization, experimental design, data storytelling, generative AI, to name a few...
the k-NN Regressor and the idea of distance-based prediction, we now take a look at the k-NN Classifier.
The principle is the same, but classification allows us to introduce several useful variants, such as...
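One such variant is distance-weighted voting. Here is a minimal from-scratch sketch (the toy data and the helper name `knn_predict` are my own, not from the article) contrasting it with a plain majority vote:

```python
import numpy as np
from collections import Counter

# Tiny made-up dataset: three points of class "A", two of class "B".
X = np.array([[0.0, 0.0], [0.5, 0.5], [1.0, 1.0],
              [5.0, 5.0], [5.5, 5.5]])
y = np.array(["A", "A", "A", "B", "B"])

def knn_predict(x, X, y, k=3, weighted=False):
    """Classify x from its k nearest neighbors (Euclidean distance)."""
    d = np.linalg.norm(X - x, axis=1)
    idx = np.argsort(d)[:k]
    if not weighted:
        # Plain majority vote among the k neighbors.
        return Counter(y[idx]).most_common(1)[0][0]
    # Distance-weighted vote: closer neighbors count more.
    votes = {}
    for i in idx:
        votes[y[i]] = votes.get(y[i], 0.0) + 1.0 / (d[i] + 1e-9)
    return max(votes, key=votes.get)

print(knn_predict(np.array([0.8, 0.8]), X, y, k=3))                  # "A"
print(knn_predict(np.array([0.8, 0.8]), X, y, k=3, weighted=True))   # "A"
```

With well-separated classes both votes agree; weighting matters when the k neighbors are split between classes at different distances.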
to this “Advent Calendar” of Machine learning and deep learning in Excel.
For Day 1, we start with the k-NN (k-Nearest Neighbors) regressor algorithm. And as you will see, this is absolutely the best...
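As a quick preview of the idea (the 1-D numbers and the helper name `knn_regress` are invented for this sketch), a k-NN regressor simply averages the targets of the nearest training points:

```python
import numpy as np

# Made-up 1-D training data: inputs X and targets y.
X = np.array([1.0, 2.0, 4.0, 6.0, 8.0])
y = np.array([1.2, 1.9, 4.1, 6.2, 7.9])

def knn_regress(x, X, y, k=2):
    """Predict by averaging the targets of the k nearest training points."""
    idx = np.argsort(np.abs(X - x))[:k]
    return y[idx].mean()

# The 2 nearest neighbors of x=3.0 are X=2.0 and X=4.0,
# so the prediction is the mean of their targets: (1.9 + 4.1) / 2 = 3.0
print(knn_regress(3.0, X, y, k=2))
```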
You have written many beginner-level, explanatory articles on TDS. Has teaching the basics changed the way you design or debug real systems at work?
I notice that the more I teach something, the...
, it is quite easy to train any model. And training is always done with the same method, fit. So we get used to the idea that training any...
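That uniform interface can be sketched in a few lines, assuming scikit-learn is installed (the toy data below is invented): three very different models, one identical fit call:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

# Tiny illustrative dataset: one feature, two classes.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])

preds = {}
for model in (KNeighborsClassifier(n_neighbors=1),
              LogisticRegression(),
              DecisionTreeClassifier()):
    model.fit(X, y)  # same call, very different training internals
    preds[type(model).__name__] = int(model.predict([[2.5]])[0])

print(preds)
```

The point is exactly the one above: whatever happens inside fit (neighbor storage, convex optimization, greedy tree splitting), the calling code never changes.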