In the previous article, we explored how a Decision Tree Regressor chooses its optimal split by minimizing the Mean Squared Error (MSE).
Today, for Day 7 of the Machine Learning “Advent Calendar”, we follow the same approach but...
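To make that split criterion concrete, here is a minimal sketch in plain NumPy (the `best_split` helper and the toy data are invented for illustration, not code from the series): it scans candidate thresholds on a single feature and keeps the one that minimizes the weighted MSE of the two child nodes.

```python
import numpy as np

def best_split(x, y):
    """Scan candidate thresholds on one feature and return the one
    that minimizes the weighted MSE of the two child nodes."""
    order = np.argsort(x)
    x_sorted, y_sorted = x[order], y[order]
    best_t, best_mse = None, np.inf
    # Candidate thresholds: midpoints between consecutive distinct values
    for i in range(1, len(x_sorted)):
        if x_sorted[i] == x_sorted[i - 1]:
            continue
        t = (x_sorted[i] + x_sorted[i - 1]) / 2
        left, right = y_sorted[:i], y_sorted[i:]
        # MSE of a node = variance of its targets around the node mean;
        # weight each child by its number of samples
        mse = (len(left) * left.var() + len(right) * right.var()) / len(y)
        if mse < best_mse:
            best_t, best_mse = t, mse
    return best_t, best_mse

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 50)
y = np.where(x < 4, 1.0, 5.0) + rng.normal(0, 0.3, 50)
print(best_split(x, y))  # the chosen threshold should land near 4
```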
Machine Learning and Deep Learning are mentioned just as often.
And now, Generative AI seems to dominate nearly every technology conversation.
For many professionals outside the AI field, this vocabulary can be confusing...
After working with k-NN (the k-NN regressor and the k-NN classifier), we can see that the k-NN approach is quite naive. It keeps your entire training dataset in memory, relies on raw distances, and doesn't...
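A short sketch (hypothetical code, not taken from the series) makes this naivety visible: `fit` does nothing but memorize the data, and every prediction recomputes raw Euclidean distances to all stored points.

```python
import numpy as np

class NaiveKNNRegressor:
    """Illustrative k-NN regressor: no real training step, just memorization."""
    def __init__(self, k=3):
        self.k = k

    def fit(self, X, y):
        # "Training" is nothing more than keeping the whole dataset in memory
        self.X, self.y = np.asarray(X, float), np.asarray(y, float)
        return self

    def predict(self, X_new):
        preds = []
        for x in np.asarray(X_new, float):
            # Raw Euclidean distance to every stored training point
            d = np.linalg.norm(self.X - x, axis=1)
            nearest = np.argsort(d)[: self.k]
            preds.append(self.y[nearest].mean())
        return np.array(preds)

model = NaiveKNNRegressor(k=3).fit([[0], [1], [2], [10]], [0, 1, 2, 10])
print(model.predict([[1.5]]))  # average of the 3 nearest targets
```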
Over the first 5 days of this Machine Learning “Advent Calendar”, we explored 5 models (or algorithms) that are all based on distances (local Euclidean distance or global Mahalanobis distance).
So it's time to change the approach...
In the previous article, we explored distance-based clustering with K-Means.
We then went further: to refine how the distance is measured, we take variance into account, which leads us to the Mahalanobis distance.
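As a quick illustration of that step (a hedged sketch, not the article's code; the data and the `mahalanobis` helper are invented here): the Mahalanobis distance rescales each direction by the data's covariance, so two points at the same Euclidean distance from the mean can end up at very different Mahalanobis distances.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 2-D data with a very different variance on each axis
X = rng.normal(0, [5.0, 0.5], size=(500, 2))
mu = X.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(X, rowvar=False))

def mahalanobis(x, mu, cov_inv):
    # d_M(x) = sqrt((x - mu)^T  Sigma^{-1}  (x - mu))
    diff = x - mu
    return np.sqrt(diff @ cov_inv @ diff)

a = np.array([4.0, 0.0])  # far along the high-variance axis
b = np.array([0.0, 4.0])  # equally far along the low-variance axis
for p in (a, b):
    print(np.linalg.norm(p - mu), mahalanobis(p, mu, cov_inv))
# Roughly the same Euclidean distance, very different Mahalanobis distances
```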
So, if K-Means is the...
Day 4 of the Machine Learning Advent Calendar.
During the first three days, we explored distance-based models for supervised learning:
In all these models, the idea was the same: we measure distances, and we decide the...
Is it possible to fully master every topic in data science?
With data science covering such a broad range of areas — statistics, programming, optimization, experimental design, data storytelling, generative AI, to name a few...
After exploring the k-NN Regressor and the idea of prediction based on distance, we now take a look at the k-NN Classifier.
The principle is the same, but classification allows us to introduce several useful variants, such as...
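One variant often introduced at this point is distance-weighted voting. The sketch below (hypothetical code with an invented `knn_classify` helper, not necessarily the variants the article covers) contrasts it with plain majority voting: with weighting, closer neighbors get a larger say.

```python
import numpy as np
from collections import Counter

def knn_classify(X_train, y_train, x, k=5, weighted=False):
    """Classify x by its k nearest neighbors.
    weighted=False: plain majority vote.
    weighted=True : each neighbor votes with weight 1 / distance."""
    d = np.linalg.norm(np.asarray(X_train, float) - x, axis=1)
    nearest = np.argsort(d)[:k]
    if not weighted:
        return Counter(y_train[i] for i in nearest).most_common(1)[0][0]
    votes = {}
    for i in nearest:
        votes[y_train[i]] = votes.get(y_train[i], 0.0) + 1.0 / (d[i] + 1e-9)
    return max(votes, key=votes.get)

X = [[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]]
y = ["blue", "blue", "blue", "red", "red", "red"]
print(knn_classify(X, y, np.array([1.0, 1.0]), k=3))                  # "blue"
print(knn_classify(X, y, np.array([3.0, 3.0]), k=5, weighted=True))   # the closer cluster ("red") wins
```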