Many people think that linear regression is about fitting a line to data.
But mathematically, that’s not what it’s doing.
It's finding the closest possible vector to your target within the space spanned by your features.
To understand this,...
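The projection view can be sketched in a few lines of NumPy. This is a minimal illustration on synthetic data (the array shapes and random data are my own, not from the article): the least-squares prediction is exactly the orthogonal projection of the target vector onto the column space of the feature matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))   # feature matrix; its columns span a subspace
y = rng.normal(size=50)        # target vector

# Least-squares coefficients: the beta that makes X @ beta closest to y
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta

# Projection of y onto the column space of X, computed directly:
# P = X (X^T X)^{-1} X^T
P = X @ np.linalg.inv(X.T @ X) @ X.T
assert np.allclose(y_hat, P @ y)

# The residual is orthogonal to every feature column
assert np.allclose(X.T @ (y - y_hat), 0)
```

The two assertions are the whole point: the fitted values are the projection of `y`, and what is left over is perpendicular to the feature space.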
Anyone who learns machine learning often starts with linear regression, not simply because it's easy, but because it introduces the key concepts that we use in neural networks and deep learning.
We already...
In my previous article I explained how YOLOv1 works and how to build the architecture from scratch in PyTorch. In today's article, I'm going to cover the loss function used to...
, we explore ensemble learning with voting, bagging, and Random Forest.
Voting itself is simply an aggregation mechanism. It doesn't create diversity; it combines predictions from models that are already different. Bagging, however, explicitly creates diversity by training...
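The voting-versus-bagging distinction can be sketched with plain Python (a toy illustration with hypothetical predictions, not the article's code): voting only aggregates outputs of models that already disagree, while bagging manufactures disagreement by resampling the training data.

```python
import random
from collections import Counter

# --- Voting: just an aggregation mechanism over already-trained models ---
preds_model_a = [0, 1, 1, 0]   # hypothetical per-sample predictions
preds_model_b = [0, 1, 0, 0]
preds_model_c = [1, 1, 1, 0]

def majority_vote(*per_model_preds):
    # For each sample, take the most common label across the models.
    return [Counter(votes).most_common(1)[0][0]
            for votes in zip(*per_model_preds)]

combined = majority_vote(preds_model_a, preds_model_b, preds_model_c)
# -> [0, 1, 1, 0]

# --- Bagging: create diversity by resampling the training set ---
rng = random.Random(0)
data = list(range(10))
bootstrap = [rng.choice(data) for _ in data]  # sample WITH replacement
# Each base learner is trained on its own bootstrap sample, so the
# ensemble members differ even though they share the same algorithm.
```

The key contrast: `majority_vote` takes diversity as given; the bootstrap loop is what *produces* it in bagging.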
With Logistic Regression, we learned to classify data into two classes.
Now, what happens if there are more than two classes?
n is just the multiclass extension of this concept. And we are going to discuss this...
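The multiclass extension can be sketched in NumPy (a minimal illustration; the scores are made up): the sigmoid squashes one score into a probability, while softmax normalizes one score per class, and with exactly two classes softmax reduces to the sigmoid of the score difference.

```python
import numpy as np

def sigmoid(z):
    # Binary logistic regression: one score -> P(class 1)
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # Multiclass extension: one score per class, normalized
    # so the probabilities sum to 1.
    e = np.exp(z - np.max(z))  # shift for numerical stability
    return e / e.sum()

scores = np.array([2.0, 1.0, 0.1])  # hypothetical scores for 3 classes
probs = softmax(scores)
assert np.isclose(probs.sum(), 1.0)

# With only two classes, softmax is the sigmoid of the score difference:
z = np.array([1.3, -0.4])
assert np.isclose(softmax(z)[0], sigmoid(z[0] - z[1]))
```

The second assertion shows why this is an *extension* rather than a new idea: the binary case falls out as a special case.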
One day, a data scientist told me that Ridge Regression was an advanced model, because he saw that the training formula is more complicated.
Well, this is precisely the goal of my Machine Learning “Advent Calendar”: to...
Today’s model is Logistic Regression.
If you already know this model, here is a question for you:
Is Logistic Regression a regressor or a classifier?
Well, this question is exactly like asking: Is a tomato a...
Regression, finally!
For Day 11, I waited many days to present this model. It marks the start of a new journey in this “Advent Calendar”.
Until now, we mostly looked at models based on distances,...