
Hyperparameter Tuning: Neural Networks 101


How you can improve the “learning” and “training” of neural networks by tuning hyperparameters

Neural-network icons created by Vectors Tank — Flaticon. https://www.flaticon.com/free-icons/neural

In my previous post, we discussed how neural networks predict and learn from data. Two processes are responsible for this: the forward pass and the backward pass, also known as backpropagation. You can learn more about it here:

This post will dive into how we can optimise this “learning” and “training” process to improve the performance of our model. The areas we will cover are computational improvements and hyperparameter tuning, and how to implement them in PyTorch!

But, before all that good stuff, let’s quickly jog our memory about neural networks!

If you are enjoying this article, make sure to subscribe to my YouTube Channel!

Click on the link for video tutorials that teach you core data science concepts in a digestible manner!

Neural networks are large mathematical expressions that try to find the “right” function that can map a set of inputs to their corresponding outputs. An example of a neural network is depicted below:

A basic two-hidden-layer multi-layer perceptron. Diagram by author.
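A network like the one in the diagram can be sketched in a few lines of PyTorch. This is a minimal illustration only; the layer sizes and the ReLU activation are my own example choices, not values from the diagram:

```python
import torch.nn as nn

# A two-hidden-layer multi-layer perceptron (layer sizes are illustrative)
model = nn.Sequential(
    nn.Linear(4, 8),   # input layer -> first hidden layer
    nn.ReLU(),
    nn.Linear(8, 8),   # first hidden layer -> second hidden layer
    nn.ReLU(),
    nn.Linear(8, 1),   # second hidden layer -> output layer
)
```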

Each hidden-layer neuron carries out the following computation:
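That computation is a weighted sum of the neuron’s inputs plus a bias, passed through an activation function. A minimal NumPy sketch, with ReLU as an example activation (the function and variable names are my own):

```python
import numpy as np

def neuron(x, w, b):
    """Single neuron: weighted sum of inputs plus a bias,
    passed through a ReLU activation (example choice)."""
    z = np.dot(w, x) + b       # pre-activation: z = w . x + b
    return np.maximum(0.0, z)  # activation:     a = ReLU(z)

# Example: three inputs feeding one hidden neuron
x = np.array([1.0, 2.0, 3.0])
w = np.array([0.5, -0.25, 0.1])
b = 0.2
print(neuron(x, w, b))  # 0.5 - 0.5 + 0.3 + 0.2 = 0.5
```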
