
Implementing Simple Neural Network Backpropagation from Scratch


Solving the XOR gate problem using just NumPy, then comparing the results with a PyTorch implementation.

Outline

・Introduction to the XOR Gate Problem

・Constructing a 2-Layer Neural Network

・Forward Propagation

・Chain Rules for Backpropagation

・Implementation with NumPy

・Comparing Results with PyTorch

・Summary

・References

Photo by Google DeepMind on Unsplash

Introduction to the XOR Gate Problem

The XOR (exclusive OR) gate problem is considered simple for a neural network: it involves learning a small pattern of relationships between inputs and outputs that a properly designed network can capture, even though the problem is not linearly separable (meaning you cannot draw a single straight line that separates the outputs into two groups based on the inputs). Neural networks with at least one hidden layer are capable of learning such non-linear patterns.
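The non-separability claim above can be checked directly. As a quick sketch (variable names `X`, `y`, and `w` are my own choices, not from the article), fitting the best possible linear model, a weighted sum of the two inputs plus a bias, to the XOR outputs collapses every prediction to 0.5, so no single line can separate the classes:

```python
import numpy as np

# XOR inputs with a bias column appended, and the target outputs.
X = np.array([[0, 0, 1],
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

# Best linear fit y ~ X @ w in the least-squares sense.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(X @ w)  # every prediction is 0.5: a line cannot separate XOR
```

This is exactly why a hidden layer is needed: it lets the network bend the decision boundary.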
Let’s have a look at the inputs and outputs of the XOR gate. Here are our four training samples.
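The four samples form the XOR truth table: the output is 1 exactly when the two inputs differ. In NumPy they can be written as (the names `X` and `y` are my own convention):

```python
import numpy as np

# XOR truth table: each row of X is an input pair, y holds the targets.
X = np.array([[0, 0],
              [0, 1],
              [1, 0],
              [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

for inputs, target in zip(X, y):
    print(inputs, "->", target)
```

These same four rows will serve as the entire training set for both the NumPy and the PyTorch implementations below.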
