Implementing Simple Neural Network Backpropagation from Scratch


Solving the XOR gate problem using just NumPy, then comparing the results with a PyTorch implementation.

Outline

・Introduction to the XOR Gate Problem

・Constructing a 2-Layer Neural Network

・Forward Propagation

・The Chain Rule for Backpropagation

・Implementation with NumPy

・Comparing Results with PyTorch

・Summary

・References


Introduction to the XOR Gate Problem

The XOR (exclusive OR) gate problem is considered simple for a neural network because it involves learning a small, well-defined mapping between inputs and outputs that a properly designed network can capture, even though the problem is not linearly separable (meaning you can't draw a single straight line that separates the outputs into two groups based on the inputs). Neural networks, particularly those with hidden layers, are capable of learning such non-linear patterns.
Let's look at the inputs and outputs of the XOR gate. Here are our four training examples, sketched below in NumPy.
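To make this concrete, here is a minimal sketch of that training set as NumPy arrays (the names X and y for the inputs and targets are illustrative choices, not fixed by the article):

import numpy as np

# XOR truth table: each row of X is an input pair, each row of y is its target.
# (0,0) -> 0, (0,1) -> 1, (1,0) -> 1, (1,1) -> 0
X = np.array([[0, 0],
              [0, 1],
              [1, 0],
              [1, 1]], dtype=float)   # shape (4, 2): the four possible input pairs
y = np.array([[0],
              [1],
              [1],
              [0]], dtype=float)      # shape (4, 1): XOR output for each pair

print(X.shape, y.shape)  # (4, 2) (4, 1)

Plotting these four points shows why the problem is not linearly separable: the two inputs labeled 1 sit on one diagonal and the two labeled 0 on the other, so no single straight line can split them.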
