Basic Logic Gates in Perceptrons

20 June 2022

A perceptron is a mathematical model of a neuron: it takes n inputs and produces one output. In this article, I'm going to show you examples of logic gates built from binary-output perceptrons. A binary perceptron uses a step function as its activation.
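As a concrete sketch, here is a minimal binary perceptron in Python. The function name and signature are my own illustration, not the course's code:

```python
def perceptron(inputs, weights, bias):
    """Binary perceptron: weighted sum of inputs plus bias, passed through a step function."""
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Step activation: fire (1) when the total is non-negative, stay silent (0) otherwise.
    return 1 if total >= 0 else 0
```

Geometrically, the weights and bias define a line (in 2D) that splits the input space into a 0 region and a 1 region; every gate below is just a different choice of that line.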

Note: Except for NOT, the gates that we'll investigate have only two inputs.

Rosenblatt's Perceptron Model (from Wikipedia)

The tester function is provided by the Facebook AI team in their Introduction to Neural Networks course on Udacity. I strongly recommend that course to everyone who is interested in these topics.

AND Gate

To implement the AND gate, we need a line that separates (1,1) from the other three points.
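One valid choice (there are infinitely many) is weights of 1 and 1 with a bias of -1.5, i.e. the line $x_1 + x_2 = 1.5$; only (1,1) lies on its positive side. These particular values are my own illustration:

```python
def AND_gate(x1, x2):
    # Line x1 + x2 = 1.5: only the point (1, 1) satisfies x1 + x2 - 1.5 >= 0.
    return 1 if x1 + x2 - 1.5 >= 0 else 0
```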

OR Gate

To implement the OR gate, we need a line that separates (0,0) from the other three points.
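Keeping the same weights of 1 and 1 but raising the bias to -0.5 gives the line $x_1 + x_2 = 0.5$, which leaves only (0,0) on the negative side (again, one possible choice of values, not the course's):

```python
def OR_gate(x1, x2):
    # Line x1 + x2 = 0.5: only the point (0, 0) falls below it.
    return 1 if x1 + x2 - 0.5 >= 0 else 0
```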

NOT Gate

To implement the NOT gate, we need a line that is parallel to the $x_1$ axis. We'll treat $x_2$ as the only input, since the NOT gate has only one input and one output.
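For example, a weight of -1 on $x_2$ with a bias of 0.5 gives the line $x_2 = 0.5$: inputs below it output 1, inputs above it output 0. The values are again an illustrative choice:

```python
def NOT_gate(x2):
    # Line x2 = 0.5 with a negative weight: output 1 when x2 < 0.5, else 0.
    return 1 if -x2 + 0.5 >= 0 else 0
```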

XOR Gate

That's the tricky part! A single perceptron can only draw one line, but due to the nature of the XOR operation, no single line can separate the 1s from the 0s. To implement this gate with perceptrons, we need a multi-layer perceptron.

Can you think of a line which separates the reds and the blacks?

We can build an XOR gate by composing the gates above: XOR(x1, x2) = AND(OR(x1, x2), NOT(AND(x1, x2))), which uses one OR, one NOT, and two AND gates. You can see the representation in the figure below.
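This composition can be sketched directly in code, reusing the single-line gates from the earlier sections (the specific weight and bias values are my own illustrative choices):

```python
def AND_gate(x1, x2):
    # Line x1 + x2 = 1.5: only (1, 1) is on the positive side.
    return 1 if x1 + x2 - 1.5 >= 0 else 0

def OR_gate(x1, x2):
    # Line x1 + x2 = 0.5: only (0, 0) is on the negative side.
    return 1 if x1 + x2 - 0.5 >= 0 else 0

def NOT_gate(x):
    # Line x = 0.5 with a negative weight: inverts a binary input.
    return 1 if -x + 0.5 >= 0 else 0

def XOR_gate(x1, x2):
    # Two layers: OR and NAND (NOT of AND) feed into a final AND.
    return AND_gate(OR_gate(x1, x2), NOT_gate(AND_gate(x1, x2)))
```

The first layer computes OR and NAND; the second layer ANDs them together, which is exactly why XOR needs a multi-layer perceptron.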

So, how can we write its code? Let's rewrite the test function and perceptron implementation from the course in an object-oriented style.

I hope you liked the article. Lately, I have been focusing on artificial intelligence and its applications in robotics, so I expect you'll see lots of new articles.