# Basic Logic Gates in Perceptrons

A perceptron is a mathematical model of a neuron: it takes n inputs and produces one output. In this article, I'm going to show you examples of logic gates implemented with binary-output perceptrons. A binary perceptron uses a step function as its activation.
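As a sketch of the idea, here is a minimal binary-output perceptron with a step activation (the function names and the choice of a `>= 0` threshold are my own illustrative conventions, not from the course):

```python
def step(z):
    # Step activation: fires (outputs 1) when the weighted sum is non-negative
    return 1 if z >= 0 else 0

def perceptron(inputs, weights, bias):
    # Weighted sum of the n inputs plus a bias, passed through the step function
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return step(z)
```

Geometrically, the weights and bias define a line (a hyperplane for n > 2 inputs), and the step function reports which side of that line an input point falls on.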

Note: The gates that we'll investigate have only two inputs (the NOT gate is the exception, with one).

The tester function is given by the Facebook AI team in their Introduction to Neural Networks course on Udacity. I strongly recommend that course to everyone who is interested in these topics.

### AND Gate

To implement an AND gate, we should create a line that separates (1,1) from the other points.
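One such line is $x_1 + x_2 = 1.5$: only the point (1,1) lies on its positive side. A sketch with these (illustrative, hand-picked) weights:

```python
def and_gate(x1, x2):
    # Weights 1, 1 and bias -1.5 put only (1,1) on the positive side
    # of the line x1 + x2 = 1.5
    w1, w2, b = 1.0, 1.0, -1.5
    return 1 if w1 * x1 + w2 * x2 + b >= 0 else 0
```

Any line that isolates (1,1) from the other three points works; this choice is just a convenient one.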

### OR Gate

To implement an OR gate, we should create a line that separates (0,0) from the other points.
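Here the line $x_1 + x_2 = 0.5$ does the job: only (0,0) falls below it. A sketch with these (again hand-picked) parameters:

```python
def or_gate(x1, x2):
    # Weights 1, 1 and bias -0.5 put only (0,0) on the negative side
    # of the line x1 + x2 = 0.5
    w1, w2, b = 1.0, 1.0, -0.5
    return 1 if w1 * x1 + w2 * x2 + b >= 0 else 0
```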

### NOT Gate

To implement a NOT gate, we should create a line that is parallel to the $x_1$ axis. We're going to consider $x_2$ as the only input, since a NOT gate has only one input and one output.
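The line $x_2 = 0.5$ is parallel to the $x_1$ axis, and a negative weight on $x_2$ flips the output. A sketch (the weight and bias are my own illustrative choices):

```python
def not_gate(x2):
    # Weight -1 and bias 0.5: output is 1 when x2 < 0.5 (i.e. x2 == 0)
    # and 0 when x2 >= 0.5 (i.e. x2 == 1)
    w2, b = -1.0, 0.5
    return 1 if w2 * x2 + b >= 0 else 0
```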

### XOR Gate

That's the tricky part! A single perceptron can only create one line. However, due to the nature of the XOR operation, we cannot separate the 1's and 0's with only one line: (0,1) and (1,0) map to 1 while (0,0) and (1,1) map to 0, and no single line isolates one pair from the other. To implement this gate with perceptrons, we need a multi-layer perceptron.
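One classic construction, sketched below, uses a hidden layer of two perceptrons, OR and NAND, and then ANDs their outputs; the specific weights are illustrative, not unique:

```python
def step(z):
    # Step activation shared by every unit in the network
    return 1 if z >= 0 else 0

def xor_gate(x1, x2):
    # Hidden layer: an OR unit and a NAND unit, each a single perceptron
    h_or = step(x1 + x2 - 0.5)     # line x1 + x2 = 0.5
    h_nand = step(-x1 - x2 + 1.5)  # line x1 + x2 = 1.5, negated
    # Output layer: AND of the two hidden outputs
    return step(h_or + h_nand - 1.5)
```

This works because XOR(x1, x2) = AND(OR(x1, x2), NAND(x1, x2)): the hidden layer draws two lines, and the output perceptron keeps only the region between them.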