Neural Representation of AND, OR, NOT, XOR and XNOR Logic Gates (Perceptron Algorithm)
The steps in this method are also very similar to how a neural network learns: pass each row of the truth table through the perceptron, compare the prediction y` with the expected output, and adjust the weights and bias whenever a row comes out wrong, repeating until every row is correct. Throughout this post, the Perceptron rule is: if Wx+b >= 0, then y`=1; otherwise y`=0.
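As a minimal sketch of that rule in code (a plain Python function; the name perceptron and the list-based inputs are illustrative choices, not part of the original post):

    def perceptron(inputs, weights, bias):
        # Perceptron rule: y` = 1 if Wx + b >= 0, else y` = 0.
        weighted_sum = sum(w * x for w, x in zip(weights, inputs)) + bias
        return 1 if weighted_sum >= 0 else 0

    print(perceptron([1, 1], [1, 1], -1.5))  # AND of (1, 1) -> 1

Each basic gate below is just this function with different weights and bias; XOR and XNOR will chain three of them.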
AND Gate
From our knowledge of logic gates, we know that the AND gate outputs 1 only if both inputs are 1. Its truth table is:

    x1  x2 | AND
     0   0 |  0
     0   1 |  0
     1   0 |  0
     1   1 |  1
The question is, what are the weights and bias for the AND
perceptron?
From w1x1 + w2x2 + b, initializing w1 and w2 as 1 and b as -1, we get;
x1(1)+x2(1)-1
Row 1
Passing the first row of the AND logic table (x1=0, x2=0), we get;
0+0-1 = -1
From the Perceptron rule, if Wx+b < 0, then y`=0. This row is correct, as the output is 0 for the AND gate.
Row 2
Passing the second row of the AND logic table (x1=0, x2=1), we get;
0+1-1 = 0
From the Perceptron rule, if Wx+b >= 0, then y`=1. This row is incorrect, as the output is 0 for the AND gate. So we want values that will make inputs x1=0 and x2=1 give y` a value of 0. If we change b to -1.5, we have;
0+1-1.5 = -0.5
This now gives y`=0, and row 3 is correct by symmetry.
Row 4
Passing the fourth row of the AND logic table (x1=1, x2=1), we get;
1+1-1.5 = 0.5
From the Perceptron rule, this gives y`=1, which is correct. So the AND gate can be modelled as x1+x2-1.5.
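As a quick check of the AND model x1+x2-1.5 (a sketch assuming the step rule above; outputs are printed in truth-table order):

    def step(z):
        # y` = 1 if Wx + b >= 0, else 0
        return 1 if z >= 0 else 0

    def and_gate(x1, x2):
        return step(x1 + x2 - 1.5)

    print([and_gate(x1, x2) for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 0, 0, 1]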
OR Gate
The OR gate outputs 1 if at least one input is 1. Starting again from x1(1)+x2(1)-1;
Row 1
Passing the first row of the OR logic table (x1=0, x2=0), we get;
0+0-1 = -1
From the Perceptron rule, if Wx+b < 0, then y`=0. This row is correct, as the output is 0 for the OR gate.
Row 2
Passing the second row of the OR logic table (x1=0, x2=1), we get;
0+1-1 = 0
From the Perceptron rule, if Wx+b >= 0, then y`=1. This row is correct, and so is row 3 by symmetry.
Row 4
Passing (x1=1 and x2=1), we get;
1+1–1 = 1
Again, from the Perceptron rule, this gives y`=1, which is correct. So the OR gate can be modelled as x1+x2-1. Quite easy!
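A similar sketch checking the OR model x1+x2-1 against its truth table:

    def step(z):
        return 1 if z >= 0 else 0

    def or_gate(x1, x2):
        return step(x1 + x2 - 1)

    print([or_gate(x1, x2) for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 1, 1, 1]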
NOT Gate
The NOT gate has a single input and inverts it: the output is 1 for x1=0 and 0 for x1=1. Starting from w1x1 + b with w1=1 and b=-1;
Row 1
Passing the first row of the NOT logic table (x1=0), we get;
0-1 = -1
From the Perceptron rule, if Wx+b < 0, then y`=0. This row is incorrect, as the output is 1 for the NOT gate. So we want values that will make input x1=0 give y` a value of 1. If we change b to 1, we have;
0+1 = 1
Row 2
Passing the second row of the NOT logic table (x1=1), we get;
1+1 = 2
From the Perceptron rule, if Wx+b >= 0, then y`=1. This row is incorrect, as the output is 0 for the NOT gate.
So we want values that will make input x1=1 give y` a value of 0. If we change w1 to -1, we have;
-1+1 = 0
From the Perceptron rule, this still wouldn't work, since 0 still gives y`=1, but it is closer than the previous value. Changing b to 0.5, we have;
-1+0.5 = -0.5
This gives y`=0, and row 1 (0+0.5 = 0.5) still gives y`=1. So the NOT gate can be modelled as -x1+0.5.
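And a sketch checking the NOT model -x1+0.5:

    def step(z):
        return 1 if z >= 0 else 0

    def not_gate(x1):
        return step(-x1 + 0.5)

    print(not_gate(0), not_gate(1))  # 1 0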
NOR Gate
From its truth table, the NOR gate is 1 only if both inputs are 0. Starting again from x1(1)+x2(1)-1;
Row 1
Passing the first row of the NOR logic table (x1=0, x2=0), we
get;
0+0-1 = -1
From the Perceptron rule, if Wx+b<0, then y`=0. This row is
incorrect, as the output is 1 for the NOR gate.
So we want values that will make input x1=0 and x2 = 0 to give
y` a value of 1. If we change b to 1, we have;
0+0+1 = 1
Row 2
Passing the second row of the NOR logic table (x1=0, x2=1), we get;
0+1+1 = 2
From the Perceptron rule, if Wx+b >= 0, then y`=1. This row is incorrect, as the output is 0 for the NOR gate.
So we want values that will make input x1=0 and x2=1 give y` a value of 0. If we change w1 to -1 and w2 to -1, we have;
0-1+1 = 0
This is still >= 0, so it still gives y`=1. Changing b to 0.5, we have;
0-1+0.5 = -0.5
This gives y`=0, row 1 (0+0+0.5 = 0.5) still gives y`=1, and row 3 is correct by symmetry.
Row 4
Passing (x1=1 and x2=1), we get;
-1-1+0.5 = -1.5
From the Perceptron rule, this gives y`=0, which is correct. So the NOR gate can be modelled as -x1-x2+0.5.
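A sketch checking the NOR model -x1-x2+0.5:

    def step(z):
        return 1 if z >= 0 else 0

    def nor_gate(x1, x2):
        return step(-x1 - x2 + 0.5)

    print([nor_gate(x1, x2) for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [1, 0, 0, 0]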
NAND Gate
From its truth table, the NAND gate is 0 only if both inputs are 1. Starting again from x1(1)+x2(1)-1;
Row 1
Passing the first row of the NAND logic table (x1=0, x2=0), we
get;
0+0-1 = -1
From the Perceptron rule, if Wx+b<0, then y`=0. This row is
incorrect, as the output is 1 for the NAND gate.
So we want values that will make input x1=0 and x2 = 0 to give
y` a value of 1. If we change b to 1, we have;
0+0+1 = 1
Row 2
Passing the second row of the NAND logic table (x1=0, x2=1), we get;
0+1+1 = 2
From the Perceptron rule, if Wx+b >= 0, then y`=1. This row is correct, and so is row 3 by symmetry.
Row 4
Passing (x1=1 and x2=1), we get;
1+1+1 = 3
From the Perceptron rule, if Wx+b >= 0, then y`=1. This row is incorrect, as the output is 0 for the NAND gate.
So we want values that will make input x1=1 and x2=1 give y` a value of 0. If we change w1 to -1 and w2 to -1, we have;
-1-1+1 = -1
This gives y`=0, and rows 1 to 3 (0+0+1 = 1, 0-1+1 = 0, -1+0+1 = 0) still give y`=1. It works: the NAND gate can be modelled as -x1-x2+1.
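A sketch checking the NAND model -x1-x2+1:

    def step(z):
        return 1 if z >= 0 else 0

    def nand_gate(x1, x2):
        return step(-x1 - x2 + 1)

    print([nand_gate(x1, x2) for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [1, 1, 1, 0]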
XNOR Gate
Now that we are done with the necessary basic logic gates, we can
combine them to give an XNOR gate.
From the expression XNOR = x1x2 + x1`x2`, we can say that the XNOR gate consists of an AND gate (x1x2), a NOR gate (x1`x2`), and an OR gate that combines their outputs.
This means we will have to combine 3 perceptrons, where the outputs of the AND and NOR perceptrons become the inputs of the OR perceptron (see the sketch after this list):
AND (x1+x2–1.5)
NOR (-x1-x2+0.5)
OR (x1+x2–1)
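A minimal sketch of this two-layer wiring, assuming the step rule and the three models listed above (the function names are illustrative):

    def step(z):
        return 1 if z >= 0 else 0

    def xnor_gate(x1, x2):
        and_out = step(x1 + x2 - 1.5)         # AND perceptron
        nor_out = step(-x1 - x2 + 0.5)        # NOR perceptron
        return step(and_out + nor_out - 1)    # OR perceptron on the two outputs

    print([xnor_gate(x1, x2) for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [1, 0, 0, 1]

Note that the OR perceptron takes the outputs of the AND and NOR perceptrons as its two inputs, which is what makes this a two-layer network.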
XOR Gate
From the simplified expression XOR = (x1 + x2)(x1x2)`, we can say that the XOR gate consists of an OR gate, a NAND gate, and an AND gate that combines their outputs. So, again, we combine 3 perceptrons (see the sketch after this list):
OR (x1+x2–1)
NAND (-x1-x2+1)
AND (x1+x2–1.5)
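And the corresponding sketch for XOR, again assuming the step rule and the models listed above:

    def step(z):
        return 1 if z >= 0 else 0

    def xor_gate(x1, x2):
        or_out = step(x1 + x2 - 1)              # OR perceptron
        nand_out = step(-x1 - x2 + 1)           # NAND perceptron
        return step(or_out + nand_out - 1.5)    # AND perceptron on the two outputs

    print([xor_gate(x1, x2) for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 1, 1, 0]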
CONCLUSION
In conclusion, this is just one way of arriving at these values; there are many other weights and biases you could use to achieve logic gates with perceptrons. For example;
AND (20x1+20x2–30)
OR (20x1+20x2–10)
NOT (-20x1+10)
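As a quick check, these alternative values also satisfy the step rule used throughout this post:

    def step(z):
        return 1 if z >= 0 else 0

    rows = [(0, 0), (0, 1), (1, 0), (1, 1)]
    print([step(20 * x1 + 20 * x2 - 30) for x1, x2 in rows])   # AND: [0, 0, 0, 1]
    print([step(20 * x1 + 20 * x2 - 10) for x1, x2 in rows])   # OR:  [0, 1, 1, 1]
    print([step(-20 * x1 + 10) for x1 in (0, 1)])              # NOT: [1, 0]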