This is part 3 of an ongoing multi-part series focused on learning about and building Artificial Neural Networks using the FANN library in conjunction with PHP on c9.io.

Earlier Posts in this Series:

Part 1

Part 2

We covered a lot of ground in my last post, and the fact that you are still reading is just awesome! 🙂

In this post we will expand on what we have already learned by building ANNs that perform the following operations:

AND, OR, NOT, NAND, NOR, XNOR

Each of these Wikipedia articles has a ‘truth table’ near the top right of the page.

Previously we trained an ANN on the XOR operation. Now we are going to use the truth tables for these other operations to create training sets for each.

Before we do that, however, let’s look at the XOR truth table one more time:

XOR Truth Table

INPUT     | OUTPUT
A    B    | A XOR B
0    0    | 0
0    1    | 1
1    0    | 1
1    1    | 0

The two differences between the truth table on Wikipedia and the training set we used to train XOR are:

  1. The table on Wikipedia reflects true/false, high/low, or on/off states using the traditional notation of 1 and 0, whereas our training data uses 1 and -1.
  2. The output is displayed in the last column of the truth table, whereas our training data has the inputs on the first row of a set and the output on the second row.

Example:

1 1
-1

So if we copy the truth tables for AND, OR, NOT, NAND, NOR, and XNOR from Wikipedia, change the 0’s to -1’s, and arrange the inputs and outputs as described, we get the training sets below.
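If you would rather not do that conversion by hand, here is a minimal sketch of how it could be automated. The truth_table_to_fann() helper is purely illustrative (it is not part of the php-fann examples); the one thing it relies on that FANN actually requires is the header on the first line of every training file: the number of training pairs, the number of inputs, and the number of outputs (e.g. 4 2 1).

<?php
// Illustrative helper (not part of the php-fann examples): turn a Wikipedia-style
// 0/1 truth table into the FANN training-file format used throughout this post.
function truth_table_to_fann($rows, $num_inputs, $num_outputs) {
  // Header line: number of training pairs, number of inputs, number of outputs.
  $lines = array(count($rows) . ' ' . $num_inputs . ' ' . $num_outputs);
  foreach ($rows as $row) {
    // Swap every 0 for a -1, then put inputs on one line and outputs on the next.
    $row = array_map(function ($v) { return $v == 0 ? -1 : $v; }, $row);
    $lines[] = implode(' ', array_slice($row, 0, $num_inputs));
    $lines[] = implode(' ', array_slice($row, $num_inputs, $num_outputs));
  }
  return implode(PHP_EOL, $lines) . PHP_EOL;
}

// The AND truth table straight from Wikipedia: inputs A and B, then the output.
$and_table = array(
  array(0, 0, 0),
  array(0, 1, 0),
  array(1, 0, 0),
  array(1, 1, 1),
);

// Writes and.data containing exactly the AND training set shown just below.
file_put_contents(dirname(__FILE__) . '/and.data', truth_table_to_fann($and_table, 2, 1));
?>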

AND

AND is the operation for ‘Logical conjunction‘ (as in A in conjunction with B) and requires every input to be true in order for the output to be true. Below is a training set that takes two inputs and will only output a true (1) if both inputs are 1, meaning input A AND input B were 1.

Think of AND like the sign that reads: “No Shirt, No Shoes, No Service”. You could rewrite that to say: “You will only receive service if you are wearing both a shirt AND shoes”. If you only have on a shirt but no shoes, or vice versa, you will not receive service.

AND is picky about what it wants and won’t accept anything less! 😛

4 2 1
-1 -1
-1
-1 1
-1
1 -1
-1
1 1
1

OR

Where AND is the operation of ‘Logical conjunction‘, OR is said to be the operation of ‘Logical disjunction‘. The training set below illustrates this by taking two inputs and outputting a true (1) if either of the inputs is 1, meaning input A OR input B was 1.

To use the same shirt-and-shoes sign illustration as above, in the case of OR the sign would read: “You will only receive service if you are wearing either a shirt OR shoes, OR BOTH”.

OR is somewhat picky, but not as picky as AND.

4 2 1
-1 -1
-1
-1 1
1
1 -1
1
1 1
1

NOT

NOT is said to be an ‘Inverter‘ operation, which the training set below illustrates by ‘inverting’ each input: a negative input becomes positive and a positive input becomes negative. Additionally, NOT implements ‘Negation‘.

NOT works by simply asking questions and comparing opposites.

Take the “No Shirt, No Shoes, No Service” example again, but now imagine it’s an AI that is determining whether a visitor should receive service. It might use NOT by asking several questions…

Let’s say the visitor is wearing a shirt but not wearing shoes. The AI could interpret the scenario using NOT like this:

Question                     Answer   Meaning                        Input
Visitor NOT wearing Shirt?   No       Visitor has a shirt on         1
Visitor NOT wearing Shoes?   Yes      Visitor is not wearing shoes   -1

Since the visitor IS wearing a shirt (1), the answer returned is false or -1 (no error). Since the visitor is NOT wearing shoes (-1) the answer returned is 1 (error).

So, in this way, the real questions the AI is asking are: Are there any errors? Is there any reason I should not sell this visitor a pack of gum? In this case, one of the answers was 1 (yes).

Then the AI could take the results from NOT and pass them to OR (‘Logical disjunction‘), which would allow it to determine whether any (or all) of the answers were 1. In the example above, because the ‘Visitor NOT wearing Shoes?’ question returned a 1, it would not sell the pack of gum.
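To make that NOT → OR chain concrete, here is a minimal sketch of the idea. It is not one of the files we will build below; it simply assumes the not_float.net and or_float.net networks created later in this post have already been trained and saved in the same folder, and the gum-selling scenario is only the illustration from above.

<?php
// A minimal sketch of the NOT -> OR chain described above. It assumes the
// not_float.net and or_float.net files created later in this post already
// exist in this folder; the gum-selling scenario is only an illustration.
$not = fann_create_from_file(dirname(__FILE__) . '/not_float.net');
$or  = fann_create_from_file(dirname(__FILE__) . '/or_float.net');

$wearing_shirt = 1;  // visitor has a shirt on
$wearing_shoes = -1; // visitor is not wearing shoes

// Ask "Visitor NOT wearing X?" for each item.
$shirt_answer = fann_run($not, array($wearing_shirt)); // about -1: no error
$shoes_answer = fann_run($not, array($wearing_shoes)); // about 1: error

// OR the two answers: was there any reason to refuse service?
$any_error = fann_run($or, array($shirt_answer[0], $shoes_answer[0]));

print($any_error[0] > 0 ? 'Do not sell the gum.' : 'Sell the gum.');

fann_destroy($not);
fann_destroy($or);
?>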

While NOT may be picky, it’s not satisfied with what it’s offered and always prefers the opposite.

2 1 1
-1
1
1
-1

NAND

NAND stands for Negative-AND (NOT-AND) and works the same way as AND except that, by using ‘Negation‘, it returns inverted outputs for true and false.

4 2 1
-1 -1
1
-1 1
1
1 -1
1
1 1
-1

NOR

NOR stands for Negative-OR (NOT-OR) and works the same way as OR except that, by using ‘Negation‘, it returns inverted outputs for true and false.

4 2 1
-1 -1
1
-1 1
-1
1 -1
-1
1 1
-1

XNOR

XNOR stands for Exclusive-NOR (Exclusive-NOT-OR) and works the same way as XOR except that, by using ‘Negation‘, it returns inverted outputs for true and false. XNOR is often used to test for logical equality, which basically means: A == B, TRUE == TRUE, FALSE == FALSE.

XNOR Example

A     B     Output      Meaning
-1    -1    1 (true)    FALSE == FALSE = TRUE
1     1     1 (true)    TRUE == TRUE = TRUE
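As an aside, plain PHP can express that equality idea directly; here is a tiny illustrative sketch (not part of the FANN examples):

<?php
// XNOR behaves like an equality test on two booleans (illustration only).
$a = false;
$b = false;
var_dump($a === $b);    // bool(true): FALSE == FALSE is TRUE, exactly like XNOR
var_dump(!($a xor $b)); // another way to write XNOR in plain PHP
?>

And here is the XNOR training set itself: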
4 2 1
-1 -1
1
-1 1
-1
1 -1
-1
1 1
1

OK, now that we’ve looked at the training sets and have an idea of how we might use each operation, we can navigate back to /php-fann/examples on c9.io:


Create the following files and place the matching training data from above in each one (each file name corresponds to the training set with the same name above):

and.data

or.data

not.data

nand.data

nor.data

xnor.data

The php-fann Examples Folder with Training Data Added


Next create a new file called train_all.php.

train_all.php:
<?php
// Creates, trains, and saves one network per logic operation.
// $data is the base name of the training file (e.g. 'and' for and.data).
function train($data, $num_input, $num_output, $num_layers, $num_neurons_hidden, $desired_error, $max_epochs, $epochs_between_reports){

  // Create a standard fully connected network: input, hidden, and output layers.
  $ann = fann_create_standard($num_layers, $num_input, $num_neurons_hidden, $num_output);
  if ($ann) {
    // Symmetric sigmoid activations keep values in the -1 to 1 range our data uses.
    fann_set_activation_function_hidden($ann, FANN_SIGMOID_SYMMETRIC);
    fann_set_activation_function_output($ann, FANN_SIGMOID_SYMMETRIC);

    // Train on the .data file until the desired error or the max epochs is reached.
    $filename = dirname(__FILE__) . '/' . $data . '.data';
    if (fann_train_on_file($ann, $filename, $max_epochs, $epochs_between_reports, $desired_error)) {
      print($data . ' trained.<br>' . PHP_EOL);
    }

    // Save the trained network so we can reuse it without retraining.
    if (fann_save($ann, dirname(__FILE__) . '/' . $data . '_float.net')) {
      print($data . '_float.net saved.<br>' . PHP_EOL);
    }

    fann_destroy($ann);
  }
}

// train(name, inputs, outputs, layers, hidden neurons, desired error, max epochs, epochs between reports)
train('and', 2, 1, 3, 3, 0.001, 500000, 1000);
train('nand', 2, 1, 3, 3, 0.001, 500000, 1000);
train('nor', 2, 1, 3, 3, 0.001, 500000, 1000);
train('not', 1, 1, 3, 3, 0.001, 500000, 1000);
train('or', 2, 1, 3, 3, 0.001, 500000, 1000);
train('xnor', 2, 1, 3, 3, 0.001, 500000, 1000);
train('xor', 2, 1, 3, 3, 0.001, 500000, 1000);
print("<a href='test_all.php'>Test All</a>");
?>

Next create a file called test_all.php.

test_all.php:
<?php
// Loads a saved network and prints the result of running it on $test_data.
// $name is the base name of the saved network (e.g. 'and' for and_float.net).
function test($name, $test_data) {
  $train_file = (dirname(__FILE__) . '/' . $name . '_float.net');
  if (!is_file($train_file)) {
    print('The file ' . $name . '_float.net has not been created! Please run train_all.php to generate it.<br>' . PHP_EOL);
  } else {
    // Load the previously trained and saved network.
    $ann = fann_create_from_file($train_file);
    if ($ann) {
      // Run the network on the test inputs.
      $calc_out = fann_run($ann, $test_data);

      $num_inputs = count($test_data);
      $num_outputs = count($calc_out);

      // Build a readable result line, e.g. "and test (1, 1) -> 0.99".
      $test_result = $name . ' test (';
      for($i = 0; $i < $num_inputs; $i++) {
        $test_result .= $test_data[$i];
        if ($i < $num_inputs - 1) {
          $test_result .= ', ';
        }
      }
      $test_result .= ') -> ';
      for($i = 0; $i < $num_outputs; $i++) {
        $test_result .= $calc_out[$i];
        if ($i < $num_outputs - 1) {
          $test_result .= ', ';
        }
      }
      print($test_result . '<br>' . PHP_EOL);

      fann_destroy($ann);
    } else {
      die("Invalid file format" . PHP_EOL);
    }
  }
}
test('and', array(1, 1));
test('nand', array(-1, -1));
test('nor', array(-1, -1));
test('not', array(-1));
test('or', array(1, -1));
test('xnor', array(-1, -1));
test('xor', array(-1, 1));
?>

The ‘examples’ folder should now look like this:


So let’s test everything. Click the green Run Project button in the menu at the top of the page (if the project is already running, it will say Stop instead).


You will get a windowed browser in the IDE; click the “Popout” button so you get a nice full-size tab to preview everything in:


Now just navigate to: /php-fann/examples by clicking on the folders.

Once there, first run train_all.php to train all of the ANNs. The output should look like this:

train_all.php results


If you collapse and then re-expand the examples folder by clicking the arrow next to the folder name, all of the newly created .net files will now show up in the file listing:

All the newly created .net files


These files contain all the ANN data for each operation we trained, and they are what let us reuse a network without having to train it from scratch each time.

Now all that’s left to do is to click the ‘Test All’ link at the bottom of the train_all.php results. You should see something like this:

test_all.php results


If you get something like the image above, everything is working! Review the PHP code for train_all.php as well as test_all.php and see if you can follow what is going on. If you have any trouble, please leave a comment below and I will try to help.
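If you want to dig a little deeper, here is an optional sketch (not one of the files above) that runs a single trained network, in this case and_float.net, against every row of its truth table. Note that the outputs will be close to, but rarely exactly, 1 or -1.

<?php
// Optional check (not part of the tutorial files): run the trained AND network
// against all four rows of its truth table.
$ann = fann_create_from_file(dirname(__FILE__) . '/and_float.net');
if ($ann) {
  $rows = array(array(-1, -1), array(-1, 1), array(1, -1), array(1, 1));
  foreach ($rows as $inputs) {
    $out = fann_run($ann, $inputs);
    print('and (' . $inputs[0] . ', ' . $inputs[1] . ') -> ' . $out[0] . '<br>' . PHP_EOL);
  }
  fann_destroy($ann);
} else {
  die('Could not load and_float.net' . PHP_EOL);
}
?>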

My next post will use FANN & PHP to create a pathfinding neural network, so be sure to follow me on GitHub & Twitter as well as connect with me on LinkedIn.

All my posts are made possible by the generous contributions and hard work of my sponsors!

As always, thanks for reading! If this post helps you or you found it enjoyable or just have some thoughts about all of this, please leave a comment below.

~Joy