With v46, ojAlgo got support for building artificial neural networks. Here’s an example of what you can do with it.
The MNIST database is a large database of handwritten digit images that is commonly used for training and testing in the field of machine learning / image categorisation. Information about this dataset, and the various results achieved with it, is widely published.
A correctly modelled and trained neural network should be able to reach a 5% error rate on this dataset. Most, if not all, published results are better than that. The largest, most advanced models have managed 0.35%. That’s almost unbelievably good. ojAlgo currently doesn’t have all the features required to build that kind of model. The model in the program listed below reaches an error rate of about 2.2%. Here are some sample digits/images from the dataset.
The program below (with its dependency on ojAlgo) can do the following:
- Read/parse the files containing the image data and labels (a parsing sketch follows this list).
- Generate the actual images so that you can inspect them. The example images above were generated with that code.
- Print images to the console (to sanity-check results).
- Model and train feedforward neural networks (a second sketch follows this list):
  - Any number of layers
  - Any number of input/output nodes per layer
  - A choice between five different activators and two different error/loss functions
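The image and label files are in the (big-endian) IDX format: a magic number, a few int32 counts/dimensions, and then the raw unsigned bytes. Parsing that takes only a few lines of plain Java. The sketch below is not the actual example code, just an illustration of the format; the class name, and the intensity thresholds used for the console printing, are made up.

import java.io.BufferedInputStream;
import java.io.DataInputStream;
import java.io.FileInputStream;
import java.io.IOException;

// Minimal reader for the MNIST IDX files (illustrative, not the actual example code).
public class MnistReader {

    // An IDX3 image file: magic 2051, image count, rows, cols, then unsigned pixel bytes.
    public static int[][][] readImages(String path) throws IOException {
        try (DataInputStream in = new DataInputStream(new BufferedInputStream(new FileInputStream(path)))) {
            if (in.readInt() != 2051) {
                throw new IOException("Not an IDX image file: " + path);
            }
            int count = in.readInt();
            int rows = in.readInt();
            int cols = in.readInt();
            int[][][] images = new int[count][rows][cols];
            for (int i = 0; i < count; i++) {
                for (int r = 0; r < rows; r++) {
                    for (int c = 0; c < cols; c++) {
                        images[i][r][c] = in.readUnsignedByte(); // 0 = background, 255 = full ink
                    }
                }
            }
            return images;
        }
    }

    // An IDX1 label file: magic 2049, label count, then one unsigned byte (0-9) per image.
    public static int[] readLabels(String path) throws IOException {
        try (DataInputStream in = new DataInputStream(new BufferedInputStream(new FileInputStream(path)))) {
            if (in.readInt() != 2049) {
                throw new IOException("Not an IDX label file: " + path);
            }
            int count = in.readInt();
            int[] labels = new int[count];
            for (int i = 0; i < count; i++) {
                labels[i] = in.readUnsignedByte();
            }
            return labels;
        }
    }

    // Prints an image to the console roughly like the output further down:
    // ' ', '+' or 'X' depending on ink level (the thresholds here are assumptions).
    public static void print(int[][] image) {
        for (int[] row : image) {
            StringBuilder line = new StringBuilder();
            for (int pixel : row) {
                line.append(pixel < 64 ? ' ' : pixel < 192 ? '+' : 'X');
            }
            System.out.println(line);
        }
    }
}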
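To give a feel for the modelling part, here is a rough sketch of how such a network can be set up, trained and invoked. The method and enum names (builder, activator, error, rate, train, invoke, Activator.SIGMOID, Error.CROSS_ENTROPY, PrimitiveDenseStore) are approximations of the v46 API written from memory; the downloadable example code and the javadoc are the authoritative references.

import org.ojalgo.ann.ArtificialNeuralNetwork;
import org.ojalgo.ann.NetworkBuilder;
import org.ojalgo.matrix.store.PrimitiveDenseStore;

public class BuildAndTrainSketch {

    public static void main(String[] args) {

        // 28x28 = 784 input nodes, one hidden layer with 200 nodes,
        // and 10 output nodes - one per digit.
        NetworkBuilder builder = ArtificialNeuralNetwork.builder(28 * 28, 200, 10);

        builder.activator(0, ArtificialNeuralNetwork.Activator.SIGMOID); // hidden layer
        builder.activator(1, ArtificialNeuralNetwork.Activator.SOFTMAX); // output layer
        builder.error(ArtificialNeuralNetwork.Error.CROSS_ENTROPY);
        builder.rate(0.05); // learning rate

        // One training pair: scaled pixel values in, a one-hot label vector out.
        PrimitiveDenseStore input = PrimitiveDenseStore.FACTORY.makeZero(28 * 28, 1);
        PrimitiveDenseStore expected = PrimitiveDenseStore.FACTORY.makeZero(10, 1);
        // ... fill input with pixel values scaled to [0,1], and set expected.set(label, 1.0) ...
        builder.train(input, expected);

        ArtificialNeuralNetwork network = builder.get();

        // Inference: 10 numbers out; the index of the largest is the predicted digit.
        System.out.println(network.invoke(input));
    }
}

A softmax output layer paired with the cross-entropy error function is the standard choice for a classification task like this.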
The main benefit of using ojAlgo is how easy it is to do this and get good results. Download the example code below (you also need ojAlgo v46.1.1 or later), run it, and start modifying the network structure, the learning rate and other parameters. (You also need to download the data files, and update the various paths in the programs.)
Console Output
TrainingANN
ojAlgo
2019-03-19
Image 0: 7 <=> 7
Image 1: 2 <=> 2
Image 2: 1 <=> 1
Image 3: 0 <=> 0
Image 4: 4 <=> 4
Image 5: 1 <=> 1
Image 6: 4 <=> 4
Image 7: 9 <=> 9
Image 8: 5 <=> 5
Image 9: 9 <=> 9
[each image line above is followed by an ASCII-art rendering of the digit, drawn with ' ', '+' and 'X']
=========================================================
Error rate: 0.0216
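The error rate is simply the fraction of the 10,000 test images where the index of the largest of the ten output values differs from the true label; 0.0216 corresponds to 216 misclassified images. Below is a minimal sketch of that evaluation loop, assuming the v46 package layout (org.ojalgo.access.Access1D) and the hypothetical invoke call from the sketch above.

import org.ojalgo.access.Access1D; // v46 package; later versions moved this type
import org.ojalgo.ann.ArtificialNeuralNetwork;

public class EvaluateSketch {

    // Fraction of test images whose predicted digit differs from the true label.
    static double errorRate(ArtificialNeuralNetwork network, Access1D<Double>[] inputs, int[] labels) {
        int wrong = 0;
        for (int i = 0; i < inputs.length; i++) {
            Access1D<Double> output = network.invoke(inputs[i]);
            int predicted = 0; // index of the largest of the 10 output values
            for (int k = 1; k < 10; k++) {
                if (output.doubleValue(k) > output.doubleValue(predicted)) {
                    predicted = k;
                }
            }
            if (predicted != labels[i]) {
                wrong++;
            }
        }
        return (double) wrong / inputs.length; // 216 misses out of 10,000 gives the 0.0216 above
    }
}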
3Blue1Brown Educational Videos
3Blue1Brown has a fantastic series of videos about neural networks, based on the same example as above. After watching these videos you’re very likely to want to start playing around with neural networks yourself. To get you started, just copy and paste the code above…