Bill Gates made the following statement:

"If you invent a breakthrough in artificial intelligence, so machines can learn, that is worth 10 Microsofts."

(The New York Times, March 3, 2004)

Mr. Gates has a public email address:

[email protected]

But I doubt he reads it. Perhaps nobody does. An alternative is posting these comments to ResearchGate. After all, it has been reported that he owns shares of this site.

As mentioned in my previous post (https://www.researchgate.net/post/For_neural_network_training_what_is_better_than_backpropagation), there is a very good alternative to backpropagation (BP), and you can test it yourself. For this purpose, a Fast Perceptron Training (FPT) freeware package is available at http://www.matematica.ciens.ucv.ve/dcrespin/Pub/Crespin.zip

The algorithm was coded in C++, with a GUI developed in VB 6.0 for Windows XP.

For the preliminaries to the theory behind FPT, see my papers at http://www.matematica.ciens.ucv.ve/dcrespin/Pub/CrespinEng.html or at

https://www.researchgate.net/profile/Daniel_Crespin/publications/

Future papers may disclose the as-yet-unpublished algorithm and its fine details. Meanwhile, it works nicely, as can be checked by running FPT.

So, back to the software. Unzip the downloaded Crespin.zip and run Crespin2.exe. It installs itself and opens a GUI. Afterwards, everything is done by clicking buttons.

Start with "Open to Build". Many numerical data files are included as examples.

Open a large file, such as Sonar.vpo. If you want a full view of the file after opening it with FPT, use the "Open with Notepad" button; then, in Notepad's Format menu, uncheck Word Wrap. You can also prepare and open your own numerical data vector files and use FPT to train networks on them. With a data file opened, a "Build" button appears. Click "Build" to calculate the architecture and weights of a perfectly trained network. Keep exploring the "Open to Test", "Evaluate" and other buttons. Also, click the appropriate buttons to open the PDF tutorial, glossary and primer.

What is going on? Consider a numerical data file consisting of n-dimensional vectors with assigned 1/0 outputs (in the current FPT version, n can be up to 100 and the number of data vectors up to 300, so the data matrix is at most 300x100). After you choose the strip values (margin widths, which determine the generalization capabilities), FPT calculates the architecture and weights of an n-input, multilayer (actually 3-layer), discontinuous-threshold, feedforward, single-output perceptron network that perfectly recognizes the data.
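To make the network type concrete: the FPT construction itself is unpublished, but a "discontinuous-threshold, feedforward, single-output" network of the stated kind can be sketched as follows. This is only an illustrative hand-built example (a 2-input XOR classifier, not a network produced by FPT), and the function names are my own.

```python
import numpy as np

def heaviside(z):
    # Discontinuous threshold activation: 1 if z >= 0, else 0.
    return (z >= 0).astype(float)

def forward(x, layers):
    """Evaluate a feedforward threshold network.
    layers: list of (W, b) pairs; each layer outputs heaviside(W @ a + b)."""
    a = x
    for W, b in layers:
        a = heaviside(W @ a + b)
    return a

# Hand-built 2-input, 3-layer, single-output threshold network computing XOR.
# (Illustrative only; FPT's geometric construction is not disclosed.)
layers = [
    (np.array([[1.0, 1.0], [1.0, 1.0]]), np.array([-0.5, -1.5])),  # hidden: OR, AND
    (np.array([[1.0, -1.0]]), np.array([-0.5])),                   # OR and not AND
    (np.array([[1.0]]), np.array([-0.5])),                         # final threshold
]

for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, forward(np.array(x, float), layers)[0])
```

Because every unit is a hard threshold, the network partitions input space with hyperplanes rather than fitting a smooth function, which matches the geometric flavor described above.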

Beware: this is not just "memorization", nor a support vector machine gadget. Everything depends on simple n-dimensional geometric constructions. No kernels, no cluster analysis, no regression, no Vapnik regions, and so on.

The networks produced with FPT have generalization capabilities controlled by the values chosen for the strips. They can be contrasted with the networks obtained from any backpropagator. If you want to, proceed as follows.

First, save the FPT network as a .sgm file with a small distance factor (a sort of "learning parameter" incorporated into the network; all this takes just a click, since all calculations are made by the software).

Second, feed the .sgm file's initial network and the Sonar.vpo training vectors to your favorite backpropagator (again, FPT DOES NOT BACKPROPAGATE).

Third, run your backpropagator and check how fast and how well it converges. Convergence should be immediate.

Conversely, take any initial 60-input, multilayer, single-output perceptron network specified by a traditional procedure of your choice. Train it by means of BP using the data vectors of Sonar.vpo. This takes a very long time, and most often the trained network's recognition of the training data is not very reliable.
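The warm-start-versus-cold-start comparison above can be sketched generically. The .sgm format, the distance factor, and the Sonar.vpo data are specific to FPT, so the sketch below substitutes a tiny XOR dataset and a minimal NumPy backpropagator of my own; the "warm" initialization mimics converting a known-correct threshold network into a differentiable sigmoid one via a steepness constant k (my assumption, not necessarily FPT's distance factor). The point illustrated is only the general one: BP started from an already-correct network has almost nothing left to do, while BP from a random start must do all the work.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mse(y, t):
    return np.mean((y - t) ** 2)

def forward(X, params):
    # params: list of (W, b); returns all layer activations.
    acts = [X]
    for W, b in params:
        acts.append(sigmoid(acts[-1] @ W + b))
    return acts

def backprop_step(X, T, params, lr):
    # One full-batch gradient-descent step; mutates params in place.
    acts = forward(X, params)
    delta = (acts[-1] - T) * acts[-1] * (1 - acts[-1])
    for i in reversed(range(len(params))):
        W, b = params[i]
        gW = acts[i].T @ delta
        gb = delta.sum(axis=0)
        delta = (delta @ W.T) * acts[i] * (1 - acts[i])  # uses pre-update W
        params[i] = (W - lr * gW, b - lr * gb)
    return mse(forward(X, params)[-1], T)

# XOR data standing in for a .vpo training file (hypothetical substitute).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
T = np.array([[0], [1], [1], [0]], float)

# "Warm" start: a correct threshold network made differentiable by
# scaling with steepness k, so sigmoid(k*z) approximates heaviside(z).
k = 4.0
warm = [(k * np.array([[1.0, 1.0], [1.0, 1.0]]), k * np.array([-0.5, -1.5])),
        (k * np.array([[1.0], [-1.0]]), k * np.array([-0.5]))]

# "Cold" start: conventional small random initialization.
rng = np.random.default_rng(0)
cold = [(rng.normal(0, 0.5, (2, 2)), np.zeros(2)),
        (rng.normal(0, 0.5, (2, 1)), np.zeros(1))]

for name, params in [("warm", warm), ("cold", cold)]:
    for _ in range(200):
        loss = backprop_step(X, T, params, lr=1.0)
    print(name, "final loss:", round(float(loss), 4))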

Numerically testing and comparing the results is a practical test of the FPT learning paradigm.

Let me know your conclusions and your valuable criticism.

And, if possible, give your opinion: how much better than FPT would a learning software have to perform to be a breakthrough at the level of Mr. Gates's quotation?
