On this page, I provide a copy of my MSc thesis Biologically Inspired Learning - Controlling the Neuronal Activity (PDF, 1 MB). I wrote this thesis while completing my MSc in Computational Physics at the Institute for Theoretical Physics of the University of Amsterdam, under the supervision of Dr. W.A. van Leeuwen.
The results are also available in the article "Biologically inspired learning in a layered neural net" by Jasper Bedaux and Willem van Leeuwen, published in Physica A: Statistical Mechanics and its Applications, Volume 335, Issues 1-2, April 2004, pp. 279-299, doi:10.1016/j.physa.2003.12.008 (http://dx.doi.org/10.1016/j.physa.2003.12.008). If you do not have access, a preprint is freely available at arXiv.org: http://www.arxiv.org/abs/cond-mat/0305650.
A layered neural net with adaptable synaptic weights and fixed threshold potentials is studied, in the presence of a global feedback signal that can take only two values, depending on whether the network's output in reaction to its input is right or wrong.
On the basis of four biologically motivated assumptions, it is found that only two forms of learning are possible: Hebbian and Anti-Hebbian learning. It is shown that Hebbian learning memorizes input-output relations, while Anti-Hebbian learning does the opposite: it changes the input-output relations of the network. Hebbian learning should take place when the output is right, while Anti-Hebbian learning should take place when the output is wrong.
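The feedback-modulated rule described above can be sketched in a few lines of C++. This is a minimal illustration, not the thesis code: the binary (0/1) neuron states, the two learning rates, and the function name are all assumptions made for the example.

```cpp
// Hedged sketch of a feedback-modulated synaptic update (not the thesis
// implementation). `pre` and `post` are binary neuron activities (0 or 1);
// `right` is the global two-valued feedback signal. The learning rates
// etaH and etaA are illustrative values.
double weightUpdate(int pre, int post, bool right,
                    double etaH = 0.1, double etaA = 0.05) {
    if (right)
        return  etaH * pre * post;   // Hebbian: reinforce the current mapping
    else
        return -etaA * pre * post;   // Anti-Hebbian: unlearn the wrong mapping
}
```

Note that only synapses whose pre- and postsynaptic neurons are both active change; the sign of the change is determined entirely by the global feedback signal.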
A particular choice for the Anti-Hebbian part of the learning rule is made that guarantees an adequate average neuronal activity. A network with non-zero threshold potentials is shown to perform its task of realizing the desired input-output relations best if it is sufficiently diluted, i.e. if only a relatively low fraction of all possible synaptic connections is realized.
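Dilution can be illustrated by realizing each possible synapse independently with a fixed probability. The sketch below is a hedged example under that assumption; the function name, parameters, and use of a Bernoulli draw per synapse are illustrative, not taken from the thesis code.

```cpp
#include <cstddef>
#include <random>
#include <vector>

// Hedged illustration of dilution: each of the nPre * nPost possible
// synapses is realized with probability `fraction`, yielding a boolean
// connectivity mask for a layer of nPost neurons fed by nPre inputs.
std::vector<std::vector<bool>> diluteMask(std::size_t nPre, std::size_t nPost,
                                          double fraction, unsigned seed = 42) {
    std::mt19937 rng(seed);
    std::bernoulli_distribution connect(fraction);
    std::vector<std::vector<bool>> mask(nPost, std::vector<bool>(nPre));
    for (auto& row : mask)
        for (std::size_t i = 0; i < row.size(); ++i)
            row[i] = connect(rng);
    return mask;
}
```

With `fraction` well below 1, only a low share of the possible connections exists, which is the diluted regime the abstract refers to.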
The source code listed in Appendix C is available below. To understand how the software works, please have a look at the MSc thesis above, especially the documentation in the appendices. The source code is free to use, provided the original copyright notice is preserved. The code is written in ISO standard C++ and requires no additional libraries.
Download nnet.zip, containing:
- the main program file
- the code needed to read the inifile
- the header for input-output relations
- the library for input-output relations
- the neural network header file
- the neural network library
- the random number generator header
- the random number generator library
- the inifile containing the program parameters
The syntax highlighting is produced with C++2HTML.