A heterosynaptic learning rule for neural networks

Frank Emmert-Streib

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

In this article we introduce a novel stochastic Hebb-like learning rule for neural networks that is neurobiologically motivated. This learning rule combines features of unsupervised (Hebbian) and supervised (reinforcement) learning and is stochastic with respect to the selection of the time points at which a synapse is modified. Moreover, the learning rule does not only affect the synapse between the pre- and postsynaptic neuron, which is called homosynaptic plasticity, but also affects further remote synapses of the pre- and postsynaptic neurons. This more complex form of synaptic plasticity has recently come under investigation in neurobiology and is called heterosynaptic plasticity. We demonstrate that this learning rule is useful in training neural networks by learning parity functions, including the exclusive-or (XOR) mapping, in a multilayer feed-forward network. We find that our stochastic learning rule works well, even in the presence of noise. Importantly, the mean learning time increases polynomially with the number of patterns to be learned, indicating efficient learning.
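The abstract describes a rule that combines Hebbian and reinforcement learning, with stochastically selected update time points. The sketch below is only a rough, hypothetical illustration of that general idea, not the paper's actual heterosynaptic rule: a 2-2-1 feed-forward network of threshold units receives a reward-modulated Hebbian update with probability `p_update` per step (the architecture, learning rate `eta`, `p_update`, and the global weight decay standing in for remote heterosynaptic effects are all assumptions for this sketch).

```python
import numpy as np

rng = np.random.default_rng(0)

def step(x):
    # binary threshold activation
    return (x > 0).astype(float)

# assumed 2-2-1 architecture with fixed thresholds of 0.5
W1 = rng.uniform(-1.0, 1.0, (2, 2))  # input -> hidden
W2 = rng.uniform(-1.0, 1.0, (1, 2))  # hidden -> output

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([0, 1, 1, 0], dtype=float)  # XOR targets

eta = 0.1       # assumed learning rate
p_update = 0.5  # assumed probability of modifying synapses at a given step

def forward(x):
    h = step(W1 @ x - 0.5)
    o = step(W2 @ h - 0.5)
    return h, o

for t in range(2000):
    i = rng.integers(4)
    x, y = X[i], Y[i]
    h, o = forward(x)
    r = 1.0 if o[0] == y else -1.0        # reinforcement signal
    if rng.random() < p_update:           # stochastic time-point selection
        # reward-modulated Hebbian (homosynaptic) updates
        W2 += eta * r * np.outer(o, h)
        W1 += eta * r * np.outer(h, x)
        # crude stand-in for heterosynaptic effects on remote synapses:
        # a small decay applied to all weights of the affected neurons
        W1 *= 0.999
        W2 *= 0.999
```

Because both the update times and the pattern order are random, a fixed seed is used here; whether such a simplified rule converges on XOR depends on the initialization and parameters, which is why the paper's rule and its reported polynomial learning times should be consulted for the real construction.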
Original language: English
Pages (from-to): 1501-1520
Number of pages: 20
Journal: INTERNATIONAL JOURNAL OF MODERN PHYSICS C
Volume: 17
Issue number: 10
Publication status: Published - Oct 2006

ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Computer Science Applications
  • Mathematical Physics
  • Physics and Astronomy (all)
  • Statistical and Nonlinear Physics

