Evolving artificial neural networks with feedback
Created by W.Langdon from
gp-bibliography.bib Revision:1.8051
@Article{HERZOG:2020:NN,
  author =       "Sebastian Herzog and Christian Tetzlaff and
                  Florentin Woergoetter",
  title =        "Evolving artificial neural networks with feedback",
  journal =      "Neural Networks",
  volume =       "123",
  pages =        "153--162",
  year =         "2020",
  ISSN =         "0893-6080",
  DOI =          "10.1016/j.neunet.2019.12.004",
  URL =          "http://www.sciencedirect.com/science/article/pii/S089360801930396X",
  keywords =     "genetic algorithms, genetic programming, Deep
                  learning, Feedback, Transfer entropy, Convolutional
                  neural network",
  abstract =     "Neural networks in the brain are dominated by
                  sometimes more than 60\% feedback connections, which
                  most often have small synaptic weights. In contrast,
                  little is known about how to introduce feedback into
                  artificial neural networks. Here we use transfer
                  entropy in the feed-forward paths of deep networks to
                  identify feedback candidates between the
                  convolutional layers and determine their final
                  synaptic weights using genetic programming. This adds
                  about 70\% more connections to these layers, all with
                  very small weights. Nonetheless, performance improves
                  substantially on different standard benchmark tasks
                  and in different networks. To verify that this effect
                  is generic, we use 36000 configurations of small
                  (2--10 hidden layer) conventional neural networks in
                  a non-linear classification task and select the best
                  performing feed-forward nets. We then show that
                  feedback reduces total entropy in these networks,
                  always leading to a performance increase. This method
                  may thus supplement standard techniques (e.g. error
                  backprop), adding a new quality to network learning.",
}
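The abstract describes using transfer entropy on feed-forward activations to identify feedback candidates. As an illustration only (not the authors' implementation), the sketch below estimates transfer entropy between two 1-D activation traces with a simple binned plug-in estimator; the function name, bin count, and history length of 1 are all assumptions made for this example.

```python
import numpy as np

def transfer_entropy(source, target, bins=8, lag=1):
    """Plug-in estimate of TE(source -> target) from two 1-D activation
    traces, using equal-width binning and a history length of 1:

        TE = sum p(t1, t0, s0) * log2[ p(t1 | t0, s0) / p(t1 | t0) ]

    where t1 is the target's future, and t0, s0 are the past samples.
    """
    # Discretise both traces into `bins` equal-width bins (interior edges).
    s = np.digitize(source, np.linspace(source.min(), source.max(), bins + 1)[1:-1])
    t = np.digitize(target, np.linspace(target.min(), target.max(), bins + 1)[1:-1])

    t1, t0, s0 = t[lag:], t[:-lag], s[:-lag]

    # Joint histogram over (target_future, target_past, source_past).
    joint, _ = np.histogramdd(np.stack([t1, t0, s0], axis=1),
                              bins=(bins, bins, bins))
    p_t1t0s0 = joint / joint.sum()

    p_t0s0 = p_t1t0s0.sum(axis=0)        # p(t0, s0)
    p_t1t0 = p_t1t0s0.sum(axis=2)        # p(t1, t0)
    p_t0 = p_t1t0s0.sum(axis=(0, 2))     # p(t0)

    te = 0.0
    for i in range(bins):
        for j in range(bins):
            for k in range(bins):
                p = p_t1t0s0[i, j, k]
                if p > 0.0:
                    # p(t1|t0,s0) / p(t1|t0) rewritten with joint terms.
                    te += p * np.log2((p * p_t0[j]) /
                                      (p_t1t0[i, j] * p_t0s0[j, k]))
    return te
```

In a setup like the paper's, such an estimate could be computed for each candidate pair of layer activations, with high-TE pairs retained as feedback candidates whose weights are then tuned (the paper uses genetic programming for that step).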