dopetalk

Mind and Body => Neuroscience => Topic started by: Chip on May 04, 2018, 12:38:25 PM

Title: Subdominant Dense Clusters Allow for Simple Learning and High Computational Performance in Neural Networks with Discrete Synapses
Post by: Chip on May 04, 2018, 12:38:25 PM
source: https://journals.aps.org/prl/pdf/10.1103/PhysRevLett.115.128101

Subdominant Dense Clusters Allow for Simple Learning and High Computational Performance in Neural Networks with Discrete Synapses [March 2015]

We show that discrete synaptic weights can be efficiently used for learning in large-scale neural systems,
and lead to unanticipated computational performance. We focus on the representative case of learning random
patterns with binary synapses in single layer networks. The standard statistical analysis shows that this
problem is exponentially dominated by isolated solutions that are extremely hard to find algorithmically.
Here, we introduce a novel method that allows us to find analytical evidence for the existence of subdominant
and extremely dense regions of solutions. Numerical experiments confirm these findings. We also show that
the dense regions are surprisingly accessible by simple learning protocols, and that these synaptic
configurations are robust to perturbations and generalize better than typical solutions. These outcomes
extend to synapses with multiple states and to deeper neural architectures. The large deviation measure also
suggests how to design novel algorithmic schemes for optimization based on local entropy maximization.
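The setting the abstract describes, a single-layer network classifying random patterns with binary (+/-1) synapses, can be illustrated with a toy sketch. The search rule below is an assumption for illustration (a plain zero-temperature local search that accepts equal-error moves), not the paper's actual algorithm or its local-entropy scheme; the sizes `N` and `P` are arbitrary choices well below the binary-perceptron capacity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a single-layer perceptron with binary (+/-1) synapses must
# classify P random patterns.  N is odd so the pre-activation X @ w is a
# sum of an odd number of +/-1 terms and can never be exactly zero.
N, P = 101, 40
X = rng.choice([-1, 1], size=(P, N))   # random binary input patterns
y = rng.choice([-1, 1], size=P)        # random binary target labels
w = rng.choice([-1, 1], size=N)        # discrete synaptic weights

def n_errors(w):
    """Number of misclassified patterns under the current weights."""
    return int(np.sum(np.sign(X @ w) != y))

# Illustrative stochastic local search (NOT the paper's method): flip one
# random weight and keep the flip only if the error count does not
# increase.  Accepting equal-error moves lets the search drift across
# flat plateaus in the discrete energy landscape.
initial = n_errors(w)
for _ in range(50_000):
    e = n_errors(w)
    if e == 0:
        break
    i = rng.integers(N)
    w[i] *= -1
    if n_errors(w) > e:
        w[i] *= -1   # reject the flip, restoring the previous weight

print(f"errors: {initial} -> {n_errors(w)}")
```

Since moves that increase the error count are always rejected, the final error count can never exceed the initial one; whether the search reaches zero errors depends on the load P/N and on how long it runs, which is part of what the paper's dense-cluster picture explains.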

the article continues via the source link ...