Enhancing neural-network performance via assortativity

Sebastiano De Franciscis, Samuel Johnson, Joaquín J. Torres

Research output: Contribution to journal › Article

Abstract

The performance of attractor neural networks has been shown to depend crucially on the heterogeneity of the underlying topology. We take this analysis a step further by examining the effect of degree-degree correlations (assortativity) on neural-network behavior. We make use of a method recently put forward for studying correlated networks and dynamics thereon, both analytically and computationally, which is independent of how the topology may have evolved. We show how the robustness to noise is greatly enhanced in assortative (positively correlated) neural networks, especially if it is the hub neurons that store the information.
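As a point of reference (not the authors' method, which relies on the analytic and computational framework mentioned in the abstract), assortativity is commonly quantified as the Pearson correlation coefficient between the degrees at either end of an edge. The sketch below assumes Python with networkx and numpy and an illustrative Barabási-Albert test graph, none of which come from the paper; it shows one standard way to compute the coefficient, with positive values indicating assortative (hub-to-hub) mixing and negative values disassortative mixing.

    # Minimal sketch, not taken from the paper: the degree assortativity
    # coefficient as the Pearson correlation of the degrees at the two ends
    # of each edge. The Barabasi-Albert test graph is an illustrative choice.
    import networkx as nx
    import numpy as np

    def degree_assortativity(G):
        """Pearson correlation between endpoint degrees over all edges."""
        x, y = [], []
        for u, v in G.edges():
            # Each undirected edge contributes both orderings, so the
            # coefficient is symmetric in the two endpoints.
            x += [G.degree(u), G.degree(v)]
            y += [G.degree(v), G.degree(u)]
        return np.corrcoef(x, y)[0, 1]

    G = nx.barabasi_albert_graph(n=2000, m=3, seed=0)
    print("manual:  ", degree_assortativity(G))                 # direct computation
    print("networkx:", nx.degree_assortativity_coefficient(G))  # library routine, for cross-checking

Subtracting a constant from the degrees does not change a Pearson correlation, so this direct computation agrees with formulations based on excess degree; the abstract's claim concerns networks in which this coefficient is pushed to positive values, with patterns stored preferentially on the hubs.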

Original language: English
Article number: 036114
Journal: Physical Review E - Statistical, Nonlinear, and Soft Matter Physics
Volume: 83
Issue number: 3
DOIs:
Publication status: Published - Mar 25, 2011

ASJC Scopus subject areas

  • Condensed Matter Physics
  • Statistical and Nonlinear Physics
  • Statistics and Probability
