Sebastiano de Franciscis
Poster

The effect of degree-degree correlations on attractor neural networks

The performance of attractor neural networks has been found to depend significantly on the degree distribution of the underlying topology [1]. Arguably the next most important topological property of a network is the nature of its degree-degree correlations, i.e., whether it is assortative (correlated) or disassortative (anticorrelated). Recently, our group developed a theory for correlated networks which can be used to study dynamical network properties analytically [2]. Here, we study the effect of these correlations on a paradigmatic application of complex networks: neural systems exhibiting associative memory. Mean-field analysis, supported by Monte Carlo simulations, shows that correlations have important implications for the robustness of memory retrieval in heterogeneous (e.g., scale-free) networks. In particular, at low temperatures disassortative networks perform better than neutral or assortative ones. However, at temperatures significantly higher than the standard critical temperature for neural networks, assortative configurations are able to retain some information thanks to a core of well-connected hubs.

[1] J.J. Torres, M.A. Muñoz, J. Marro, and P.L. Garrido, Influence of topology on a neural network performance, Neurocomputing 58-60, 229-234 (2004).

[2] S. Johnson, J.J. Torres, J. Marro, and M.A. Muñoz, Entropic origin of disassortativity in complex networks, Phys. Rev. Lett., in press (2010). arXiv:1002.3286.
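The assortativity of a network is commonly quantified by Newman's degree-degree correlation coefficient: the Pearson correlation between the degrees at the two ends of an edge, with each undirected edge counted in both orientations. The following is a minimal, illustrative sketch of that measure (not the authors' code; the toy star graph is a hypothetical example chosen because it is maximally disassortative):

```python
# Pearson degree-degree assortativity coefficient.
# Each undirected edge (u, v) contributes both (k_u, k_v) and (k_v, k_u)
# to the list of degree pairs, as in Newman's definition.
from collections import defaultdict
import math

def assortativity(edges):
    # Degree of each node.
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    # Degree pairs over both orientations of every edge.
    xs, ys = [], []
    for u, v in edges:
        xs += [deg[u], deg[v]]
        ys += [deg[v], deg[u]]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    return cov / (sx * sy)

# A star graph links one hub to low-degree leaves only,
# so it is maximally disassortative: r = -1.
star = [(0, 1), (0, 2), (0, 3)]
print(assortativity(star))  # → -1.0
```

Positive values of the coefficient indicate an assortative network (hubs tend to link to hubs), negative values a disassortative one.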
