The impact of sparsity in low-rank recurrent neural networks



Neural population dynamics are often highly coordinated, allowing task-related computations to be understood as neural trajectories through low-dimensional subspaces. How the network connectivity and input structure give rise to such activity can be investigated with the aid of low-rank recurrent neural networks, a recently-developed class of computational models which offer a rich theoretical framework linking the underlying connectivity structure to emergent low-dimensional dynamics. This framework has so far relied on the assumption of all-to-all connectivity, yet cortical networks are known to be highly sparse. Here we investigate the dynamics of low-rank recurrent networks in which the connections are randomly sparsified, which makes the network connectivity formally full-rank. We first analyse the impact of sparsity on the eigenvalue spectrum of low-rank connectivity matrices, and use this to examine the implications for the dynamics.
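The central construction above can be sketched numerically. This is a minimal illustration, not code from the article: the network size, connection probability, and the 1/p rescaling convention are illustrative assumptions. It shows how a rank-one connectivity matrix becomes formally full-rank once entries are randomly deleted:

```python
import numpy as np

rng = np.random.default_rng(0)
N, p = 300, 0.2  # network size and connection probability (illustrative values)

# Rank-one low-rank connectivity: J = m n^T / N, the standard parameterisation.
m = rng.standard_normal(N)
n = rng.standard_normal(N)
J = np.outer(m, n) / N

# Random sparsification: keep each entry independently with probability p,
# rescaled by 1/p so the mean connectivity is preserved on average.
J_sparse = np.where(rng.random((N, N)) < p, J / p, 0.0)

print(np.linalg.matrix_rank(J))         # 1: the dense matrix is exactly rank-one
print(np.linalg.matrix_rank(J_sparse))  # close to N: sparsification destroys low rank
```

The numerical rank jumps from 1 to (nearly) N, which is why the spectral analysis of the sparsified matrix cannot simply reuse the dense low-rank theory.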

We find that in the presence of sparsity, the eigenspectra in the complex plane consist of a continuous bulk and isolated outliers, a form analogous to the eigenspectra of connectivity matrices composed of a low-rank and a full-rank random component. This analogy allows us to characterise distinct dynamical regimes of the sparsified low-rank network as a function of key network parameters. Altogether, we find that the low-dimensional dynamics induced by low-rank connectivity structure are preserved even at high levels of sparsity, and can therefore support rich and robust computations even in networks sparsified to a biologically-realistic extent.
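The bulk-plus-outliers picture can also be checked directly. The sketch below is again illustrative, not the article's code: the overlap between the connectivity vectors (chosen so the low-rank part carries an order-one outlier eigenvalue near n·m/N) and the network parameters are assumptions made for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(1)
N, p = 1000, 0.2  # network size and connection probability (illustrative values)

# Rank-one connectivity with a deliberate overlap n ≈ 2m, so the low-rank
# part contributes an isolated eigenvalue near n.m / N ≈ 2.
m = rng.standard_normal(N)
n = 2.0 * m + rng.standard_normal(N)
J_sparse = np.where(rng.random((N, N)) < p,
                    np.outer(m, n) / (N * p),  # 1/p rescaling preserves the mean
                    0.0)

eig = np.linalg.eigvals(J_sparse)
order = np.argsort(-np.abs(eig))
outlier = eig[order[0]]              # isolated eigenvalue set by the low-rank part
bulk_radius = np.abs(eig[order[1]])  # approximate edge of the continuous bulk

# The outlier sits near n.m / N, well outside the bulk induced by sparsification.
print(outlier, bulk_radius)
```

The clear gap between the outlier and the bulk radius is the spectral signature that lets the sparsified network be analysed like a low-rank matrix plus a full-rank random component.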
