Shaping Low-Rank Recurrent Neural Networks with Biological Learning Rules
Pablo Crespo and 3 co-authors
Bernstein Conference 2024
Goethe University, Frankfurt, Germany
Abstract
Extensive experimental evidence shows that task-relevant neural population dynamics often evolve along trajectories constrained to low-dimensional subspaces [1, 2]. However, how these low-dimensional task representations emerge through learning, and how neural activity interacts with synaptic plasticity, remain unresolved questions. The recent theoretical framework of low-rank recurrent neural networks (lr-RNNs) provides a direct link between connectivity and dynamics by relating structured patterns embedded in the network connectivity to the resulting low-dimensional dynamics [3].
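To make the connectivity-dynamics link concrete, the toy simulation below (a minimal numpy sketch; the network size, connectivity vectors, and overlap strength are all illustrative choices, not values from the work) builds a rank-one connectivity J = m nᵀ / N and checks that the activity of x' = -x + J tanh(x) settles onto the one-dimensional subspace spanned by m:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                                   # network size (illustrative)
m = rng.standard_normal(N)                # output connectivity vector
n = 2.4 * m + rng.standard_normal(N)      # input-selection vector; the
                                          # overlap n.m/N > 1 yields a
                                          # nontrivial fixed point

J = np.outer(m, n) / N                    # rank-one connectivity J = m n^T / N

# Euler-integrate x' = -x + J tanh(x) from a random initial state
dt, steps = 0.1, 2000
x = rng.standard_normal(N)
for _ in range(steps):
    x = x + dt * (-x + J @ np.tanh(x))

# The recurrent drive J tanh(x) always points along m, so any component
# of x orthogonal to m decays, and the fixed point lies exactly on span{m}:
kappa = (n @ np.tanh(x)) / N              # latent variable kappa = n^T phi(x) / N
residual = np.linalg.norm(x - m * kappa)  # distance from the rank-one subspace
print(kappa, residual)
```

Because J has rank one, the N-dimensional dynamics collapse onto a single latent variable kappa, which is the sense in which structured low-rank connectivity produces low-dimensional trajectories.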
We expand upon this framework by analyzing how local, Hebbian-like plasticity rules, applied to lr-RNNs, shape the network's connectivity and the resulting dynamics in spontaneous and input-driven regimes. We identify that Hebbian-like rules produce rank-one updates, which interact with the low-rank dynamics in a differential, state-dependent fashion.
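The rank-one character of such updates, and its state dependence, can be illustrated directly: an outer-product update built from firing rates raises the connectivity rank by at most one, and raises it by nothing when the rates are already aligned with the existing low-rank structure. A minimal sketch (the specific rule, learning rate, and dimensions are assumptions for illustration, not the rules studied in the work):

```python
import numpy as np

rng = np.random.default_rng(1)
N, R = 100, 2
M = rng.standard_normal((N, R))           # R pairs of connectivity vectors
Nv = rng.standard_normal((N, R))
J = M @ Nv.T / N                          # rank-R low-rank connectivity

# A Hebbian-like update from pre/post rates is an outer product,
# hence a rank-one matrix (illustrative rule with assumed rate eta):
eta = 0.01
r = np.tanh(rng.standard_normal(N))       # generic firing-rate vector
dJ = eta * np.outer(r, r)

# Generic rates: the update adds a new dimension (rank R -> R+1)
print(np.linalg.matrix_rank(J), np.linalg.matrix_rank(dJ),
      np.linalg.matrix_rank(J + dJ))

# State-dependent case: rates lying in the existing column space of M
# reshape the current subspace instead of expanding it (rank stays R)
r_aligned = M @ rng.standard_normal(R)
dJ_aligned = eta * np.outer(r_aligned, r_aligned)
print(np.linalg.matrix_rank(J + dJ_aligned))
```

Whether the network state overlaps the existing connectivity structure thus determines whether plasticity expands the dynamical subspace or only rotates and rescales it.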
Motivated by these insights, we employ Simulation-Based Inference [4] within a teacher-student paradigm to identify plasticity rules, drawn from the general class of polynomial functions of firing rates, that enable learning context-dependent, low-dimensional trajectories within a single recurrent network. Our work offers insights into the potential mechanisms through which neural circuits may develop and structure the computations underlying diverse cognitive abilities.
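The teacher-student identification idea can be sketched in miniature: a teacher rule, polynomial in pre- and post-synaptic rates, generates weight updates, and a student recovers the rule's coefficients from observed (rate, update) pairs. Plain least squares is used below purely as a stand-in for the Simulation-Based Inference machinery the abstract refers to, and the rule form and coefficients are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Teacher: a plasticity rule polynomial in pre/post firing rates,
# dw = a0 + a1*r_pre + a2*r_post + a3*r_pre*r_post  (coefficients assumed)
theta_true = np.array([0.0, -0.1, 0.05, 0.8])

def features(r_pre, r_post):
    """Polynomial basis in the pre/post rates (degree <= 1 per factor)."""
    return np.stack([np.ones_like(r_pre), r_pre, r_post, r_pre * r_post], axis=1)

# Observe noisy weight updates produced by the teacher
r_pre = rng.uniform(0.0, 1.0, 500)
r_post = rng.uniform(0.0, 1.0, 500)
X = features(r_pre, r_post)
dw = X @ theta_true + 0.01 * rng.standard_normal(500)

# Student: recover the polynomial coefficients from the observations
# (least squares here; SBI would instead yield a posterior over theta)
theta_hat, *_ = np.linalg.lstsq(X, dw, rcond=None)
print(theta_hat)
```

The same setup extends naturally to SBI: the forward simulator maps candidate coefficients to network trajectories, and inference returns a distribution over rules consistent with the teacher's behavior rather than a single point estimate.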