DE Seminar: Christopher Kim
NIH/NIDDK
Monday, April 1, 2024 · 11 AM - 12 PM
Title: Learning to generate cortical activity in strongly recurrent spiking neural networks
Speaker: Christopher Kim
Abstract: Spiking neural
networks with balanced excitation and inhibition are widely used for capturing
canonical features of cortical activity, such as spiking variability. However,
due to their large network size and strong recurrent dynamics, it is
challenging to train these networks to perform complex tasks, hindering our
understanding of their computational properties. In this talk, we present a
recursive least-squares method that can train spiking neural networks to
generate arbitrarily complex activity patterns. We show that when a subset of
neurons embedded in a balanced excitatory-inhibitory network is trained to
reproduce task-related neural activity recorded from the motor cortex, the
learned activity spreads to the rest of the network, promoting distributed
representation of task variables in cortical networks. We also demonstrate a
GPU implementation of the training method, which enables fast training of
large-scale networks. In sum, our work opens up the opportunity to develop and
investigate strongly recurrent spiking neural networks driven by experimentally
recorded neural data.
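
For readers unfamiliar with the approach, the core recursive least-squares (RLS, FORCE-style) update can be sketched in a few lines. The example below is a minimal illustration, not the speaker's implementation: it trains the incoming recurrent weights of a small subset of leaky integrate-and-fire neurons so that their summed synaptic current tracks arbitrary target waveforms. Network sizes, parameter values, targets, and variable names are illustrative assumptions, and the excitatory-inhibitory balance discussed in the talk is omitted for brevity.

```python
import numpy as np

# Minimal sketch: RLS (FORCE-style) training of recurrent weights in a small
# spiking network so that the synaptic input to each trained neuron tracks a
# target waveform. Illustrative only; E-I balance and Dale's law are omitted.

rng = np.random.default_rng(0)

N = 200                # network size (illustrative)
N_train = 20           # number of trained neurons
dt = 1e-3              # time step (s)
tau_m = 10e-3          # membrane time constant (s)
tau_s = 20e-3          # synaptic filter time constant (s)
v_thresh, v_reset = 1.0, 0.0
T = 1.0                # trial duration (s)
steps = int(T / dt)

# Random initial recurrent weights (row i = weights onto neuron i)
W = 1.5 * rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))

# Target synaptic currents for the trained neurons: smooth sinusoids
t_axis = np.arange(steps) * dt
phases = rng.uniform(0, 2 * np.pi, size=N_train)
targets = np.sin(2 * np.pi * 2.0 * t_axis[None, :] + phases[:, None])

# One RLS inverse-correlation matrix per trained neuron
P = np.stack([np.eye(N) for _ in range(N_train)])

v = rng.uniform(0.0, 1.0, N)   # membrane potentials
r = np.zeros(N)                # filtered spike trains (presynaptic activity)
I_ext = 1.2                    # constant external drive

for trial in range(5):                         # a few training trials
    for k in range(steps):
        I_rec = W @ r                          # recurrent synaptic input
        v += dt / tau_m * (-v + I_rec + I_ext)
        spiked = v >= v_thresh
        v[spiked] = v_reset
        # Exponentially filtered spike train; each spike adds a unit jump
        r = (1.0 - dt / tau_s) * r + spiked.astype(float)

        if k % 10 == 0:                        # update weights every few steps
            for i in range(N_train):
                err = W[i] @ r - targets[i, k]         # current vs. target
                Pr = P[i] @ r
                P[i] -= np.outer(Pr, Pr) / (1.0 + r @ Pr)
                W[i] -= err * (P[i] @ r)               # RLS weight update
```

In this sketch only the rows of W belonging to the trained subset are modified; the rest of the network evolves under its initial random connectivity, which is the setting in which the abstract describes learned activity spreading to untrained neurons.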