Peter Bouss: Dimensionality reduction with invertible neural networks
| When | Feb 28, 2023, from 05:15 PM to 05:45 PM |
|---|---|
| Where | Bernstein Center, Lecture Hall, ground floor, Hansastr. 9a |
| Contact Name | Fiona Siegfried |
| Contact Phone | 0761 203 9549 |
Abstract
Despite the large number of active neurons in the cortex, the activity of neural populations in various brain regions is expected to live on a low-dimensional manifold. Among the most common tools to estimate the mapping to this manifold, along with its dimension, are the many variants of principal component analysis (PCA). Despite their apparent success, these procedures have the disadvantage that they capture only linear correlations and that they perform poorly when used as generative models.
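For reference, a minimal sketch of the linear baseline the abstract alludes to: estimating the manifold dimension from the PCA explained-variance spectrum. The 95% variance threshold and the toy data are illustrative choices, not taken from the talk.

```python
import numpy as np

def pca_dimension(X, var_threshold=0.95):
    """Estimate the manifold dimension as the number of principal
    components needed to explain `var_threshold` of the total variance.
    The threshold is an illustrative choice."""
    Xc = X - X.mean(axis=0)                 # center the data
    s = np.linalg.svd(Xc, compute_uv=False) # singular values of centered data
    var_ratio = s**2 / np.sum(s**2)         # explained-variance ratios
    return int(np.searchsorted(np.cumsum(var_ratio), var_threshold) + 1)

# Toy example: 100-d observations generated from a 3-d latent signal.
rng = np.random.default_rng(0)
latent = rng.normal(size=(1000, 3))
X = latent @ rng.normal(size=(3, 100)) + 0.01 * rng.normal(size=(1000, 100))
print(pca_dimension(X))  # ~3 for this near-noiseless linear embedding
```

By construction, such an estimate only reflects linear structure; a curved manifold of the same intrinsic dimension would inflate the count.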
To fully learn the statistics of neural activity and to generate artificial samples, we make use of invertible neural networks (INNs; Dinh et al., 2014/16; Kingma et al., 2018). These networks learn a dimension-preserving estimator of the data probability distribution. They stand out from generative adversarial networks (GANs) and variational autoencoders (VAEs) both in their simplicity (only a single invertible network is learned) and in their exact likelihood estimation, made possible by tractable Jacobians at each building block. We aim to modify INNs such that they can discriminate relevant (in-manifold) from noise (out-of-manifold) dimensions.
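To illustrate the mechanism behind the exact likelihood: an affine coupling layer in the style of Dinh et al. is invertible by construction, and its Jacobian is triangular, so the log-determinant is simply the sum of the predicted log-scales. A minimal PyTorch sketch (the hidden width and network shape are illustrative assumptions, not the talk's architecture):

```python
import math
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """Affine coupling layer: the first half of the input parameterizes
    an invertible affine map of the second half, so the Jacobian is
    triangular and log|det J| = sum of the log-scales."""
    def __init__(self, dim, hidden=64):  # hidden width is an illustrative choice
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        log_s, t = self.net(x1).chunk(2, dim=1)
        log_s = torch.tanh(log_s)            # bound the scales for stability
        z2 = x2 * torch.exp(log_s) + t       # dimension-preserving transform
        log_det = log_s.sum(dim=1)           # exact log|det J| per sample
        return torch.cat([x1, z2], dim=1), log_det

    def inverse(self, z):
        z1, z2 = z[:, :self.half], z[:, self.half:]
        log_s, t = self.net(z1).chunk(2, dim=1)
        log_s = torch.tanh(log_s)
        x2 = (z2 - t) * torch.exp(-log_s)    # exact inverse of the forward map
        return torch.cat([z1, x2], dim=1)

# Exact log-likelihood under a standard-normal latent prior:
# log p(x) = log N(f(x); 0, I) + log|det J_f(x)|
layer = AffineCoupling(dim=4)
x = torch.randn(8, 4)
z, log_det = layer(x)
log_px = (-0.5 * (z ** 2).sum(dim=1)
          - 0.5 * z.shape[1] * math.log(2 * math.pi) + log_det)
```

Stacking such layers (with permutations between them) yields a full INN; the per-layer log-determinants simply add up, which is what makes the likelihood tractable.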
To this end, we penalize the participation of each individual latent variable in the reconstruction of the data through the inverse mapping. We can thus describe the underlying manifold without needing to discard any information. We validate our modification on controlled data sets of varying complexity and illustrate the power of our modified INNs by reconstructing data from only a few dimensions. Finally, we apply this technique to identify manifolds in EEG recordings from a data set exhibiting high gamma activity (Schirrmeister et al., 2017), obtained from 128 electrodes during four different movement tasks.
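The abstract does not spell out the penalty, but one reading of "penalizing the participation of each latent variable in the reconstruction" is the following hypothetical sketch: ablate one latent dimension at a time, map back through the inverse, and weight the resulting reconstruction error per dimension. The `inn` interface and `weights` schedule here are assumptions for illustration, not the authors' published method.

```python
import torch

def participation_penalty(inn, x, weights):
    """Hypothetical penalty in the spirit of the abstract: zero out one
    latent variable at a time and measure how much the reconstruction
    through the inverse map changes. `inn` is assumed to expose
    forward(x) -> (z, log_det) and inverse(z) -> x; `weights` could be
    chosen to concentrate variability in the leading latent dimensions."""
    z, _ = inn(x)
    penalty = x.new_zeros(())
    for k in range(z.shape[1]):
        z_ablate = z.clone()
        z_ablate[:, k] = 0.0                       # remove latent variable k
        x_rec = inn.inverse(z_ablate)
        # large error => dimension k participates strongly in reconstruction
        penalty = penalty + weights[k] * ((x - x_rec) ** 2).mean()
    return penalty
```

Training would then minimize the negative log-likelihood plus such a penalty; latent dimensions whose ablation barely changes the reconstruction can be treated as out-of-manifold noise, without any information being discarded from the model itself.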