
Check regularly for updates.


Designing spiking recurrent neural networks performing enhanced signal restoration

Florian Schuler, Institute of Neuroinformatics, University of Zurich and ETH Zurich

Statement of the problem we want to solve

Often when dealing with artificial neural networks, statements like "the network can perform signal restoration / noise cancellation" are based only on the fact that the network does some kind of averaging over neighboring units. However, when such averaging is applied repeatedly, the signal itself is washed out along with the noise. http://www.ini.uzh.ch/~fschuler/temp/figure_signal_restoration_Schuler.png
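To see the washing-out effect concretely, here is a minimal sketch (NumPy assumed; all names and parameter values are illustrative, not from this page) that repeatedly averages each unit with its neighbors: the noise disappears, but so does the bump.

{{{#!python
import numpy as np

n = 180                                   # one unit per preferred orientation (degrees)
theta = np.arange(n)
signal = np.exp(-0.5 * ((theta - 90) / 15.0) ** 2)   # Gaussian bump, illustrative width
noisy = signal + 0.2 * np.random.randn(n)

profile = noisy.copy()
for step in range(50):
    # average each unit with its two neighbors (circular boundary)
    profile = (np.roll(profile, -1) + profile + np.roll(profile, 1)) / 3.0
    if step in (0, 9, 49):
        print(f"after {step + 1:2d} averaging steps: peak = {profile.max():.3f}")
# The noise is smoothed away, but the bump amplitude shrinks and the bump
# widens: repeated averaging converges toward the flat mean of the input.
}}}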

We aim at recurrent neural networks that perform signal restoration in a meaningful way, that is, not just by averaging over neighbors, but by carrying out something like a curve-fitting process. The price to pay is that such a network only performs the task for a specific class of tuning-curve functions. In terms of biological plausibility this does not seem to be a problem, because networks in the brain are thought to encode the same stimulus feature with the same tuning-curve shape.

In biological experiments, the tuning curves of orientation-selective neurons in visual cortex have been found to be Gaussian-shaped. This shape is thought to be a product of the recurrent connections rather than of the noisy feedforward input. Hence, the networks we design should converge from a noisy input to a Gaussian-shaped activity profile.

Before you come

Participants who want to take an active part should bring a laptop with them. It is up to you how much time you want to spend working on this.

Nomenclature

http://www.ini.uzh.ch/~fschuler/temp/figure_TC_Schuler_explanation.png

Illustration of convergence behavior

http://www.ini.uzh.ch/~fschuler/temp/fig_performance_network_02_FS.png

What would other decoders yield?

http://www.ini.uzh.ch/~fschuler/temp/figure_TC_Schuler_decoders.png

Neural models

Participants can work with a wide variety of neural models, in particular with the neuron model they are already familiar with. They can use all sorts of learning procedures to increase the performance of their network.

We are working with:

  • I&F with and without leakage / refractory period: (S.R.), (S.N.)
  • SRM (spike response model): (N.S.)
  • nonlinear I&F (exponential or adaptive)
  • LIF with recovery variable
  • Nonlinear LIF with recovery variable, quadratic (Izhikevich) or exponential (AdEx): M.F.Y., (S.S.), (G.M.)
  • reduced biophysical
  • biophysical (H&H); point neuron or with morphology: C.B., S.H., (S.S.)
  • not specified: S.H., G.I.

For the second week we have decided to specialize on:

  • LIF: S.N., F.S.
  • LIF with adaptive threshold: P.W.
  • biophysical: C.B., S.H.
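For those on the LIF track, here is a minimal single-neuron sketch of a LIF with an adaptive threshold, integrated with forward Euler. All parameter values are illustrative assumptions of mine, not values agreed on in the workgroup:

{{{#!python
import numpy as np

def simulate_lif_adaptive(I, dt=1e-4, tau_m=20e-3, v_reset=0.0,
                          theta0=1.0, theta_jump=0.5, tau_theta=100e-3):
    """Single LIF neuron with an adaptive threshold (forward Euler).

    I: input current per time step (arbitrary units).
    Returns the list of spike times in seconds.
    """
    v, theta = 0.0, theta0
    spikes = []
    for t, i_t in enumerate(I):
        # leaky integration of the membrane potential
        v += dt / tau_m * (-v + i_t)
        # threshold relaxes back to its resting value theta0
        theta += dt / tau_theta * (theta0 - theta)
        if v >= theta:
            spikes.append(t * dt)
            v = v_reset          # reset membrane potential
            theta += theta_jump  # raise threshold after each spike
    return spikes

# constant suprathreshold input for 1 s: adaptation stretches the intervals
spike_times = simulate_lif_adaptive(np.full(10000, 2.0))
print(f"{len(spike_times)} spikes, first inter-spike intervals:",
      np.round(np.diff(spike_times[:5]), 4))
}}}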

Organization of the workgroup

We will hold this workgroup as a competition. Performance is measured as the least-squares error between the network's activity profile and a Gaussian fitted to the (non-Gaussian) noisy input; a sketch of this scoring follows.
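Here is a minimal sketch of such a score, assuming SciPy is available. The function names and the use of scipy.optimize.curve_fit are illustrative choices of mine, not the official scoring script:

{{{#!python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, mu, sigma, offset):
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + offset

def score(preferred_orientation, noisy_input, network_profile):
    """Least-squares error of the network's activity profile with respect
    to a Gaussian fitted to the noisy input. Lower is better."""
    p0 = [noisy_input.max() - noisy_input.min(),                 # amplitude guess
          preferred_orientation[np.argmax(noisy_input)],         # center guess
          20.0, noisy_input.min()]                               # width, offset guesses
    params, _ = curve_fit(gaussian, preferred_orientation, noisy_input, p0=p0)
    target = gaussian(preferred_orientation, *params)
    return np.sum((network_profile - target) ** 2)

# self-test on synthetic data: a perfect profile should score near zero
x = np.arange(180.0)
clean = gaussian(x, 10.0, 90.0, 20.0, 1.0)
noisy = clean + np.random.randn(180)
print("score of the clean profile:", score(x, noisy, clean))
}}}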

Meetings

  • Wed 29.4. at 3pm: presented the problem and assigned participants to the different models
  • Fri 1.5. at 3pm: discussion of progress
  • Mon 4.5. at 3pm: discussion of progress
  • Wed 6.5. at 3pm: discussion of progress

Training data

This is a CSV file that you can use for training / parameter tuning. It has 180 rows and 10 columns.

  • Column 1: preferred orientation of the cell
  • Column 2: target population activity profile for all the noisy inputs
  • Columns 3-8: noisy inputs (normalized to an average input of 10 in arbitrary units; you may have to rescale them to get sensible values for your model)
The network model must also work for:

  • shifted input (Python function numpy.roll)
  • scaled input with a scaling factor
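A minimal sketch of loading the file and producing the shifted and scaled variants (the file name training_data.csv is a placeholder for the attached file; NumPy assumed):

{{{#!python
import numpy as np

# columns: 1 = preferred orientation, 2 = target profile, 3-8 = noisy inputs
data = np.loadtxt("training_data.csv", delimiter=",")   # placeholder file name
orientation = data[:, 0]
target = data[:, 1]
noisy = data[:, 2:8]          # the six noisy-input columns

x = noisy[:, 0]
shifted = np.roll(x, 45)      # shift the input by 45 cells, wrapping around
scaled = 1.5 * x              # scale the input by a factor of 1.5
}}}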

It is up to you how you implement the input. The most straightforward option is to inject it as a current, but you can also generate spiking input with a corresponding average rate.
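For the spiking option, one simple choice (an assumption of mine, not prescribed by the workgroup) is an independent Poisson spike train per cell, with a rate proportional to the input value:

{{{#!python
import numpy as np

def poisson_spike_trains(rates, duration=1.0, dt=1e-3, gain=1.0, seed=0):
    """Generate one Poisson spike train per input cell.

    rates: input values per cell (e.g. one noisy-input column);
           gain converts them to firing rates in Hz.
    Returns a boolean array of shape (n_cells, n_steps)."""
    rng = np.random.default_rng(seed)
    n_steps = int(duration / dt)
    # probability of a spike in each time bin of width dt
    p_spike = np.clip(gain * np.asarray(rates) * dt, 0.0, 1.0)
    return rng.random((len(rates), n_steps)) < p_spike[:, None]

# e.g. input value 10 with gain 5 -> 50 Hz average rate per cell
spikes = poisson_spike_trains(np.full(180, 10.0), gain=5.0)
print("mean rate [Hz]:", spikes.mean() / 1e-3)   # divide by dt = 1e-3 s
}}}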

