Bert Kappen

Radboud University Nijmegen
Nijmegen, The Netherlands

Speaker of Workshop 3

Will talk about: Emerging phenomena in neural networks with dynamic synapses and their computational implications.

Bio sketch:

Prof. Bert Kappen conducts theoretical research at the interface between machine learning, control theory, statistical physics, computer science, computational biology and artificial intelligence. He has developed many novel approximate inference methods inspired by statistical physics. He pioneered the mean field analysis of stochastic neural networks with dynamical synapses, revealing up and down states and rapid switching between them. He has identified a novel class of non-linear stochastic control problems that can be solved using path integrals. This approach has been adopted by leading robotics groups worldwide and is recognized as an important novel approach to stochastic control. His work on mean field theory for asymmetric stochastic neural networks forms the basis of current research to find connectivity patterns in neural circuits. He is the author of about 130 peer-reviewed articles in scientific journals and leading conferences.
In collaboration with medical experts, he has developed a Bayesian medical expert system, including approximate inference methods, and he co-founded the company Promedas to commercialize this system. He is director of SNN, the Dutch foundation for Neural Networks, which has a long-standing reputation for successfully applying neural network and machine learning methods in collaboration with numerous industrial partners. He co-founded the company Smart Research BV, which offers commercial services in machine learning and has developed the Bonaparte Disaster Victim Identification software. He is honorary faculty at the Gatsby Computational Neuroscience Unit at University College London.

Talk abstract:

In this presentation I will review our research on the effect and computational role of dynamical synapses in feed-forward and recurrent neural networks. I will discuss a new class of dynamical memories, which result from the destabilization of learned memory attractors. This has important consequences for dynamic information processing, allowing the system to sequentially access the information stored in the memories under changing stimuli. Although the storage capacity for stable memories decreases, our study demonstrated the positive effect of synaptic facilitation in recovering maximum storage capacity and enlarging the system's capacity for memory recall under noisy conditions. Possibly, this new dynamical behavior can be associated with the voltage transitions between up and down states observed in cortical areas of the brain. We investigated the conditions under which the permanence times in the up state are power-law distributed, which is a sign of criticality, and concluded that the experimentally observed large variability of permanence times could be explained as the result of noisy dynamic synapses with large recovery times. Finally, I will discuss how short-term synaptic processes can transmit weak signals across more than one frequency range in noisy neural networks, displaying a kind of stochastic multi-resonance. This effect is due to competition between activity-dependent synaptic fluctuations (caused by dynamic synapses) and the neuron firing threshold, which adapts to the incoming mean synaptic input.
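For readers unfamiliar with dynamic synapses, the depression and facilitation mechanisms the abstract refers to are commonly described by the Tsodyks–Markram short-term plasticity model. The following is a minimal illustrative sketch of that standard model, not the specific network model analyzed in the talk; the function name and the parameter values (tau_rec, tau_fac, U) are hypothetical choices for illustration.

```python
import numpy as np

def simulate_tm_synapse(spike_times, tau_rec=0.5, tau_fac=0.3, U=0.2,
                        T=2.0, dt=1e-3):
    """Euler integration of the Tsodyks-Markram short-term plasticity model.

    x: fraction of available synaptic resources (depression variable),
    u: utilization of resources (facilitation variable).
    Returns the traces of x and u, plus the effective synaptic efficacy
    u*x transmitted at each presynaptic spike.
    """
    n = int(T / dt)
    x, u = 1.0, U
    xs, us = np.empty(n), np.empty(n)
    spike_steps = {int(round(t / dt)) for t in spike_times}
    efficacies = []
    for i in range(n):
        # Between spikes: x recovers toward 1, u decays toward baseline U.
        x += dt * (1.0 - x) / tau_rec
        u += dt * (U - u) / tau_fac
        if i in spike_steps:
            efficacies.append(u * x)   # efficacy transmitted by this spike
            x -= u * x                 # resource depletion (depression)
            u += U * (1.0 - u)         # utilization increase (facilitation)
        xs[i], us[i] = x, u
    return xs, us, efficacies
```

With large recovery times tau_rec, resources recover slowly and repeated stimulation depresses transmission; increasing tau_fac strengthens facilitation. It is the interplay of these two timescales that produces the activity-dependent synaptic fluctuations discussed in the abstract.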