Effective and Fast Learning for Neuromorphic Hardware

  • IFISC Seminar

  • Robert Legenstein
  • Institute of Machine Learning and Neural Computation, Graz University of Technology, Austria.
  • April 16, 2025, 2:30 p.m.
  • IFISC Seminar Room
Robert Legenstein

Full professor and Institute Head, Institute of Machine Learning and Neural Computation, Graz University of Technology, Austria.


Spiking neural networks (SNNs) are the basis for many energy-efficient neuromorphic hardware systems. While the standard neuron model for spike-based computation on such systems has long been the leaky integrate-and-fire (LIF) neuron, a computationally light augmentation with an adaptation mechanism has recently been shown to exhibit superior performance on spatio-temporal processing tasks. I will first discuss the dynamical, computational, and learning properties of adaptive LIF neurons and networks thereof. I will show that challenges related to the stability of this model can be effectively addressed, allowing improvements over state-of-the-art performance on common event-based benchmark datasets.

In the second part of my talk, I will discuss another important topic in the field. In contrast to current neuromorphic systems, the brain exhibits the remarkable ability to learn from a limited number of examples and to quickly adapt to new tasks. These capabilities are arguably essential for intelligent neuromorphic systems in real-world scenarios. I will present methods that enable few-shot learning in neuromorphic hardware and demonstrate their applicability to memristor-based in-memory computing hardware.
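For readers unfamiliar with the model mentioned in the abstract, the following is a minimal, illustrative sketch of a discrete-time adaptive LIF neuron update, in which a spike-triggered adaptation variable raises the effective firing threshold. All parameter names and values (tau_m, tau_a, beta, etc.) are assumptions for illustration only, not the exact formulation used in the talk.

```python
# Illustrative adaptive LIF ("ALIF") neuron: a LIF neuron whose firing
# threshold increases after each spike and decays back slowly.
# Parameters and the soft-reset rule are assumptions, not the speaker's model.
import numpy as np

def alif_step(v, a, inp, dt=1.0, tau_m=20.0, tau_a=200.0, v_th=1.0, beta=1.5):
    """One simulation step.

    v   : membrane potential
    a   : adaptation variable (raises the effective threshold after spikes)
    inp : synaptic input for this step
    """
    alpha = np.exp(-dt / tau_m)   # membrane leak factor
    rho = np.exp(-dt / tau_a)     # adaptation decay factor

    v = alpha * v + inp           # leaky integration of the input
    thr = v_th + beta * a         # effective (adaptive) threshold
    spike = float(v >= thr)       # emit a spike when the threshold is crossed

    v = v - spike * thr           # soft reset after a spike
    a = rho * a + spike           # adaptation grows with each spike
    return v, a, spike
```

The adaptation term is what gives the neuron a longer effective memory than a plain LIF unit, which is the property the abstract credits for improved performance on spatio-temporal tasks.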



The talk will be broadcast via the following Zoom link: https://zoom.us/j/98286706234?pwd=bm1JUFVYcTJkaVl1VU55L0FiWDRIUT09



Contact details:

Claudio Mirasso


