Health Desk - 24 July 2020: Many computational properties are maximized when the dynamics of a network are at a "critical point", for example at the transition between order and chaos, or between stability and instability.
The critical state is widely assumed to be optimal for any computation in recurrent neural networks, which are used in many AI applications.
Researchers from the HBP partner Heidelberg University and the Max-Planck-Institute for Dynamics and Self-Organization challenged this assumption by testing the performance of a spiking recurrent neural network on a set of tasks of varying complexity, both at and away from critical dynamics.
They instantiated the network on a prototype of the analog neuromorphic BrainScaleS-2 system. BrainScaleS is a state-of-the-art brain-inspired computing system with synaptic plasticity implemented directly on the chip. It is one of two neuromorphic systems currently under development within the European Human Brain Project.
The researchers first showed that the distance to criticality can be easily adjusted on the chip by changing the input strength, and then demonstrated a clear relation between criticality and task performance.
The assumption that criticality is beneficial for every task, however, was not confirmed: only the more complex tasks profited from critical dynamics, while simpler tasks were solved better away from the critical point.
The study thus provides a more precise understanding of how the collective network state should be tuned to different task requirements for optimal performance.
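The idea of tuning a recurrent network toward or away from criticality can be illustrated with a toy rate-based network. This is a simplified stand-in, not the authors' spiking BrainScaleS-2 setup: here the control knob is the spectral radius of a random weight matrix (rather than the input strength used on the chip), and the network sizes, seeds, and the `perturbation_decay` helper are illustrative assumptions. Near criticality (spectral radius close to 1), a tiny perturbation of the state persists far longer than in a strongly subcritical network:

```python
import numpy as np

def perturbation_decay(spectral_radius, n=200, steps=50, seed=0):
    """Run a random rate-based recurrent network twice, with initial states
    differing by a tiny perturbation, and return the distance between the
    two trajectories after `steps` updates. Near-critical networks retain
    the perturbation much longer than subcritical ones."""
    rng = np.random.default_rng(seed)
    W = rng.normal(0.0, 1.0 / np.sqrt(n), (n, n))
    # Rescale so the largest eigenvalue magnitude equals spectral_radius;
    # spectral_radius -> 1 moves the dynamics toward the critical point.
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    x = rng.normal(0.0, 1.0, n)
    y = x + 1e-6 * rng.normal(0.0, 1.0, n)  # tiny perturbation
    for _ in range(steps):
        x = np.tanh(W @ x)
        y = np.tanh(W @ y)
    return np.linalg.norm(x - y)

sub = perturbation_decay(0.5)    # strongly subcritical
near = perturbation_decay(0.95)  # near-critical
print(sub < near)  # the near-critical network retains the perturbation longer
```

In this sketch, longer perturbation retention corresponds to a longer memory of past inputs, which is why near-critical dynamics are expected to help tasks that require integrating information over time, while simpler tasks do not need it.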
First author Benjamin Cramer of Heidelberg University said that, as a next step, the team is studying and characterizing how the spiking network's working point affects the classification of artificial and real-world spoken words.