Liquid State Machine
A recurrent neural network framework that maps continuous input streams into high-dimensional spatio-temporal patterns using a fixed, random reservoir.
Wolfgang Maass introduced the Liquid State Machine (LSM) in 2002 to process asynchronous, time-varying data with spiking neural networks. The architecture feeds inputs into a 'liquid': a large, fixed reservoir of recurrently connected neurons that transforms signals into a high-dimensional state space. Because the reservoir's internal weights are never trained, the system sidesteps the vanishing-gradient problems that arise when training traditional RNNs end to end. A simple linear readout layer decodes the liquid's state to perform classification or regression. This approach is well suited to real-time tasks such as speech recognition or motor control, as it leverages the inherent fading memory of the reservoir's dynamics.
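The fixed-reservoir-plus-linear-readout structure can be sketched in a few lines. The example below is a simplification: it uses rate-based tanh units (echo-state style) rather than the spiking neurons of a true LSM, and every size, scaling factor, and the delayed-recall task are illustrative choices, not values from the original work. The recurrent weights are scaled to a spectral radius below 1 so the reservoir has fading memory, and only the readout is fit, here by least squares.

```python
import numpy as np

# Minimal reservoir-computing sketch. NOTE: a real LSM uses spiking neurons;
# this rate-based version only illustrates the fixed-reservoir + trained-readout idea.
rng = np.random.default_rng(0)

n_in, n_res = 1, 200                                 # illustrative sizes
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))         # fixed, random input weights
W = rng.normal(0.0, 1.0, (n_res, n_res))             # fixed, random recurrent weights
W *= 0.9 / max(abs(np.linalg.eigvals(W)))            # spectral radius < 1 -> fading memory

def run_reservoir(u):
    """Drive the untrained reservoir with input sequence u; collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Toy task: read out a delayed copy of the input from the high-dimensional state.
T, delay = 1000, 3
u = rng.uniform(-1, 1, T)
X = run_reservoir(u)
y = np.roll(u, delay)                                # target: input delayed by 3 steps
X_tr, y_tr = X[delay:800], y[delay:800]
W_out, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)  # train ONLY the linear readout

pred = X[800:] @ W_out                               # evaluate on held-out steps
mse = np.mean((pred - y[800:]) ** 2)
```

Because the reservoir is never modified, "training" reduces to a single linear regression on the collected states, which is why the approach is attractive for real-time settings.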