In the Media

ABR's Legendre Memory Unit (LMU) sets new benchmark records

December 2, 2019
Peter Suma

  • Applied Brain Research announces a more accurate, lower-power replacement for the Long Short-Term Memory (LSTM) AI network in a NeurIPS spotlight talk in Vancouver on December 10, 2019.
  • The LMU surpasses the best known Recurrent Neural Network (RNN) results by LeCun et al. and Bengio et al. by a full percentage point.
  • Implemented on low-power spiking neuromorphic hardware including Braindrop and Loihi.

Toronto, Ontario, Canada – December 10, 2019 – Applied Brain Research (ABR) announces a new Legendre Memory Unit (LMU) that surpasses the best known Recurrent Neural Network (RNN) results by LeCun et al. (93.7–94.5%) and Bengio et al. (95.4–95.9%) by a full percentage point.

ABR’s LMU enables advances in ultra-low-power AI speech, vision and signal processing systems for always-on and edge-AI applications, extending battery life while making them more accurate.

ABR’s announcement demonstrates the potential to realize ultra-low-power instantiations of a large class of algorithms that learn patterns in data spanning extraordinarily long intervals of time.

Voelker et al. (2019) found that ABR’s LMU required fewer resources and less computation, while providing superior memory capacity and achieving state-of-the-art performance of 97.15% on a challenging RNN benchmark, compared to 89.86% using LSTMs.

The core building block of the LMU has been implemented on low-power spiking neuromorphic hardware including Braindrop (Neckar et al., 2019) and Loihi (Voelker, 2019).

The LMU outperforms both spiking and non-spiking reservoir computers (i.e. liquid state machines and echo state networks) in efficiency and memory capacity when tasked with representing temporal windows of information (Voelker, 2019).

Existing technology fails to scale

Long Short-Term Memory (LSTM) is a form of Recurrent Neural Network (RNN) that can learn to predict sequences of data over far longer periods of time than standard neural networks.

  • LSTMs make it possible for neural networks to learn to process data like speech, video and control signals. Speech recognition, language generation, handwriting recognition, musical composition and dynamical systems prediction are only some of the uses of LSTMs. Present in most smart speakers and voice recognition systems, LSTMs are said to be the most financially valuable AI algorithm ever invented (Bloomberg).
  • LSTMs fail when tasked with learning temporal dependencies in signals that span 1,000 time-steps or more, making them very difficult to scale up in practice.

About ABR’s breakthrough Legendre Memory Unit

ABR’s new Legendre Memory Unit (LMU) is a neuromorphic algorithm for continuous-time memory that can learn temporal dependencies spanning millions of time-steps or more. It is a new RNN architecture that enables networks of artificial neurons to classify and predict temporal patterns far more efficiently than LSTMs.

The LMU is mathematically derived to implement the continuous-time dynamical system that optimally maintains a scale-invariant representation of time.
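
To make that concrete, here is a rough sketch (in Python, using NumPy and SciPy) of the continuous-time memory matrices that the derivation produces, following the equations published by Voelker et al. (2019), together with a zero-order-hold discretization. The memory dimension, window length and time-step below are illustrative choices, not ABR's settings.

    import numpy as np
    from scipy.signal import cont2discrete

    def lmu_state_space(order, theta):
        """Continuous-time LMU memory: theta * dm/dt = A m(t) + B u(t).

        The `order`-dimensional state m(t) compresses the sliding window
        u(t - theta) .. u(t) onto the first `order` Legendre polynomials
        (Voelker et al., 2019).
        """
        q = np.arange(order, dtype=np.float64)
        r = (2 * q + 1)[:, None] / theta
        i, j = np.meshgrid(q, q, indexing="ij")
        A = np.where(i < j, -1.0, (-1.0) ** (i - j + 1)) * r   # shape (order, order)
        B = ((-1.0) ** q)[:, None] * r                         # shape (order, 1)
        return A, B

    # Illustrative sizes: a 6-dimensional memory over a 1-second window,
    # discretized with a zero-order hold at a 1 ms step so that
    # m[t+1] = Ad @ m[t] + Bd * u[t].
    order, theta, dt = 6, 1.0, 1e-3
    A, B = lmu_state_space(order, theta)
    Ad, Bd, *_ = cont2discrete((A, B, np.eye(order), np.zeros((order, 1))), dt=dt)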

The ABR LMU obtains the best-known results on permuted sequential MNIST, a difficult RNN benchmark, and has been shown to scale to input sequences spanning hundreds of millions of time-steps.
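
For readers unfamiliar with the benchmark, the following sketch shows one common way permuted sequential MNIST is constructed: each image is fed to the network one pixel at a time, in a fixed scrambled order, so classifying the digit requires remembering information across all 784 steps. Normalization details and the permutation seed vary between papers, so treat the specifics as illustrative.

    import numpy as np

    def make_psmnist(images, seed=0):
        """Permuted sequential MNIST: flatten each 28x28 image into a
        784-step sequence of single pixels, then shuffle the time axis
        with one fixed permutation shared by every example."""
        rng = np.random.RandomState(seed)            # fixed seed -> one shared permutation
        perm = rng.permutation(28 * 28)
        seqs = images.reshape(len(images), 28 * 28).astype(np.float32) / 255.0
        return seqs[:, perm][..., None]              # shape (N, 784, 1): one pixel per time-step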

The resulting patterns in spiking activity have also been linked to neural “time cells” observed in the striatum and medial prefrontal cortex in mammalian brains.

Unlike the LSTM, the LMU can be implemented using spiking neurons, an algorithmic advance that is anticipated to provide large gains in efficiency for dynamical time-series problems on low-power neuromorphic devices.
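
As an illustration of what a spiking implementation can look like, here is a rough sketch using ABR's open-source Nengo library: the LMU's linear memory is realized by a recurrently connected population of spiking neurons via the standard Neural Engineering Framework (NEF) mapping for implementing linear dynamics through a lowpass synapse. The neuron count, synaptic time constant, window length and input signal below are placeholder choices for the sake of the example.

    import nengo
    import numpy as np

    def lmu_ab(order, theta):
        # Continuous-time LMU memory matrices (Voelker et al., 2019).
        q = np.arange(order, dtype=np.float64)
        r = (2 * q + 1)[:, None] / theta
        i, j = np.meshgrid(q, q, indexing="ij")
        A = np.where(i < j, -1.0, (-1.0) ** (i - j + 1)) * r
        B = ((-1.0) ** q)[:, None] * r
        return A, B

    order, theta, tau = 6, 0.5, 0.1      # placeholder sizes, not tuned values
    A, B = lmu_ab(order, theta)

    with nengo.Network() as model:
        u = nengo.Node(lambda t: np.sin(2 * np.pi * t))            # toy input signal
        memory = nengo.Ensemble(n_neurons=600, dimensions=order)   # spiking LIF neurons by default
        # NEF mapping: with a first-order lowpass synapse of time constant tau,
        # the dynamics dm/dt = A m + B u are realized by a recurrent transform
        # of tau*A + I and an input transform of tau*B.
        nengo.Connection(memory, memory, transform=tau * A + np.eye(order), synapse=tau)
        nengo.Connection(u, memory, transform=tau * B, synapse=tau)
        probe = nengo.Probe(memory, synapse=0.01)                  # filtered read-out of the memory

    with nengo.Simulator(model) as sim:
        sim.run(1.0)                                               # simulate one second of spiking activity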

About Applied Brain Research Inc. (ABR)

  • ABR is the maker of the Nengo neuromorphic software development suite, which includes the world’s leading multi-platform, visual neuromorphic software compiler, runtime and spiking deep learning platform.
  • Using ABR’s neuromorphic tools, the team at ABR has built Spaun, the world’s largest functional brain model, and builds real-time, full-loop AI “brains” for customers in the military, self-driving car, IoT and smartphone markets. (www.appliedbrainresearch.com)

References

Aaron R. Voelker. Dynamical Systems in Spiking Neuromorphic Hardware. PhD thesis, University of Waterloo, 2019. URL http://hdl.handle.net/10012/14625.

Alexander Neckar, Sam Fok, Ben V Benjamin, Terrence C Stewart, Nick N Oza, Aaron R Voelker, Chris Eliasmith, Rajit Manohar, and Kwabena Boahen. Braindrop: A mixed-signal neuromorphic architecture with a dynamical systems-based programming model. Proceedings of the IEEE, 107(1):144–164, 2019.

Yanzhang He, Tara N. Sainath, Rohit Prabhavalkar, Ian McGraw, Raziel Alvarez, Ding Zhao, David Rybach, et al. Streaming end-to-end speech recognition for mobile devices. In ICASSP 2019 – 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 6381–6385. IEEE, 2019.

Li Jing, Yichen Shen, Tena Dubcek, John Peurifoy, Scott Skirlo, Yann LeCun, Max Tegmark, and Marin Soljačić. Tunable efficient unitary neural networks (EUNN) and their application to RNNs. In Proceedings of the 34th International Conference on Machine Learning, Volume 70, pp. 1733–1741. JMLR.org, 2017.

David Krueger, Tegan Maharaj, János Kramár, Mohammad Pezeshki, Nicolas Ballas, Nan Rosemary Ke, Anirudh Goyal, Yoshua Bengio, Aaron Courville, and Chris Pal. Zoneout: Regularizing RNNs by randomly preserving hidden activations. arXiv preprint arXiv:1606.01305, 2016.

Yann LeCun, Yoshua Bengio, and Geoffrey Hinton. Deep learning. Nature, 521(7553):436–444, 2015.

Sarath Chandar, Chinnadhurai Sankar, Eugene Vorontsov, Samira Ebrahimi Kahou, and Yoshua Bengio. Towards non-saturating recurrent units for modelling long-term dependencies. arXiv preprint arXiv:1902.06704, 2019.

Media contacts

North America
Peter Suma, co-CEO
peter.suma@appliedbrainresearch.com
+1-416-505-8973

EMEA
Oliver Morgan
oliver.morgan@tailoredbrands.co.uk
+44 7957 489928

Peter Suma, Chair of the Board and Co-Founder at Applied Brain Research. Prior to ABR, Peter led start-ups in robotics and financial services and managed two seed venture capital funds. Peter holds degrees in systems engineering, science, law and business.
