ABR Unveils Low-Power AI Platform for Smart Cars & Devices

December 27, 2019
Peter Suma

ABR Unveils Extended Low-Power Neuromorphic AI Platform for Smart Cars, Phones, and Devices at CES 2020

  • ABR beats the “most commercially successful AI algorithm [the LSTM]” (Bloomberg, 2018) with its brand-new LMU algorithm. The Long Short-Term Memory (LSTM) algorithm has stood since 1997 as the best AI algorithm for processing time-series data. It is used in smart speakers, phones, cloud services, sensors, cars, and robots for everything from speech recognition to signal processing.
  • The LMU is a game-changer for AI-based speech, video, control, music, and signal processing, providing higher accuracy, greater scalability, and lower power consumption than the LSTM.
  • ABR is working with leading car OEMs, smartphone makers, drone users, and software companies to implement leading-edge speech and video AI systems using its new LMUs.

Toronto, Ontario, Canada – December 27, 2019 – Applied Brain Research Inc. (ABR) is unveiling its extended low-power edge-AI platform at CES 2020 in Las Vegas, Nevada from Jan. 7th to 10th. CES is the premier showcase for consumer technologies. ABR has been selected as one of the emerging-technology companies exhibiting in Eureka Park and will be in Booth #50669 at the Sands Convention Center, Hall G.

“We are very excited to have been selected for our revolutionary, low-power edge-AI technologies,” says Peter Suma, co-CEO of ABR. “ABR makes AI technologies that radically lower the power required to run continuous AI in phones, cars, drones, robots, sensors, and IoT devices as compared to GPUs. In April, we showed a 100x reduction in the power used for keyword spotting versus GPUs. Recently we significantly extended our low-power AI platform with the release of our Legendre Memory Unit (LMU), a full replacement for LSTMs that is more accurate, more scalable, and lower-power.”

The LMU is a new Recurrent Neural Network (RNN) algorithm for continuous-time signal processing that can learn to classify and predict patterns in signals far more efficiently than the LSTM, the previous state-of-the-art algorithm. On the psMNIST benchmark, the LMU surpasses the best known RNN results, from LeCun et al. (93.7–94.5 percent) and Bengio et al. (95.4–95.9 percent), by a full percentage point. LMUs are also more scalable, able to learn temporal dependencies spanning millions of time-steps, unlike LSTMs, which are far more limited. And unlike LSTMs, LMUs can be computed with neuromorphic hardware, using far less power to achieve the same or better results than LSTMs running on GPUs.
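For the technically inclined, the LMU's core is a small fixed linear system that compresses a sliding window of its input signal into a handful of Legendre-polynomial coefficients. Below is a minimal, illustrative Python sketch of that memory update, using the (A, B) state-space matrices defined in the NeurIPS 2019 LMU paper and a simple Euler discretization (a simplification; practical implementations use a more accurate zero-order-hold discretization):

    import numpy as np

    def lmu_matrices(order):
        """Build the (A, B) state-space matrices for an LMU memory of the
        given order, following the definitions in the NeurIPS 2019 LMU paper."""
        Q = np.arange(order)
        R = (2 * Q + 1)[:, None]                        # row scaling: (2i + 1)
        i, j = np.meshgrid(Q, Q, indexing="ij")
        A = np.where(i < j, -1.0, (-1.0) ** (i - j + 1)) * R
        B = ((-1.0) ** Q)[:, None] * R
        return A, B

    def lmu_memory(u, order=6, theta=2.0, dt=0.01):
        """Integrate theta * m'(t) = A m(t) + B u(t) over a 1-D signal u,
        returning the memory trajectory with shape (len(u), order)."""
        A, B = lmu_matrices(order)
        m = np.zeros((order, 1))
        trajectory = []
        for u_t in u:
            m = m + (dt / theta) * (A @ m + B * u_t)    # Euler step
            trajectory.append(m.ravel().copy())
        return np.array(trajectory)

    # Example: compress a noisy sine wave into a 6-coefficient Legendre
    # memory spanning a sliding window of theta = 2 time units.
    t = np.linspace(0, 10, 1000)
    signal = np.sin(t) + 0.1 * np.random.randn(t.size)
    memory = lmu_memory(signal, order=6, theta=2.0, dt=t[1] - t[0])
    print(memory.shape)  # (1000, 6)

Each row of memory summarizes the most recent theta time units of the signal in just six numbers; a downstream nonlinear layer then learns to classify or predict from that compressed history.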

LMUs can be used wherever LSTMs are used, such as classifying, learning from, and processing time-series data in speech recognition, control systems, sensor analysis, robotic motion, handwriting, and music processing. The game-changing benefits include longer battery life for smartphones and smart speakers, and longer range for autonomous cars, thanks to the large reduction in AI compute power.
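To illustrate how drop-in the replacement is, the sketch below swaps an LSTM layer for an LMU layer in a small Keras sequence classifier. It assumes ABR's open-source keras-lmu package (pip install keras-lmu); the layer and parameter names follow that package's documentation, while the feature and class counts are hypothetical values chosen for illustration:

    import tensorflow as tf
    import keras_lmu  # ABR's Keras implementation of the LMU

    # Hypothetical task sizes: 100 frames of 40 audio features, 10 keywords.
    n_steps, n_features, n_classes = 100, 40, 10

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(n_steps, n_features)),
        # Where an LSTM would normally go, e.g. tf.keras.layers.LSTM(212):
        keras_lmu.LMU(
            memory_d=1,        # independent memory channels
            order=256,         # Legendre coefficients kept per channel
            theta=n_steps,     # sliding-window length, in time-steps
            hidden_cell=tf.keras.layers.SimpleRNNCell(212),
        ),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

Because the LMU's window memory is a fixed, mathematically derived linear system rather than a set of learned gates, it can represent long windows with fewer trained parameters, which is one source of the scalability advantage described above.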

About Applied Brain Research Inc. (ABR)

ABR’s team of world-leading neuromorphic researchers has more than a century of combined experience researching how brains compute so efficiently. ABR brings that experience to developing its leading-edge AI hardware and software computing platform. ABR makes Nengo, the world’s leading multi-platform, visual neuromorphic software compiler, runtime, and spiking deep-learning edge-AI platform. ABR’s team built Spaun, the world’s largest functional brain model, using Nengo. ABR also builds real-time, full-loop AI “brains” using Nengo for customers in the military, self-driving car, IoT, and smartphone markets.

For more information about ABR, visit: www.AppliedBrainResearch.com

For more information about the ABR LMU:

  1. ABR LMU page: https://appliedbrainresearch.com/services/lmu/
  2. 2-minute LMU explanation video: https://www.youtube.com/watch?v=8t64QaTdBcU
  3. Example using LMU in Nengo: https://www.nengo.ai/nengo/examples/learning/lmu.html
  4. Example using LMU in NengoDL: https://www.nengo.ai/nengo-dl/examples/lmu.html
  5. NeurIPS 2019 LMU paper: https://papers.nips.cc/paper/9689-legendre-memory-units-continuous-time-representation-in-recurrent-neural-networks.pdf
  6. NeurIPS 2019 spotlight LMU talk: https://www.youtube.com/watch?v=YACjcStHYGY

Media contact

Peter Suma, co-CEO
peter.suma@appliedbrainresearch.com
+1-416-505-8973

About Peter Suma

Peter Suma is Chair of the Board and Co-Founder of Applied Brain Research. Prior to ABR, Peter led start-ups in robotics and financial services and managed two seed venture-capital funds. Peter holds degrees in systems engineering, science, law, and business.
