From Dr. Chris Eliasmith of ABR: We have recently discovered a new kind of neural network, called the Legendre Memory Unit (LMU), that is provably optimal for compressing streaming time-series data. In this talk, I describe this network and a variety of state-of-the-art results that have been set using the LMU. I will include recent results on speech and language applications that demonstrate significant improvements over transformers. I will also describe the new ASIC design we have developed to implement this architecture directly in hardware, enabling new large-scale functionality at extremely low power and latency.

Professor Chris Eliasmith is co-CEO and President of Applied Brain Research, an advanced AI company.
He is the co-inventor of the Neural Engineering Framework (NEF), the Nengo neural development environment, and the Semantic Pointer Architecture, all of which are dedicated to leveraging our understanding of the brain to advance AI efficiency and scale. His team has developed Spaun, the world's largest functional brain simulation. He won the prestigious 2015 NSERC Polanyi Award for this research.

Chris has published two books and over 120 journal articles and patents, and holds the Canada Research Chair in Theoretical Neuroscience. He is jointly appointed in the Philosophy and Systems Design Engineering faculties, and cross-appointed to Computer Science. He is the founding director of the Centre for Theoretical Neuroscience (CTN) at the University of Waterloo. Chris has an Erdős-Bacon number of 8.

Watch the talk with Dr. Chris Eliasmith, co-founder and co-CEO of ABR, hosted by San Francisco Bay ACM.
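For readers curious how the LMU compresses a streaming signal, here is a minimal NumPy sketch of its linear memory dynamics. It assumes the commonly published formulation, in which a state vector m of d Legendre coefficients summarizes the last theta seconds of input via fixed state-space matrices A and B; the function names (`lmu_matrices`, `lmu_step`) and the simple forward-Euler discretization are illustrative choices for this sketch, not ABR's implementation.

```python
import numpy as np

def lmu_matrices(d):
    # Build the fixed (A, B) state-space matrices of order d for the
    # Legendre delay system, per the commonly published LMU formulation
    # (assumed here): A[i,j] = (2i+1) * (-1 if i < j else (-1)^(i-j+1)),
    # B[i] = (2i+1) * (-1)^i.
    q = np.arange(d)
    r = 2 * q + 1
    A = np.zeros((d, d))
    for i in range(d):
        for j in range(d):
            A[i, j] = r[i] * (-1.0 if i < j else (-1.0) ** (i - j + 1))
    B = (r * (-1.0) ** q).astype(float)
    return A, B

def lmu_step(m, u, A, B, theta, dt):
    # Advance the memory state m by one input sample u using a simple
    # forward-Euler step; theta is the sliding-window length in seconds.
    # m then holds d Legendre coefficients compressing the last theta
    # seconds of the stream.
    return m + (dt / theta) * (A @ m + B * u)

# Example: compress a sine wave into d = 4 coefficients.
d, theta, dt = 4, 1.0, 0.001
A, B = lmu_matrices(d)
m = np.zeros(d)
for t in range(1000):
    m = lmu_step(m, np.sin(0.01 * t), A, B, theta, dt)
```

The point of the sketch is that the memory is a fixed linear system: only d numbers are stored regardless of how long the stream runs, which is the sense in which the compression is optimal for a given window length and order.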