Meet Pawel Jaworski, Neurorobotics Engineer III at Applied Brain Research (ABR). Pawel is a seasoned expert in developing and implementing neuromorphic and neurorobotic applications. His work on the autonomous systems team has focused on developing state-of-the-art neural networks running on neuromorphic hardware to realize novel, embedded autonomous applications. As the first hire at ABR in 2015, Pawel has been at the forefront of wrangling robots, developing neuromorphic algorithms, and building some of the world's most advanced neuromorphic applications. His research interests lie in adaptive and predictive control, sensory-motor system integration, and their applications in real-world robots and drones. Join us in recognizing Pawel's valuable contributions to the world of neuromorphic engineering and robotics!
I was offered my position at ABR back in 2015, right after the company was founded; I was actually the first person hired outside of the founders. I had just finished my undergrad in Nanotechnology Engineering, a program eclectic in both the scope and depth of its fields. At the time, I was having trouble finding a career that would fit my wide range of interests and skills. I was fortunate enough to have two job offers: one from ABR and one from a large local company.
“Many people would have looked at the larger company as a safer bet over the small startup (at the time), but I couldn’t get over how interesting ABR’s job description sounded. It mentioned developing and prototyping a state-of-the-art robotic arm, hardware and software integration, control theory, and the whole neuromorphics edge, which was new to me.”
I have always been an engineer at heart, tinkering with things from a young age and wanting to know how they worked. As I got older, my interest in various engineering fields grew, but I was always disappointed I didn’t get to spend more time learning programming. I learned a lot in my undergrad, but one thing I’ve learned since joining ABR is that you can’t get away from software. I had used Matlab in my undergrad and some C++ through hobby projects, but the job at ABR was programming on a systems scale. I ultimately wanted to put together large integrated systems through some combination of hardware, electronics, and software.
“The job at ABR was the perfect fit and has kept my interests piqued ever since.”
I work on the Autonomous Team at ABR as a Neurorobotics Engineer III. I've been given many unofficial titles; my favourites are "human Swiss army knife" and "robot wrangler." Basically, I fall into a jack-of-all-trades role, focusing on robotic applications for the tech ABR is developing. My work can range quite a bit: I've built photo-realistic simulation environments to train our robots, developed and implemented control algorithms for robotic arms and drones, and I constantly dabble in new research coming out of the labs at UWaterloo to see how it can be applied in the real world. My main responsibilities are developing robotics applications and algorithms and building real-world demonstrations.
“For example, I was part of the team at ABR to claim the title of first in the world to control a robotic arm with neuromorphic algorithms running off Intel’s neuromorphic chip, Loihi, when it was released.”
“ABR has a culture unlike anywhere I’ve worked. Some of the most brilliant people I’ve met work here and are always willing to share that knowledge.”
There have been many times when I've sought help understanding the difficult concepts that come up in the cross-disciplinary field we work in. From the very beginning, everybody was understanding of any gaps in knowledge and more than willing to help teach.
“ABR hires problem solvers and life-long learners. They understand that the power of continual learning and problem-solving can overcome any current limitations you may have. They want to be able to give you an interesting problem to solve and let you run with it.”
Before starting at ABR in 2015, I had never worked with neural networks or even written a single Python script. I came from an eclectic program, nanotechnology engineering; by eclectic I mean it covered a large breadth of fields, from materials science to programming and simulations, electronics, and more.
Through my work at ABR, I have greatly expanded my skill set and have gained the confidence to lead the design and development of large projects as a Professional Engineer. A wise man once told me, "The way you eat a whale is one bite at a time." I have really begun to embody that saying, as many of the problems I face at ABR seem daunting in scope at first. Often, they are open research questions, and you can't do a simple search on Stack Overflow or Google. However, after years of facing these problems and coming out on top, the way they get solved has remained constant: one small step at a time. This has greatly improved my problem-solving and debugging, and my ability to think through possible eventualities when planning the step-wise process of creating these large systems.
I grew to enjoy the work so much that in 2020, while still working, I returned to school to get a Master's in Systems Design Engineering. Without the experience I had gained at ABR, doing my own research for a Master's would have felt insurmountable. Instead, I greatly enjoyed working away at my own problem, the same way I have at ABR so many times before.
“I am currently working on what has been my favourite project to date. For the past two years, the autonomous team has been developing an autonomous search and rescue drone to aid in flood disaster scenarios.”
I began by developing a photorealistic town in Unreal Engine that can be flooded. There are civilians that exhibit the behaviour we would expect in a flood, including climbing onto rooftops, seeking high ground, clustering with other civilians, and wandering. We added rescuers that try to find the civilians with basic AI search and navigation.
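The civilian behaviours described above can be pictured as a small state machine. The sketch below is purely illustrative, assuming a grid-world agent with made-up state names and transition thresholds; it is not ABR's actual Unreal Engine implementation.

```python
# Hypothetical flood-scenario civilian agent: a simple state machine over the
# behaviours mentioned in the text (climbing roofs, seeking high ground,
# clustering, wandering). All names and thresholds are assumptions.

class Civilian:
    def __init__(self, position):
        self.position = position   # (x, y) grid coordinates
        self.state = "wander"

    def update(self, water_level, nearby_civilians, roof_nearby):
        # Rising water drives the agent toward safety; otherwise it
        # groups with others or wanders, as in the simulated town.
        if water_level > 0.5 and roof_nearby:
            self.state = "climb_roof"
        elif water_level > 0.2:
            self.state = "seek_high_ground"
        elif nearby_civilians:
            self.state = "cluster"
        else:
            self.state = "wander"
        return self.state

agent = Civilian(position=(3, 7))
print(agent.update(water_level=0.6, nearby_civilians=[], roof_nearby=True))
# → climb_roof
```

A behaviour tree would be the more idiomatic choice inside Unreal Engine itself, but a flat state machine makes the transitions easy to see.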
This became our test bed for developing eight different modules for the drone. I recently completed the last of these modules, so now I get to move on to the fun part: building the drone "brain" by combining all of these various modules. With the addition of a neuromorphic chip, the algorithms we developed for these modules would allow the entire brain to run locally on the drone without needing an external server for processing.
Ultimately, we’ll have a drone that can perform autonomous searches for civilians in a smart and interactive way. You can talk to it with natural language and tell it things like, “People will likely be climbing buildings or searching for high ground.” The drone will use this to select optimal targets and paths to minimize search times. It will be able to autonomously localize itself and navigate these paths while adaptively flying in high-speed winds, including sudden gusts and wind flow around buildings. It also uses really cool tech like spiking cameras that the drone can use to find civilians in really low-light conditions, like rainstorms.
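One simple way to picture how a hint like "people will likely be climbing buildings" could shape the search: boost the prior probability of matching location types, then visit targets greedily by probability per unit of travel. This is a toy model under my own assumptions, not ABR's planner; the location types, priors, and boost factor are all invented for illustration.

```python
import math

# Hypothetical prior-weighted search-target selection. Each candidate location
# has an assumed base probability of holding a civilian; a natural-language
# hint boosts matching location types, and the drone orders targets greedily
# by probability-per-travel-distance.

def plan_search(drone_pos, locations, boosted_types, boost=2.0):
    """Order locations to check, favouring likely and nearby ones."""
    def score(loc):
        p = loc["prior"] * (boost if loc["type"] in boosted_types else 1.0)
        dist = math.dist(drone_pos, loc["pos"]) or 1e-6
        return p / dist   # expected detections per unit of travel
    return sorted(locations, key=score, reverse=True)

locations = [
    {"name": "rooftop A", "type": "rooftop",     "pos": (2, 1), "prior": 0.3},
    {"name": "park",      "type": "open",        "pos": (1, 1), "prior": 0.3},
    {"name": "hill",      "type": "high_ground", "pos": (5, 5), "prior": 0.2},
]

# Hint: "people will likely be climbing buildings or searching for high ground"
plan = plan_search((0, 0), locations, boosted_types={"rooftop", "high_ground"})
print([loc["name"] for loc in plan])
# → ['rooftop A', 'park', 'hill']
```

A real planner would also account for battery, sensor footprint, and wind, but the core idea of re-weighting search priors from a verbal hint carries over.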
On top of all of that, it has this reasoning system that it can use to predict where rescuers and civilians are moving based on their past behaviour. It can use this as an early warning system to prevent civilians and rescuers from entering hazardous scenarios. Part of the fun has been thinking about how we can combine these modules in interesting ways to expand the capabilities of the drone. I can go on and on about this project, as it’s been a really fun one and will serve a great purpose.
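The early-warning idea, predicting where people are heading from their past movement and flagging paths into danger, can be sketched minimally. Here a constant-velocity extrapolation stands in for the actual reasoning system, which the text does not detail; the hazard-zone format and function names are my own assumptions.

```python
# Minimal sketch of predictive early warning: extrapolate each person's
# recent motion and flag anyone whose predicted path enters a hazard zone.
# Constant-velocity prediction is a deliberate simplification.

def predict(track, steps):
    """Linear extrapolation from the last two observed (x, y) positions."""
    (x0, y0), (x1, y1) = track[-2], track[-1]
    vx, vy = x1 - x0, y1 - y0
    return [(x1 + vx * k, y1 + vy * k) for k in range(1, steps + 1)]

def warn(track, hazard_zones, steps=3):
    """Return True if the predicted path enters any hazard rectangle."""
    for x, y in predict(track, steps):
        for (xmin, ymin, xmax, ymax) in hazard_zones:
            if xmin <= x <= xmax and ymin <= y <= ymax:
                return True
    return False

rescuer_track = [(0, 0), (1, 1)]   # moving toward the northeast
flooded_block = [(2, 2, 4, 4)]     # hazardous flooded area, (xmin, ymin, xmax, ymax)
print(warn(rescuer_track, flooded_block))
# → True: the predicted path crosses the flooded block
```

Swapping in a learned motion model for `predict` is the obvious upgrade path, while `warn` stays unchanged.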
I think everybody really started to take notice of the power of AI with the recent releases of ChatGPT running on GPT-3.5 and GPT-4. It is amazing what these models can do and what the engineers at OpenAI were able to achieve. But what excites me is the possibility of running these larger models without needing a building filled with high-end computers, and without the enormous power requirements of current hardware. Did you know training GPT-3 took as much power as 120 homes would use in an entire year?
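As a rough sanity check of that comparison, we can plug in two commonly cited estimates: roughly 1,287 MWh to train GPT-3, and roughly 10,600 kWh of electricity used by an average US household per year. Both figures are third-party estimates, not official numbers from OpenAI.

```python
# Back-of-envelope check of the "120 homes for a year" claim, using
# commonly cited estimates (both are assumptions, not official figures).

gpt3_training_kwh = 1_287_000     # ~1,287 MWh, a widely cited estimate
household_kwh_per_year = 10_600   # approximate US average annual usage

homes = gpt3_training_kwh / household_kwh_per_year
print(round(homes))
# → 121, consistent with the "about 120 homes" comparison
```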
“I am very excited about the release of several upcoming chips specifically for AI, including ABR’s Time Series Processor (TSP) chips. These chips are specifically made to run AI algorithms efficiently, and will let us run a whole new level of advanced AI with a significant reduction in carbon footprint.”
Similar to how Microsoft came along in the 1970s and created a user-friendly way of interacting with those complicated new machines called computers, I see ABR bridging the gap in getting neural networks onto edge hardware. ABR has the advantage of developing both the chip and the compiling software: unlike many hardware companies, ABR has spent 10+ years developing Nengo, a software compiler that lets neural networks run on all sorts of hardware.
“We’ve recently released a beta version of NengoEdge, which provides a no-code method for training and deploying advanced AI on different edge devices, including Arm’s Cortex-M4 and Cortex-M7 cores, Google’s TPU, and ABR’s upcoming TSPs. All of the quantization and hardware-specific training necessary is handled automatically, and you get performance, latency, and power measurements without needing to buy, and learn how to program, all this hardware yourself.”
We’re able to build this because of the unique experience of the ABR team, which has been developing algorithms and applications for neuromorphic and edge AI hardware for over a decade. In fact, because of that experience, Intel reached out to us when it released Loihi to figure out how to compile neural networks onto it. Within a few months, we had demos of adaptive control on a robotic arm, a basic reasoning system that could play tic-tac-toe, and a keyword spotter.
“With all of the lessons we’ve learned working with various chips, the hardware team at ABR has been hard at work developing two of our own chips. I’m very excited to start playing with these and putting them onto all of my robots. ABR is going to take AI out of the cloud and put it into the hands of the user.”