By David Goddard. Photography by Shawn Poynter.
Neuromorphic computing has emerged in recent years as a promising way to perform intelligent computation more efficiently and quickly than conventional computing allows.
Getting there will require thinking outside the box about what computers are, what they can be, and even whether they can be taught to learn. EECS Assistant Professor Katie Schuman might just have those answers, or at least a way to unlock them.
“The challenge is to be able to develop computers that can learn through data as opposed to being programmed, which would open up the possibility that you could then have a device that analyzed data in the moment, in real time, instead of the long stretches that traditional computing takes,” Schuman said. “Doing so not only gives you the results you need quicker, but with brain-inspired computation you would also have a decrease in the amount of energy you needed to use. It’s a double win.”
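Neuromorphic systems are typically built from spiking, brain-inspired units rather than conventional logic. As a rough illustration only, not drawn from Schuman’s research, the simplest common spiking model, a leaky integrate-and-fire neuron, can be sketched in a few lines of Python (all parameter values here are arbitrary assumptions):

```python
# Illustrative sketch of a leaky integrate-and-fire (LIF) neuron, the
# basic unit in many spiking/neuromorphic models. Parameters are
# arbitrary for demonstration, not taken from any real hardware.

def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Simulate one LIF neuron over a sequence of input currents.

    Each step, the membrane potential decays by `leak`, accumulates
    the input current, and emits a spike (1) when it crosses
    `threshold`, resetting afterward; otherwise it emits 0.
    """
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0  # reset after firing
        else:
            spikes.append(0)
    return spikes

# Small constant inputs accumulate until the neuron fires once.
print(lif_neuron([0.3, 0.3, 0.3, 0.3, 0.0, 0.9]))  # → [0, 0, 0, 1, 0, 0]
```

The event-driven nature of such neurons (computation happens only when spikes occur) is one source of the energy savings Schuman describes.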
One of Schuman’s first steps is identifying new approaches to neuromorphic computing that capture the advantages of machine learning while leveraging high-performance computing.
She has identified potential roadblocks that must be overcome if neuromorphic computing is to reach its full potential, including limited access to neuromorphic hardware and the time spent developing simulations, both of which have slowed the development of new algorithms.
Additionally, Schuman points out the need for a better knowledge base surrounding the algorithms that will define neuromorphic computing and self-learning programs.
“To get to where we want to be, we need to fully explore simulations of neuromorphic hardware and develop a way to assess algorithms and the capabilities of neuromorphic systems,” Schuman said. “Doing these things will allow us to create new systems capable of learning as they go, independent of human input. A lot of work has been done in the area of neuromorphic hardware, but the learning algorithms are a critical component for the success of neuromorphic computing moving forward.”
The US Department of Energy (DOE) saw the potential in Schuman’s work, awarding her a $450,000 grant through its Office of Science to pursue her research ideas. That is significant beyond the funding itself, because the DOE operates the programs that have produced some of the world’s fastest machines over the years, including Frontier, currently the world’s fastest, and its predecessors, all of which have been housed at ORNL.
The support highlights both the importance of the idea to the DOE and the strength of Schuman’s concepts.
“With one of the DOE’s focuses being high-performance computing and making sure the US stays at the front of research and development related to it, coming up with new breakthroughs that push the boundaries of computing is vital,” she said. “And it isn’t about simply having the fastest computer but more about what that allows. Any application or use case that is collecting data is game for improvement when the data can be analyzed in real time.”
That means everything from medicine to transportation to neutron detection could benefit from her research.