The University of Illinois has been a champion of supercomputing since 1985, when the National Center for Supercomputing Applications (NCSA) became part of the National Science Foundation Network (NSFNET) – at a time when the internet and modern computing were still in their early stages of development. Illinois continued to advance the computational game when the first widely used web browser, Mosaic, was built. However, there is one computer even these researchers can't seem to beat: the human brain. For this reason, Illinois ECE student Noyan Cem Sevuktekin is looking to learn from it instead.
In his paper, “Signal Processing Foundations for Time-Based Signal Representations: Neurobiological parallels to engineered systems designed for energy efficiency or hardware simplicity,” Sevuktekin, along with CSL faculty members Andrew Carl Singer, Lav R. Varshney, and Pavan Kumar Hanumolu, delves into an area not widely discussed among engineers, but one the brain has mastered: encoding information in the timing, rather than the amplitude, of its information-bearing signals. While it may be common practice to infer meaning from social communication based on its tempo, or timing, traditional engineered communication systems have used signal levels, or amplitudes, to convey information.
“To better understand time-based signal representations, consider Morse code,” shared Sevuktekin, a graduate student studying electrical and computer engineering. For example, the distress signal “SOS” can be spelled out through a sequence of three dots, three dashes, and three dots – in other words, three short pulses, three long pulses, and three short pulses. When communicating in Morse code, it is the duration of the dots and dashes, not the volume or strength (amplitude) of the signal, that conveys the message content. The same principle applies to non-verbal aspects of conversation, such as chronemics.
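As a toy illustration (not from the paper), the Morse example can be sketched in a few lines of Python: “SOS” becomes a sequence of on/off durations, so all of the message content lives in timing rather than amplitude. The unit conventions (dot = 1 unit, dash = 3 units, 1 unit of silence between elements, 3 between letters) follow standard Morse timing.

```python
# Minimal sketch: encode a word as (on, off) durations in Morse time units.
# Every pulse has the same amplitude; only the durations carry information.
MORSE = {"S": "...", "O": "---"}

def to_durations(word):
    """Return a list of (on, off) duration pairs for each dot/dash."""
    out = []
    for i, letter in enumerate(word):
        elements = MORSE[letter]
        for j, e in enumerate(elements):
            on = 1 if e == "." else 3                     # dot = 1, dash = 3
            if j < len(elements) - 1:
                off = 1                                   # gap within a letter
            elif i < len(word) - 1:
                off = 3                                   # gap between letters
            else:
                off = 0                                   # end of message
            out.append((on, off))
    return out

print(to_durations("SOS"))
# [(1, 1), (1, 1), (1, 3), (3, 1), (3, 1), (3, 3), (1, 1), (1, 1), (1, 0)]
```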
“If someone asks me a question, the answer might lie in my response,” said Singer, Associate Dean for Innovation and Entrepreneurship and the Fox Family Professor in Electrical and Computer Engineering. “However, the time it takes me to deliver any response at all might be what is more informative.”
This concept is one of the primary mechanisms that mammalian brains use to communicate. Neurons fire, and the brain extracts information from when they fire or the rate at which they fire. Exploring how the brain – a small, yet highly energy-efficient and computationally adept data processor – functions can offer a valuable new pattern of thought in engineering, and it is already starting to appear in a variety of smaller electronic devices.
Many biomedically implanted devices (think: pacemaker) and other IoT devices now employ time-based representations. This means out with the complex circuits needed for amplitude-based signals (such as digital-to-analog converters), and in with much simpler signal representations like pulse-width or pulse-density modulation. Why? It all comes down to power efficiency.
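To make the idea concrete, here is a minimal sketch (an illustration, not the paper's method) of pulse-width modulation: a value is represented by how long a single on/off switch stays “on” within each period, rather than by a precise voltage level.

```python
def pwm_encode(value, period=8):
    """Encode a value in [0, 1] as a pulse-width-modulated bit pattern:
    the switch is 'on' for a fraction of the period proportional to value."""
    on_slots = round(value * period)
    return [1] * on_slots + [0] * (period - on_slots)

def pwm_decode(pattern):
    """Recover the value as the duty cycle: the fraction of 'on' slots."""
    return sum(pattern) / len(pattern)

signal = pwm_encode(0.75, period=8)
print(signal)                # [1, 1, 1, 1, 1, 1, 0, 0]
print(pwm_decode(signal))    # 0.75
```

The encoder only ever drives the output fully on or fully off, which is exactly why such schemes suit low-power hardware: no precision analog circuitry is needed.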
“It’s a lot easier to turn a switch on and off to make a signal, and on and off multiple times for larger signals, than it is to create varying amplitudes,” Singer said. “From a biological perspective, this is not surprising, considering that the brain requires just a bowl of Cheerios and a glass of orange juice to operate.”
This research on neurobiological parallels was an extension of Sevuktekin’s master’s thesis on pulse-width modulation, and he is not done yet.
“There is a certain stochastic extension of this,” Sevuktekin explained. “The Poisson spiking model – the idea that one spike happens and then the next spike happens independently of the past – lends itself to time-based signals quite naturally. An analysis of this could be interesting to develop from an engineering perspective.”
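The memorylessness Sevuktekin describes can be sketched in a few lines: in a homogeneous Poisson process, inter-spike intervals are independent exponential random variables, so each spike's timing is independent of the past. This is an illustrative sketch of the standard model, not code from the paper.

```python
import random

def poisson_spike_train(rate, duration, seed=0):
    """Generate spike times for a homogeneous Poisson process.
    Inter-spike intervals are i.i.d. exponential with mean 1/rate,
    so each spike occurs independently of everything before it."""
    rng = random.Random(seed)
    t, spikes = 0.0, []
    while True:
        t += rng.expovariate(rate)   # draw the next inter-spike interval
        if t >= duration:
            return spikes
        spikes.append(t)

# Roughly `rate * duration` spikes are expected over the window.
train = poisson_spike_train(rate=10.0, duration=1.0)
print(len(train), train[:3])
```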
As more devices utilize time-based signals, this paper comes as a timely and important reminder. While the engineering community has historically prioritized the amplitude of signals over their timing, incorporating outside fields of study and challenging traditional research methods can lead to new breakthroughs, applications, and more.
This research was funded in part by the Systems on Nanoscale Information Fabrics (SONIC) Center and the resulting paper was published by the Institute of Electrical and Electronics Engineers (IEEE). Hanumolu is the Intel Alumni Scholar in Electrical and Computer Engineering and Varshney is an assistant professor in electrical and computer engineering.