Researchers explore the mystery of neuron coding and its applications
When we experience a sensory stimulus, such as a baby crying or a bite of pizza, the neurons in our brain fire impulses rapidly to receive and interpret the signal. How these impulses are timed, and how that timing can be replicated artificially, has been mathematically determined by a team of researchers composed of ECE ILLINOIS and CSL Professor and ADSC director Douglas L. Jones, ECE ILLINOIS alumnus Erik Johnson (BSEE ’08, MSCompE ’13, PhD CompE ’16), and Rama Ratnam. These CSL affiliates have been developing their research in the Health Care Engineering Systems Center.
“There was a major question in neuron function that had not been answered,” said Ratnam, a senior research scientist at the University of Illinois at Urbana-Champaign. “What is the neural code? That is, how is a sensory signal from the external world converted or coded in the form of a neural signal? Oddly enough, we do not really know.”
Engineers, Ratnam explains, often take coding for granted because they design the code themselves; there is no mystery in, say, digitally encoding an analog music signal. The neural code, however, is far older and was not designed by humans.
“The neural code evolved over evolutionary time scales, possibly 550-650 million years ago when box and comb jellies first came into being,” said Ratnam. “What makes the neural code hard to understand is that the encoding is not in a form that is analog or digital. We wonder, how do the neurons in the brain translate and interpret the sensory signal? Mechanically, when we hear a sound, how do we know who’s saying it, what they’re saying, and how they’re saying it, in real-time?”
The challenge researchers encountered with sensory neurons is understanding how they encode and decode—or how they receive information and send it back out as a signal for other neurons. In artificial systems, encoding and decoding are closely coupled. Systems can’t receive (encode) a JPG, for example, and play it back (decode) as an MP3 file.
“You can’t swap one format for the other in artificial systems. They have to be closely coupled,” explained Johnson. “Neuron encoding and decoding have to be closely coupled as well.”
The researchers argue that a neuron both encodes its input as a series of impulses, and simultaneously decodes the impulses so that it can compare the quality or fidelity of its coding.
“This is a radical departure from the recognized view of neural coding where encoding and decoding take place in separate neurons,” said Ratnam.
Why is this important? In the digital coding of analog signals, there is a cost associated with coding the signal. The better the quality of the coding, the higher the so-called bit-rate, and the greater the cost. Digital coding is a trade-off between the bit-rate and the quality we can live with. The team argues that this is the case with neural coding as well.
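The bit-rate/quality trade-off described above can be illustrated with a simple uniform quantizer. This is only an illustrative sketch of digital coding in general, not the coding scheme the team studied; the signal and bit depths are made up for the example.

```python
import math

def quantize(signal, bits):
    """Uniformly quantize samples in [-1, 1] with the given bit depth."""
    levels = 2 ** bits
    step = 2.0 / levels
    return [round((x + 1.0) / step) * step - 1.0 for x in signal]

def rms_error(a, b):
    """Root-mean-square error between the original and its reconstruction."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

# A toy analog signal: one cycle of a sine wave.
signal = [math.sin(2 * math.pi * n / 64) for n in range(64)]

# More bits per sample (a higher bit-rate) buys a smaller error.
for bits in (2, 4, 8):
    err = rms_error(signal, quantize(signal, bits))
    print(f"{bits} bits/sample -> RMS error {err:.4f}")
```

Spending more bits per sample shrinks the quantization step and therefore the reconstruction error, which is exactly the cost-versus-quality trade-off the researchers argue neurons also face.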
They have determined that neurons have effectively found the most efficient strategy to encode and transmit analog signals, using the rate of impulse generation to control the fidelity of the coding. The fidelity is monitored through an internal decoding process, via what is called a threshold mechanism.
A key insight is that the neuron's output is regulated, providing control over the average rate of impulses and the energy consumption of neurons. This makes the process of encoding and decoding as energy-efficient as possible. Thus, for a given rate of impulse generation, a neuron will encode a signal with the highest fidelity. This is similar to source coding or data compression.
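One simple way to picture an encoder that monitors its own decoding is a send-on-delta (threshold-crossing) scheme: the neuron-like unit keeps a running reconstruction of its own output and fires an impulse only when that reconstruction drifts too far from the input. The sketch below is an illustration under that assumption, not the team's published model; the threshold values and test signal are invented for the example.

```python
import math

def threshold_encode(signal, threshold):
    """Fire an impulse whenever the internally decoded estimate
    drifts more than `threshold` from the input (send-on-delta)."""
    estimate = 0.0   # the unit's own running decode of its impulse train
    impulses = []    # (time index, +1 or -1) impulse events
    decoded = []
    for t, x in enumerate(signal):
        error = x - estimate
        if abs(error) >= threshold:
            step = math.copysign(threshold, error)
            estimate += step  # internal decode is updated by each impulse
            impulses.append((t, 1 if step > 0 else -1))
        decoded.append(estimate)
    return impulses, decoded

signal = [math.sin(2 * math.pi * n / 100) for n in range(100)]

# A smaller threshold means more impulses (a higher rate) and a more
# faithful reconstruction -- fidelity is set by the impulse rate.
for thr in (0.5, 0.1):
    impulses, decoded = threshold_encode(signal, thr)
    err = max(abs(x - y) for x, y in zip(signal, decoded))
    print(f"threshold {thr}: {len(impulses)} impulses, max error {err:.3f}")
```

Lowering the threshold raises the impulse rate and the energy cost while tightening the reconstruction, mirroring the rate-versus-fidelity control the researchers describe.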
Based on this fundamental knowledge of neurons, Ratnam, Johnson, and Jones have derived the mathematically optimal code for timing the impulses output by a neuron.
“There is a mathematically optimum strategy for coding and transmitting signals in the most ideal way, and neurons have evolved biophysical mechanisms to find that way,” said Ratnam. “Neurons are not noisy or unreliable coders as is believed. They are precise devices, and they encode information with the fidelity needed for a given physiological function.”
This is, according to the researchers, biology's answer to digital coding and data compression. Without it, the brain would need too many neurons and become much too big, and it would take up too much of our metabolic energy.
Knowing this code has significant applications in prosthetics and other artificial devices.
“With this knowledge, we can work toward creating prosthetics to replace damaged sensory organs—could be eyes, ears, anything,” said Ratnam. “If we can generate the optimum way of representing the signals to stimulate the neurons in the brain, it could make the brain think it hasn’t lost anything.”
In other words, the team is building prosthetics to send signals in the same way our normal organs do so that our brain cannot tell the difference—helping to replace the functionality of what has been lost.
“This work is grounded in fundamental neuroscience work and engineering theory,” said Johnson. “And this collaboration in disciplines is showing applications in a wide range of sensory systems. It’s really exciting.”
This work has been published in the Journal of Computational Neuroscience, Frontiers in Computational Neuroscience, and elsewhere, and further work is under review at PLOS Computational Biology and Ear and Hearing.
Read the original article on the CSL website.