January 29, 2025 · Lauren Laws · 4 min read
Curiosity is a gateway to discovery and innovation. It’s a key component of breakthroughs and ideas, and helps push through setbacks.
For Maxim Raginsky, curiosity is a driving factor of his research, leading to advancements in machine learning, optimization, and control. It has earned him a variety of accolades, now including an honor reserved for only a select few: being named an IEEE Fellow in the class of 2025.
According to the IEEE, Fellows are named for their important contributions in the advancement or application of engineering, science and technology, bringing realizations of significant value to society at large. Raginsky was honored “for contributions to information-theoretic analysis of stochastic systems in optimization and machine learning.”
He has spent two decades exploring the limits and design of algorithms and systems, specializing in research areas such as deterministic and stochastic dynamical systems in machine learning, statistical machine learning, and information theory.
“I was using information theory in order to pinpoint fundamental limitations of data collection, data processing and other components and resources in order to make better decisions, better predictions, or to achieve some control goals without full knowledge of what it is that you’re controlling,” said Raginsky, a professor in Illinois Grainger Engineering’s Department of Electrical and Computer Engineering and a researcher in the Coordinated Science Laboratory.
One of Raginsky’s influential research contributions involves the use of information theory to quantify how well machine learning algorithms can generalize from the data on which they were trained to previously unseen scenarios. His method has become standard in the field.
“If the algorithm’s decision is so sensitive to the particular data samples it sees, then you would not expect the algorithm to generalize well. Information theory gives you a way of making this intuition precise and quantitative,” said Raginsky. “The philosophy here is about using information theory to derive fundamental limits, but it also gives you a guide for designing better algorithms because you can explicitly build on this notion of not processing all the data in such a way that the algorithm’s decision depends sensitively on each individual sample.”
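The intuition in the quote above is often made precise in the information-theoretic generalization literature with a bound of roughly the following form; the exact statement here is a standard textbook version and an assumption on my part, not quoted from the article:

```latex
% A mutual-information generalization bound (a common form; assumes the
% loss is \sigma-subgaussian under the data distribution).
% S = (Z_1, \dots, Z_n): the training sample;  W: the algorithm's output.
\bigl| \mathbb{E}\,[\operatorname{gen}(W, S)] \bigr|
  \;\le\; \sqrt{\frac{2\sigma^{2}}{n}\, I(S; W)}
```

Here \(I(S; W)\) is the mutual information between the training data and the learned hypothesis: if the algorithm's output depends only weakly on the particular samples it sees, \(I(S; W)\) is small and the expected generalization gap is correspondingly small, matching the design principle Raginsky describes.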
Raginsky admits that once he becomes interested in a topic, he learns as much about it as possible. However, curiosity isn’t the only thing that drives him.
“I really appreciate being able to pursue this kind of curiosity-driven research, but, on the other hand, I am fundamentally interested in what can actually be accomplished with reasonable physical resources, because that is the essence of engineering,” he said.
Raginsky credits his environment for fostering his ability to dive as deeply as he wants into his work and describes the collaborative culture at Illinois as “very rare.”
“I really value the intellectual environment here because it’s a unique place where you can pursue this kind of curiosity-driven research. There are like-minded people around you that you can bounce ideas off, get inspired by them, but also get reality checks,” said Raginsky. “This highly collaborative setting has been a tremendous benefit to me, both intellectually and professionally.”
Maxim Raginsky received the B.S. and M.S. degrees in 2000 and the Ph.D. degree in 2002 from Northwestern University, all in Electrical Engineering. He has held research positions with Northwestern, the University of Illinois Urbana-Champaign (where he was a Beckman Foundation Fellow from 2004 to 2007), and Duke University. In 2012, he returned to the U. of I., where he is currently a Professor with the Department of Electrical and Computer Engineering and the Coordinated Science Laboratory. He also holds a courtesy appointment with the Department of Computer Science. His interests cover probability and stochastic processes, deterministic and stochastic control, machine learning, optimization, and information theory. Much of his recent research is motivated by fundamental questions in modeling, learning, and simulation of nonlinear dynamical systems, with applications to advanced electronics, autonomy, and artificial intelligence.