Can robotic systems be “trusted”?
Rick Kubetz, Office of Engineering Communications
- ECE Professor Mark Spong is the director of the new Center for Autonomous Engineering Systems in the Information Trust Institute at Illinois.
- CAESAR will bring together researchers in robotics from computer science and engineering to build trustworthy autonomous systems.
- SToMP: Sensor Topology & Minimal Planning will be one of the first projects of CAESAR.
Sometimes it sounds like science fiction—turning over responsibilities to computer and robotic systems that work autonomously without human intervention. But can you trust a machine?
“Trust is a crucial attribute of autonomous agents and robotic systems,” explained Mark W. Spong, director of the new Center for Autonomous Engineering Systems and Robotics (CAESAR) in the Information Trust Institute (ITI) at Illinois. “Such systems are increasingly called on to perform critical tasks with little or no human intervention, such as search-and-rescue, emergency response, surveillance, telemedicine, agricultural robotics, and coordination of multiple vehicles.”
“CAESAR will bring together researchers in robotics with those in computer science and engineering, enabling us to build large-scale, autonomous systems that we can rely on,” stated Spong, a Donald Biggar Willett Professor of Engineering in the Department of Electrical and Computer Engineering, and a research professor at the Coordinated Science Laboratory and ITI. “The Center will serve as a focus for research collaboration in this area, fostering group interaction and facilitating the formation of project teams to work on major new initiatives.”
“CAESAR is poised to make robotic systems dramatically more trustworthy, making it possible to use them in applications previously considered much too critical for autonomous operation,” remarked ITI Director Bill Sanders. Established in 2004, ITI brings together over 60 faculty and senior researchers and more than 200 graduate students across campus to advance the state of the art in building systems that are quantifiably trustworthy, making them resilient to both accidental failures and malicious attacks. ITI now houses seven major centers in trust-related areas.
One of the first projects being tackled by CAESAR affiliates is SToMP: Sensor Topology & Minimal Planning, led by Professor Robert Ghrist of the Department of Mathematics. Funded by the Defense Advanced Research Projects Agency (DARPA), the SToMP program will take advantage of high-dimensional mathematical insights to create new Department of Defense capabilities to capitalize on emerging opportunities that involve sensors that are miniaturized, pervasive, and coordinated. As a multi-year, interdisciplinary project, SToMP brings together experts from the University of Illinois, Bell Labs/Lucent, Arizona State University, the University of Rochester, Carnegie Mellon University, the University of Melbourne, the University of Pennsylvania, and the University of Chicago.
“By developing new mathematical tools and techniques, the program will solve a multiplicity of network and sensing problems that arise in a variety of military situations,” explained Ghrist, principal investigator for SToMP. “Much of the current technology forces real-world problems into a one-dimensional network where the mathematical formulation has, by necessity, lost crucial information.” Both Ghrist and co-investigator Steven LaValle, an associate professor in computer science, are part of the CAESAR group.
According to Spong, the emerging area of Human-Robot Interaction (HRI) holds great promise and opportunity for CAESAR researchers.
“Developing a rigorous and robust approach to human-robot trust requires engineers, computer scientists, mathematicians, psychologists, and sociologists to work together in new and exciting ways,” Spong said. “Many tasks, such as remote manipulation and construction, cannot be accomplished by fully autonomous robots alone, because of security, complexity, remoteness, or other concerns. Instead, teams of humans and robots will need to work together, often in close proximity and often in safety-critical situations. Well-defined notions of trust between humans and machines need to be developed if such tasks are to be accomplished.”
The fruits of this research will also find their way into the classroom, preparing the next generation of students for the next generation of information systems.
“We already offer a significant number of courses, and we are looking to broaden the range of courses related to the general themes of ITI, such as critical infrastructures and homeland defense, embedded and enterprise computing, and multimedia and distributed systems.”