An Illinois CS team is giving robots a sense of touch

7/18/2024 Bruce Adams

Research at the Robotic Perception, Interaction, and Learning Lab (RoboPIL) at the Siebel School of Computing and Data Science focuses on enabling robots to better perceive and interact with the physical world. The team's goal is to significantly enhance robots' capabilities through world modeling, multimodal perception, and robotic foundation models.

When you pick up an object, you can nearly simultaneously grasp it, feel it, weigh it, and infer its material properties. That’s not so easy for a robot.

Research at the Robotic Perception, Interaction, and Learning Lab (RoboPIL) focuses on significantly improving robots’ capabilities to perceive and interact with the physical world. The lab is led by Yunzhu Li, formerly a Siebel School of Computing and Data Science professor affiliated with the Coordinated Science Laboratory. On April 4, 2024, Science published “Intelligent Textiles Are Looking Bright,” a paper by Li and Yiyue Luo of MIT. The paper described a “chipless fiber for wireless visual-to-digital transmission that senses interactions with the human body. The fibers can be woven into wearable fabric and could transform how people interact with the environment and with each other.” The challenge they faced was creating scalable tactile sensors that sense, store, process, and communicate information while interfering as little as possible with physical interactions.

Li brought his interest in tactile perception for robots from Cambridge to Urbana-Champaign when he joined the Illinois Grainger College of Engineering’s Department of Computer Science in August 2023. He says, “Developing tactile sensors is never easy. It requires interdisciplinary collaboration between people from different backgrounds. We need experts from materials science, mechanical engineering, and electrical engineering to develop the sensors with the right properties, and experts from machine learning and computer vision to interpret the data.”

The RoboPIL team has been developing novel tactile sensors that are low-cost and flexible, providing detailed and extensive coverage of physical contacts. The sensor array consists of 1,024 tactile sensing units integrated into a bimanual tactile sensing system. This dense, continuous array provides effective feedback: whether there is contact, how much force has been applied, and local contact patterns, such as touching a flat surface or a sharp edge. Tactile information complements visual observations, and together the two modalities are surprisingly effective for challenging manipulation tasks, such as handling fragile objects like eggs and fruit and manipulating tools and utensils in-hand.
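
The article does not describe the sensor processing in detail, but the kinds of feedback mentioned above can be illustrated with a short sketch. The example below is hypothetical: it assumes one hand’s readings arrive as a 32 x 32 grid of pressure values (32 x 32 = 1,024 sensing units, matching the count quoted above) and derives the three signals the article lists: whether contact has occurred, how much force is applied, and a rough local contact pattern. The grid shape, threshold, and pattern heuristic are illustrative assumptions, not the lab’s actual code.

```python
import numpy as np

# Hypothetical: one hand's tactile frame as a 32 x 32 grid of pressure values
# (32 * 32 = 1,024 sensing units, matching the count quoted in the article).
TAXEL_GRID = (32, 32)
CONTACT_THRESHOLD = 0.05  # illustrative pressure threshold, arbitrary units


def summarize_tactile_frame(pressure: np.ndarray) -> dict:
    """Extract the kinds of feedback described in the article from one frame."""
    assert pressure.shape == TAXEL_GRID
    contact_mask = pressure > CONTACT_THRESHOLD

    # Whether there is contact, and how much total force is being applied.
    in_contact = bool(contact_mask.any())
    total_force = float(pressure[contact_mask].sum())

    # A crude local-contact-pattern cue: a sharp edge activates a thin strip of
    # taxels, while a flat surface activates a broad patch.
    contact_area = int(contact_mask.sum())
    if not in_contact:
        pattern = "no contact"
    elif contact_area < 20:
        pattern = "point or edge contact"
    else:
        pattern = "flat surface contact"

    return {"in_contact": in_contact, "total_force": total_force, "pattern": pattern}


# Example usage with a simulated frame: a thin strip of pressure, as when
# the gripper closes on an edge.
frame = np.zeros(TAXEL_GRID)
frame[10, 5:20] = 0.3
print(summarize_tactile_frame(frame))
```

In the real system, signals like these would be fused with camera observations before being handed to the manipulation policy; the sketch only shows what the tactile side can contribute on its own.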

Photo Credit: Still from video provided by Yunzhu Li
A robot gripper handles an egg using images from textile sensors.

Another aspect of their research involves endowing robots with an intuitive understanding of their physical interactions as they work. Li explained, “Humans interact with objects that have complicated physical properties. For example, making dough and manipulating granular pieces involve complex and dynamic interactions with the environment. A human can do these tasks effortlessly because of intuitive models of the world. We can predict how the environment is going to change if we apply specific actions, for example, how the dough will deform and how the onion pieces will move around. This predictability helps us plan our behavior to do the task.”

Photo Credit: Still from video provided by Yunzhu Li
A robot arm arranges lettering and dumpling dough.

Li illustrated his point using a video. In the video, a robot attempts to roll dough into a dumpling while being constantly interrupted by a person. “What’s interesting about this video is that there is a person continually preventing the robot from doing its job. The robot is making decisions at both a high level, such as what tool to use, and a low level, such as what specific actions to take. The robot uses the current visual observation as input to understand its progress. This closed-loop feedback system allows the robot to be robust to external disturbances, enabling it to make real-time decisions. The key factor in this work is an intuitive model that predicts how the shape of the dough will change when the robot squeezes or pinches it. This forward predictive model can then be used to solve inverse problems, such as model-based planning at both the tool and action levels.”
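
For readers who want a concrete picture of the closed-loop, model-based planning Li describes, here is a minimal sketch. It assumes a learned forward model is already available and uses simple random-shooting planning over sampled action sequences; the placeholder dynamics, cost function, and names (forward_model, plan_action) are hypothetical stand-ins for the lab’s learned models, not its actual method.

```python
import numpy as np

# Hypothetical stand-ins for the components Li describes; the real system uses
# learned dynamics models over dough states and a library of tools.
def forward_model(state: np.ndarray, action: np.ndarray) -> np.ndarray:
    """Predict the next state if `action` is applied (placeholder dynamics)."""
    return state + 0.1 * action  # illustrative only


def cost(state: np.ndarray, goal: np.ndarray) -> float:
    """How far the predicted shape is from the target shape."""
    return float(np.linalg.norm(state - goal))


def plan_action(state, goal, num_samples=256, horizon=5, rng=np.random.default_rng(0)):
    """Random-shooting planner: sample action sequences, roll each through the
    forward model, and return the first action of the lowest-cost sequence."""
    best_action, best_cost = None, np.inf
    for _ in range(num_samples):
        actions = rng.uniform(-1.0, 1.0, size=(horizon, state.size))
        predicted = state.copy()
        for a in actions:
            predicted = forward_model(predicted, a)
        c = cost(predicted, goal)
        if c < best_cost:
            best_cost, best_action = c, actions[0]
    return best_action


# Closed-loop execution: re-plan from the latest observation at every step, so
# external disturbances (like a person perturbing the dough) are folded back in.
state, goal = np.zeros(4), np.ones(4)
for step in range(20):
    action = plan_action(state, goal)
    state = forward_model(state, action)  # in reality: execute, then re-observe
    if cost(state, goal) < 0.05:
        break
```

Because the plan is recomputed from the latest observation at every step, an external disturbance simply changes the next observed state and is absorbed into the next decision, which is the robustness Li points to in the video.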

The RoboPIL team is made up of PhD students Yixuan Wang from ECE and Binghao Huang, Hanxiao Jiang, Kaifeng Zhang, and Shivansh Patel from CS. Mingtong Zhang, Baoyu Li, and Jiangwei Yu from CS are master’s students, as are Keyi Shen, Guang Yin, and Haozhe Chen from ECE. The cross-disciplinary makeup of the lab will power its expanded lines of research.

Although Li will start a position at Columbia University in New York in the Fall of 2024, the RoboPIL team’s work is continuing at Illinois. In the future, RoboPIL’s research will take robots’ capabilities to the next level by focusing on three directions: world models, multimodal perception, and robotic foundation models. The team will explore ways to scale up world models to more complex, larger-scale environments with a diverse set of everyday household objects. They will also develop novel tactile sensors and incorporate additional modalities like language and audio for more expressive and detailed modeling of physical interactions in precision tasks. Lastly, they will harness advances in large foundation models to enhance robots’ commonsense understanding of tasks and environments, significantly boosting their capabilities.


This story was published July 18, 2024.