CS team receives NSF grant to improve perception for small robots and drones



Written by Bruce Adams

Robots and drones face the same problems as humans when navigating through smoke, dust, and fog or identifying transparent surfaces like glass. NSF awarded $1.2M to the University of Illinois Urbana-Champaign team of CS professors Deepak Vasisht, Shenlong Wang, and Elahe Soltanaghai to establish millimeter wave radar as a first-class perception and control tool for small robots and drones. All three are researchers at the Coordinated Science Laboratory at Illinois.

Pictured: Deepak Vasisht, Shenlong Wang, and Elahe Soltanaghai.

Their proposal, “Radar-based Perception and Control for Small Autonomous Robots,” notes the difficulties that optical sensors face in obscured environments. Vasisht explained, “Traditionally, drones and robots tend to rely on cameras and vision-based sensors such as LIDARs. In general, such optical sensors are prone to many failures. They do not function well in visually degraded conditions, such as during nighttime and in tunnels. They also suffer from bad performance in adverse weather conditions like fog. Cameras can also be covered by occlusions like dirt. Camera-like sensors raise privacy concerns too; outdoor delivery drones flying over residential areas, for example, can look into backyards and windows.” Glass and smooth surfaces pose problems for drones and robots, too. “For cameras and vision-based sensors, glass is transparent. So, there is always a risk that a completely camera-operated drone or robot can run into glass windows or glass buildings. We do not want that,” Vasisht said. “Given these challenges, we are proposing detailed investigations into the use of mmWave radar for drones and robots.”

A drone outfitted with a small radar sensor by CS student Emerson Sie.
Photo credit: Emerson Sie

Millimeter wave (mmWave) radar sensors are available off the shelf for under $100 and can be packaged in small devices. “The nice thing about mmWave radar sensors is that they are compact. They can be placed directly on small robots and drones,” said Vasisht. The team proposes an approach that links signal-processing techniques to machine-learning methods for tightly integrated perception and control. Vasisht noted, “The neural networks need careful thinking – they typically require high computational capacity. We have two plans for addressing computational constraints. First, we plan to make the networks as efficient as possible so they fit on the drone or robot itself. Second, for modules that cannot run efficiently on the drone or robot, we will explore offloading the compute to other computing units, such as a drone’s controller or a controlling computer.”
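As a rough illustration of that two-pronged plan, the Python sketch below profiles each perception module against a per-frame compute budget and labels it for onboard or offboard execution. The module names, the toy radar points, and the 20 ms budget are assumptions made for illustration, not details of the team's system:

```python
import time

# Illustrative only: the budget, modules, and data below are hypothetical.
ONBOARD_BUDGET_MS = 20.0  # assumed per-frame compute budget on the robot

def light_obstacle_net(scan):
    """Stand-in for a compact network that fits on the robot."""
    return [p for p in scan if p[2] < 5.0]  # keep nearby radar points

def heavy_mapping_net(scan):
    """Stand-in for a large network meant for a ground-station computer."""
    return {"dense_map_cells": len(scan)}

def profile(module, scan):
    """Time one call and decide where the module should live."""
    start = time.perf_counter()
    module(scan)
    elapsed_ms = (time.perf_counter() - start) * 1e3
    return "onboard" if elapsed_ms <= ONBOARD_BUDGET_MS else "offboard"

scan = [(1.0, 0.2, 3.5), (0.4, -1.1, 9.0)]  # toy (x, y, range) radar points
placement = {m.__name__: profile(m, scan)
             for m in (light_obstacle_net, heavy_mapping_net)}
print(placement)  # e.g. {'light_obstacle_net': 'onboard', ...}
```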

The result, they say, will be fourfold: new radar-based localization and mapping that uses neural networks and antenna array processing to create high-fidelity maps of the environment; new passive mmWave markers that interact seamlessly with off-the-shelf radars to identify unseen objects; neural fields that generate a realistic 3D model of the environment and enable realistic simulation of radar signals within it; and active perception, in which a robot or drone seeks optimal viewpoints or adjusts hardware parameters for enhanced perception. Together, these components on the units and in the environment give small robots and drones a new way to sense and react to their surroundings, even in visually degraded conditions, improving the robustness of robot operation.
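For readers curious what antenna array processing involves, here is a minimal numpy sketch of the classical starting point such systems build on: three FFTs over an FMCW radar data cube resolve range, velocity, and azimuth, and a simple threshold yields detections. The array dimensions, random data, and threshold are arbitrary, and the team's neural pipeline would augment or replace steps like these:

```python
import numpy as np

# Toy FMCW radar data cube: (rx antennas, chirps, ADC samples per chirp).
rng = np.random.default_rng(0)
cube = (rng.standard_normal((8, 64, 256))
        + 1j * rng.standard_normal((8, 64, 256)))

range_profile = np.fft.fft(cube, axis=2)             # fast time  -> range bins
doppler_map = np.fft.fft(range_profile, axis=1)      # slow time  -> velocity bins
angle_map = np.fft.fft(doppler_map, axis=0, n=64)    # across array -> azimuth bins

# Power in each (azimuth, velocity, range) cell; threshold for detections.
power = np.abs(angle_map) ** 2
detections = np.argwhere(power > power.mean() + 6 * power.std())
print(f"{len(detections)} cells above threshold")    # rows: (angle, doppler, range)
```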

Soltanaghai summarized, “In our project, perception and control are very closely interlinked. The information collected by the radar guides the robot. The robot’s motion and the data collected, in turn, help us control how to choose the radar parameters for optimal efficiency. We are also designing small battery-free tags that can be attached to the environment to enable the robot to understand scenes and label objects (e.g., go to the tree with tag X). The placement of these tags is also managed by our control algorithms.”
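Read as pseudocode, that sense-plan-adapt loop might look like the following sketch. Every function here is a hypothetical stand-in; the announcement does not spell out the team's actual algorithms:

```python
# Hypothetical closed loop linking radar perception and robot control.
def sense(radar_params):
    """Collect one radar frame with the current radar settings."""
    return {"points": [], "tags_seen": ["tag_X"]}

def plan_motion(frame, goal_tag):
    """Steer using battery-free tags that label objects in the scene."""
    return "hold" if goal_tag in frame["tags_seen"] else "explore"

def tune_radar(frame, command):
    """Adapt radar parameters to the motion, e.g. widen the beam to explore."""
    return {"beamwidth": "wide" if command == "explore" else "narrow"}

radar_params = {"beamwidth": "wide"}
for _ in range(3):  # a few iterations of the sense-plan-act loop
    frame = sense(radar_params)
    command = plan_motion(frame, goal_tag="tag_X")
    radar_params = tune_radar(frame, command)
    print(command, radar_params)
```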

The project also has a strong educational and outreach component. Vasisht said, “The great thing is that students love robots and drones. So, we intend to use this project as a key educational and outreach tool at multiple levels of education – reaching out to high schools with demos to excite more kids about science and technology and including the research in our undergraduate and graduate curricula. We will explore whether we can demonstrate this in the context of our digital agriculture initiatives, such as the Farm of the Future at the University of Illinois. If such trials are successful, we will present this work to farm robotics companies and farmers through demonstrations. We also plan to open-source much of our work so that robotics researchers and engineers find it easier to incorporate mmWave radars into their designs.”

Improving the performance of small robots and drones in visually challenging environments will have real-world impact on agriculture, rescue operations, and more, as these machines become better able to sense, adapt, and react to their surroundings. As Vasisht observed, “The hope is that both cameras and mmWave sensors can work together to enable robust perception and control in robots and drones.”



This story was published July 24, 2024.