Perception for General-purpose Robot Manipulation

Authors

  • Karthik Desingh, University of Minnesota

DOI:

https://doi.org/10.1609/aaai.v37i13.26802

Keywords:

New Faculty Highlights

Abstract

To autonomously perform tasks, a robot must continually perceive the state of its environment, reason about the task at hand, and plan and execute appropriate actions. In this pipeline, perception remains largely unsolved and is one of the most challenging problems. Common indoor environments pose two main difficulties: 1) inherent occlusions that lead to unreliable observations of objects, and 2) the presence and involvement of a wide range of objects with varying physical and visual attributes (i.e., rigid, articulated, deformable, granular, transparent, etc.). We therefore need algorithms that can accommodate perceptual uncertainty in state estimation and generalize to a wide range of objects. Probabilistic inference methods are well suited to modeling perceptual uncertainty, while data-driven approaches using deep learning techniques have shown promising advances toward generalization. Perception for manipulation is a more intricate setting that requires the best of both worlds. My research aims to develop robot perception algorithms that generalize over objects and tasks while accommodating perceptual uncertainty, to support robust task execution in the real world. In this presentation, I will briefly highlight my research in these two threads.
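The first thread, accommodating perceptual uncertainty, is commonly realized with sequential Bayesian filtering. As a purely illustrative aid (not the author's method), the minimal sketch below tracks a 2D object position with a particle filter and shows how the belief simply diffuses during occluded frames rather than committing to an unreliable point estimate. All models, parameter values, and the simulated observation stream are assumptions made for this example.

```python
# Illustrative particle filter for tracking a 2D object position under
# intermittent occlusion. Everything here (models, noise levels, data)
# is an assumption for demonstration, not the author's actual method.
import numpy as np

rng = np.random.default_rng(0)

N = 500                                   # number of particles
particles = rng.uniform([-1.0, -1.0], [1.0, 1.0], size=(N, 2))
weights = np.full(N, 1.0 / N)

def motion_update(particles, noise=0.02):
    """Diffuse particles with Gaussian process noise (static-object prior)."""
    return particles + rng.normal(0.0, noise, size=particles.shape)

def measurement_update(particles, weights, z, sigma=0.1):
    """Reweight particles by a Gaussian likelihood of the observed (x, y).

    When the object is occluded (z is None), skip the update: the belief
    diffuses instead of being forced to explain a missing observation.
    """
    if z is None:
        return weights
    d = np.linalg.norm(particles - z, axis=1)
    weights = weights * np.exp(-0.5 * (d / sigma) ** 2)
    weights += 1e-300                     # guard against degeneracy
    return weights / weights.sum()

def resample(particles, weights):
    """Systematic resampling to concentrate particles on likely poses."""
    positions = (rng.random() + np.arange(N)) / N
    idx = np.searchsorted(np.cumsum(weights), positions)
    return particles[idx], np.full(N, 1.0 / N)

# Simulated stream: object at (0.3, -0.2); frames 3-6 are occluded (None).
stream = [np.array([0.3, -0.2])] * 3 + [None] * 4 + [np.array([0.3, -0.2])] * 3
for t, z in enumerate(stream):
    particles = motion_update(particles)
    weights = measurement_update(particles, weights, z)
    particles, weights = resample(particles, weights)
    est = np.average(particles, axis=0, weights=weights)
    print(f"t={t} observed={'yes' if z is not None else 'no (occluded)'} "
          f"estimate={est.round(3)}")
```

During the occluded frames the weighted estimate stays near the last confident position while the particle spread grows, which is one concrete way "accommodating perceptual uncertainty" can look in practice.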

Published

2023-09-06

How to Cite

Desingh, K. (2023). Perception for General-purpose Robot Manipulation. Proceedings of the AAAI Conference on Artificial Intelligence, 37(13), 15435-15435. https://doi.org/10.1609/aaai.v37i13.26802