Accurately manipulating articulated objects is a challenging yet important task for real robot applications. In this paper, we present a novel framework called Sim2Real2 that enables a robot to precisely manipulate an unseen articulated object to a desired state in the real world without human demonstrations. We leverage recent advances in physics simulation and learning-based perception to build an interactive, explicit physics model of the object and use it to plan a long-horizon manipulation trajectory that accomplishes the task. However, the interactive model cannot be correctly estimated from a static observation. Therefore, we learn to predict the object's affordance from a single-frame point cloud, control the robot to actively interact with the object through a one-step action, and capture another point cloud. The physics model is then constructed from the two point clouds. Experimental results show that our framework achieves about 70% of manipulations within 30% relative error for common articulated objects.
@INPROCEEDINGS{10160370,
author={Ma, Liqian and Meng, Jiaojiao and Liu, Shuntao and Chen, Weihang and Xu, Jing and Chen, Rui},
booktitle={2023 IEEE International Conference on Robotics and Automation (ICRA)},
title={Sim2Real2: Actively Building Explicit Physics Model for Precise Articulated Object Manipulation},
year={2023},
pages={11698-11704},
doi={10.1109/ICRA48891.2023.10160370}}
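For concreteness, below is a minimal, hypothetical Python sketch of the pipeline the abstract describes: capture a static point cloud, predict an affordance, apply a one-step interaction, capture a second cloud, and fit an explicit physics model from the pair. Every name here (capture_point_cloud, predict_affordance, build_physics_model, and the stub logic inside them) is an illustrative assumption, not the authors' actual code; the learned affordance network, the physics simulator, and the long-horizon planner are replaced by trivial stand-ins.

import numpy as np

def capture_point_cloud(n: int = 2048) -> np.ndarray:
    # Stand-in for a depth-camera capture: an (n, 3) array of points.
    return np.random.rand(n, 3)

def predict_affordance(cloud: np.ndarray):
    # Stand-in for the learned affordance prediction from a single
    # static frame: pick a contact point on the cloud and a direction
    # for the one-step interaction.
    contact = cloud[np.argmax(cloud[:, 2])]   # e.g., the highest point
    direction = np.array([1.0, 0.0, 0.0])     # e.g., push along +x
    return contact, direction

def build_physics_model(before: np.ndarray, after: np.ndarray) -> dict:
    # Stand-in for model construction: estimate joint parameters from
    # the object's motion between the two clouds. Here we just take the
    # mean displacement as a (prismatic) joint axis.
    displacement = after.mean(axis=0) - before.mean(axis=0)
    axis = displacement / np.linalg.norm(displacement)
    return {"joint_type": "prismatic", "axis": axis}

def sim2real2_pipeline() -> dict:
    # 1. Static observation of the unseen articulated object.
    cloud_before = capture_point_cloud()
    # 2. Predict where and how to interact.
    contact, direction = predict_affordance(cloud_before)
    # 3. One-step interaction. On a real robot this moves the object;
    #    here we fake the effect by shifting the cloud along `direction`.
    cloud_after = cloud_before + 0.05 * direction
    # 4. Fit the explicit physics model from the two point clouds.
    model = build_physics_model(cloud_before, cloud_after)
    # 5. A planner would now search a long-horizon trajectory in
    #    simulation using `model`, then execute it on the real robot
    #    (omitted in this sketch).
    return model

if __name__ == "__main__":
    print(sim2real2_pipeline())

The key design point the sketch mirrors is step 3: because joint kinematics cannot be recovered from one static view, the framework deliberately perturbs the object once and uses the before/after pair to disambiguate the articulation model before any long-horizon planning.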