
Author:

Zhao, Min | Zuo, Guoyu (左国玉) | Yu, Shuangyue | Gong, Daoxiong | Wang, Zihao | Sie, Ouattara

Indexed by:

EI; Scopus; SCIE

Abstract:

The positional information of objects is crucial for robots performing grasping and pushing manipulations in clutter: robots must perceive both object coordinates and the spatial relationships between objects (e.g., proximity, adjacency). The authors propose an end-to-end position-aware deep Q-learning framework to achieve efficient collaborative pushing and grasping in clutter. Specifically, a pair of conjugate pushing and grasping attention modules is proposed to capture the position information of objects and generate high-quality affordance maps of operating positions from pushing and grasping features. In addition, the authors propose an object isolation metric and a clutter metric, based on instance segmentation, to measure the spatial relationships between objects in cluttered environments. To further enhance perception of object position information, the reward function is associated with the change in the object isolation and clutter metrics before and after each action. A series of experiments in simulation and the real world indicates that the method improves sample efficiency, task completion rate, grasping success rate, and action efficiency compared with state-of-the-art end-to-end methods. Notably, the system can be robustly applied in the real world and extended to novel objects. Supplementary material is available at .
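The abstract describes coupling the reward to the change in the isolation and clutter metrics before and after an action. A minimal illustrative sketch of that idea follows; the function name, weights, and fixed grasp bonus are all assumptions for illustration, not the authors' actual formulation.

```python
def shaped_reward(action_type, grasp_succeeded,
                  iso_before, iso_after,
                  clutter_before, clutter_after):
    """Hypothetical reward sketch: a successful grasp earns a fixed bonus,
    while a push is rewarded in proportion to how much it increases object
    isolation and decreases clutter (metric deltas before vs. after the
    action). Weights 0.5/0.5 and the bonus of 1.0 are illustrative only."""
    if action_type == "grasp":
        return 1.0 if grasp_succeeded else 0.0
    # Push: reward the change in the two segmentation-based metrics.
    delta_isolation = iso_after - iso_before          # more isolated is better
    delta_clutter = clutter_before - clutter_after    # less clutter is better
    return 0.5 * delta_isolation + 0.5 * delta_clutter
```

Under this shaping, a push that separates objects (raising isolation, lowering clutter) receives positive reward even when no grasp occurs, which is what lets pushing and grasping cooperate during training.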

Keyword:

deep neural networks; deep learning; intelligent robots

Author Community:

  • [ 1 ] [Zhao, Min]Beijing Univ Technol, Fac Informat Technol, Intelligent Robot Lab, Beijing, Peoples R China
  • [ 2 ] [Zuo, Guoyu]Beijing Univ Technol, Fac Informat Technol, Intelligent Robot Lab, Beijing, Peoples R China
  • [ 3 ] [Gong, Daoxiong]Beijing Univ Technol, Fac Informat Technol, Intelligent Robot Lab, Beijing, Peoples R China
  • [ 4 ] [Wang, Zihao]Beijing Univ Technol, Fac Informat Technol, Intelligent Robot Lab, Beijing, Peoples R China
  • [ 5 ] [Sie, Ouattara]Beijing Univ Technol, Fac Informat Technol, Intelligent Robot Lab, Beijing, Peoples R China
  • [ 6 ] [Zhao, Min]Beijing Key Lab Computat Intelligence & Intelligen, Beijing, Peoples R China
  • [ 7 ] [Zuo, Guoyu]Beijing Key Lab Computat Intelligence & Intelligen, Beijing, Peoples R China
  • [ 8 ] [Gong, Daoxiong]Beijing Key Lab Computat Intelligence & Intelligen, Beijing, Peoples R China
  • [ 9 ] [Wang, Zihao]Beijing Key Lab Computat Intelligence & Intelligen, Beijing, Peoples R China
  • [ 10 ] [Yu, Shuangyue]North Carolina State Univ, Dept Mech & Aerosp Engn, Lab Biomechatron & Intelligent Robot BIRO, Raleigh, NC USA
  • [ 11 ] [Zuo, Guoyu]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China

Reprint Author's Address:


Related Keywords:

Source:

CAAI TRANSACTIONS ON INTELLIGENCE TECHNOLOGY

ISSN: 2468-6557

Year: 2023

Volume: 9

Issue: 3

Page: 738-755

Impact Factor: 5.100 (JCR@2022)

Cited Count:

WoS CC Cited Count: 6

SCOPUS Cited Count: 5

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:

30 Days PV: 9

Affiliated Colleges:
