Robot Grasp Planning: A Learning from Demonstration-Based Approach †
Abstract
1. Introduction
- We propose a novel LfD-based grasp-planning framework that uses both the contact regions and the approach direction as key skills derived from a single human demonstration. This approach reconciles human intention with the environmental constraints of both the object and the robot gripper.
- We develop an intuitive and straightforward method for detecting the contact regions and approach direction from a hand plane formed by the thumb and index finger, observed with a single RGB-D camera.
- We integrate human grasp skills, encompassing both the contact regions and approach direction, into an optimization problem. This integration aims to generate a stable functional grasp that aligns with human intention while considering environmental constraints.
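As a minimal sketch of the hand-plane idea in the second contribution: given the 3-D positions of the thumb tip, index fingertip, and wrist (e.g., hand landmarks back-projected through the RGB-D depth map; the choice of these three landmarks is an illustrative assumption, not necessarily the paper's exact construction), the normal of the plane they span can serve as the grasp approach direction.

```python
import numpy as np

def hand_plane_normal(thumb_tip, index_tip, wrist):
    """Unit normal of the plane spanned by thumb tip, index fingertip, and wrist.

    Inputs are 3-D points in the camera frame (e.g., hand landmarks from an
    RGB-D frame back-projected through the depth map). The returned normal
    can be used as a grasp approach direction.
    """
    thumb_tip, index_tip, wrist = map(np.asarray, (thumb_tip, index_tip, wrist))
    # Two in-plane vectors anchored at the wrist.
    v1 = thumb_tip - wrist
    v2 = index_tip - wrist
    n = np.cross(v1, v2)
    norm = np.linalg.norm(n)
    if norm < 1e-9:
        raise ValueError("degenerate (collinear) landmark configuration")
    return n / norm

# Synthetic landmarks in metres, camera frame (illustrative values only):
normal = hand_plane_normal([0.05, 0.00, 0.40], [0.00, 0.06, 0.40], [0.00, 0.00, 0.50])
```

The sign of the normal depends on landmark ordering; in practice it would be disambiguated so that it points from the palm toward the object.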
2. Related Works
3. Proposed Approach
3.1. Grasp Skill Recognition
3.2. Grasp Synthesis
3.2.1. Grasp Optimization
Algorithm 1: Learning from demonstration-based iterative surface fitting
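Algorithm 1 builds on iterative surface fitting (ISF), which alternates correspondence matching between the gripper contact surfaces and the object surface with a least-squares rigid alignment. Below is an illustrative sketch of one such iteration, assuming both surfaces are given as point sets, using brute-force nearest-neighbour matching and a Kabsch solve; it omits the demonstration-derived contact-region and approach-direction constraints of the full algorithm.

```python
import numpy as np

def fit_rigid(src, dst):
    """Least-squares rotation and translation mapping src points onto dst (Kabsch)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard keeps the result a proper rotation (det = +1).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, cd - R @ cs

def isf_step(gripper_pts, object_pts):
    """One ISF-style iteration: match each gripper surface point to its nearest
    object point, then solve for the rigid gripper motion that best aligns
    the matched pairs. Returns the moved points and the transform."""
    # Brute-force nearest-neighbour correspondences (for clarity, not speed).
    d = np.linalg.norm(gripper_pts[:, None, :] - object_pts[None, :, :], axis=2)
    matched = object_pts[d.argmin(axis=1)]
    R, t = fit_rigid(gripper_pts, matched)
    return gripper_pts @ R.T + t, R, t
```

Iterating this step drives down the average point-to-surface distance, the kind of fitting error reported in millimetres in the simulation results.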
3.2.2. Collision Avoidance
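The paper's exact collision term is not reproduced here, but one common way to fold collision avoidance into such a grasp optimization is a penetration-depth penalty over a bounding-sphere approximation of the gripper and the environment; the sphere decomposition and penalty form below are illustrative assumptions, not the authors' method.

```python
import numpy as np

def penetration_penalty(gripper_spheres, obstacle_spheres):
    """Collision cost between two bodies approximated by bounding spheres.

    Each body is a list of (center, radius) pairs. The cost is the summed
    penetration depth over all overlapping sphere pairs: zero for
    collision-free configurations, growing as the bodies interpenetrate,
    so it can be added as a penalty term in the grasp optimization.
    """
    cost = 0.0
    for cg, rg in gripper_spheres:
        for co, ro in obstacle_spheres:
            gap = np.linalg.norm(np.asarray(cg) - np.asarray(co)) - (rg + ro)
            cost += max(0.0, -gap)  # penalise only overlap, not clearance
    return cost

# Two unit-radius spheres whose centers are 1.5 apart overlap by 0.5:
overlap = penetration_penalty([((0.0, 0.0, 0.0), 1.0)], [((1.5, 0.0, 0.0), 1.0)])
```

Exact mesh-level distance queries (e.g., GJK-style proximity computation) would replace the sphere approximation in a more faithful implementation.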
4. Experiments
4.1. Experiments through Simulations
4.2. Experiments with a Real Robot
5. Conclusions
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Dataset | Object Name | ISF (mm) | ISF (s) | LfD-Based ISF (mm) | LfD-Based ISF (s) |
|---|---|---|---|---|---|
| HomeDB | Jig | 13.005 | 1.191 | 5.326 | 0.798 |
| | Dryer | 14.600 | 1.071 | 6.974 | 0.747 |
| | Cap | 9.609 | 1.990 | 5.459 | 0.975 |
| | Converter | 11.387 | 2.561 | 6.404 | 0.519 |
| | Motorcycle | 12.032 | 2.439 | 7.285 | 0.661 |
| | Dinosaur | 17.219 | 1.744 | 9.427 | 0.895 |
| | Clamp | 9.439 | 1.240 | 7.395 | 0.614 |
| | Cup | 10.871 | 1.867 | 8.068 | 0.771 |
| | Bear | 12.758 | 1.043 | 7.215 | 0.558 |
| | Average | 12.324 | 1.683 | 7.061 | 0.726 |
| YCB-V | Box | 3.457 | 0.404 | 3.560 | 0.369 |
| | Can | 4.193 | 0.420 | 2.551 | 0.406 |
| | Mustard | 8.720 | 0.832 | 6.174 | 0.559 |
| | Spam | 8.740 | 0.532 | 4.283 | 0.370 |
| | Banana | 3.145 | 0.434 | 3.031 | 0.287 |
| | Bowl | 8.924 | 1.238 | 4.778 | 0.862 |
| | Scissor | 5.808 | 0.679 | 3.662 | 0.277 |
| | Magic pen | 4.806 | 0.523 | 4.092 | 0.317 |
| | Clip | 6.836 | 0.379 | 4.681 | 0.347 |
| | Average | 6.070 | 0.604 | 4.090 | 0.422 |
| T-less | Bulb | 5.198 | 0.506 | 2.600 | 0.327 |
| | S-connector | 8.772 | 0.872 | 3.588 | 0.334 |
| | C-plug | 6.968 | 1.004 | 6.525 | 0.532 |
| | R-connector | 11.648 | 1.216 | 7.858 | 0.524 |
| | C-connector | 10.593 | 1.562 | 6.623 | 0.639 |
| | Switch | 5.159 | 0.915 | 3.409 | 0.359 |
| | Box | 2.405 | 0.439 | 2.536 | 0.412 |
| | Plug | 13.058 | 1.282 | 5.171 | 0.865 |
| | Circle box | 7.784 | 0.526 | 5.417 | 0.329 |
| | Average | 7.954 | 0.925 | 4.859 | 0.480 |
| | Total average | 8.783 | 1.071 | 5.337 | 0.543 |
| Object Name | Gear | Water Valve | Water Pipe | Jig |
|---|---|---|---|---|
| Without collision avoidance | 11/20 | 13/20 | 10/20 | 7/20 |
| With collision avoidance | 20/20 | 18/20 | 19/20 | 20/20 |
| Objects | Water Pipe (external) | Water Valve (external) | Converter (external) | Toy Rabbit (external) | Gear (internal) | Water Valve (internal) | Water Pipe (internal) | Jig (internal) |
|---|---|---|---|---|---|---|---|---|
| Method 1 | 11/20 | 6/20 | 2/20 | 13/20 | 11/20 | 7/20 | 17/20 | 12/20 |
| Method 2 | 15/20 | 10/20 | 13/20 | 13/20 | - | - | - | - |
| Method 3 | 18/20 | 20/20 | 20/20 | 19/20 | 20/20 | 20/20 | 20/20 | 20/20 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Wang, K.; Fan, Y.; Sakuma, I. Robot Grasp Planning: A Learning from Demonstration-Based Approach. Sensors 2024, 24, 618. https://doi.org/10.3390/s24020618