
Object Identification for Task-Oriented Communication with Industrial Robots

Overview
Journal Sensors (Basel)
Publisher MDPI
Specialty Biotechnology
Date 2020 Mar 27
PMID 32210131
Citations 2
Abstract

To make human-robot collaboration effective, it may be necessary to provide robots with "senses" such as vision and hearing. Task-oriented man-machine speech communication often relies on abstract terms to describe objects. Therefore, those terms must be correctly mapped onto images of the proper objects in a camera's field of view. This paper presents the results of our research in this field. A novel method for contour identification, based on flexible editable contour templates (FECT), has been developed. We demonstrate that existing methods are not appropriate for this purpose because it is difficult to formulate the general rules that humans employ to assign shapes to proper classes. Therefore, the rules for shape classification should be formulated individually by the users for each application. Our aim was to create an appropriate tool facilitating the formulation of those rules, since this could otherwise be a very labor-intensive task. The core of our solution is the FCD (flexible contour description) format for describing flexible templates. Users will be able to create and edit flexible contour templates and thus adjust image recognition systems to their needs, in order to provide task-oriented communication between humans and robots.
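The abstract does not specify the FCD format or the FECT matching procedure, so the following minimal Python sketch is only an illustration of the general idea of mapping spoken terms onto named, user-editable shape templates. The FlexibleContourTemplate class, the find_named_objects helper, and the Hu-moment similarity threshold are all assumptions introduced here for illustration; they are not the authors' method.

# Illustrative sketch only: a hypothetical named contour template matched against
# contours detected in a camera frame using OpenCV's Hu-moment shape comparison.
from dataclasses import dataclass

import cv2
import numpy as np


@dataclass
class FlexibleContourTemplate:
    """Hypothetical user-editable template mapping a spoken term to a reference shape."""
    name: str               # abstract term used in speech, e.g. "plate"
    points: np.ndarray      # reference contour as an (N, 1, 2) int32 array
    tolerance: float = 0.15 # how loosely a candidate contour may deviate and still match


def find_named_objects(image_bgr, templates):
    """Return (template name, contour) pairs for detected contours that match a template."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    matches = []
    for contour in contours:
        if cv2.contourArea(contour) < 100:  # skip small noise contours
            continue
        for template in templates:
            # Hu-moment based similarity; lower scores mean more similar shapes.
            score = cv2.matchShapes(contour, template.points,
                                    cv2.CONTOURS_MATCH_I1, 0.0)
            if score <= template.tolerance:
                matches.append((template.name, contour))
                break
    return matches


if __name__ == "__main__":
    # A square "plate" template hard-coded from points; in the paper's approach such
    # templates would instead be authored and edited by the user in the FCD format.
    square = np.array([[[0, 0]], [[100, 0]], [[100, 100]], [[0, 100]]], dtype=np.int32)
    templates = [FlexibleContourTemplate(name="plate", points=square)]

    frame = cv2.imread("workcell.png")  # hypothetical camera frame
    if frame is not None:
        for name, contour in find_named_objects(frame, templates):
            print(f"Found object matching term '{name}'")

In the paper's approach, both the templates and the rules for accepting a match would be defined and edited by the user rather than fixed in code, which is what the FCD format and the editing tool described in the abstract are intended to enable.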

Citing Articles

Scenario-Based Programming of Voice-Controlled Medical Robotic Systems.

Rogowski A. Sensors (Basel). 2022; 22(23).

PMID: 36502220; PMC: 9738457; DOI: 10.3390/s22239520.


Integration of Industrially-Oriented Human-Robot Speech Communication and Vision-Based Object Recognition.

Rogowski A, Bieliszczuk K, Rapcewicz J. Sensors (Basel). 2020; 20(24).

PMID: 33353038; PMC: 7767307; DOI: 10.3390/s20247287.
