Uncalibrated Workpiece Positioning Method for Peg-in-hole Assembly Using an Industrial Robot
Author: Dong LIU, Fukang ZHU, Ming CONG, Yu DU
Affiliation:

School of Mechanical Engineering, Dalian University of Technology, Dalian 116024;
Dalian Dahuazhongtian Technology Co., Ltd, Dalian 116024

Abstract:

This paper proposes an uncalibrated workpiece positioning method for peg-in-hole assembly using an industrial robot. Depth images are used to identify and locate workpieces when the robot carries out a peg-in-hole assembly task in a flexible production system. First, the depth image is thresholded according to the depth of the workpiece surface to filter out background interference. Second, a series of image processing and feature recognition algorithms is executed to extract the outer contour features and locate the center point; this visual information drives the robot to an approximate position. Finally, a Hough circle detection algorithm is applied to the color image to extract the features and relevant parameters of the circular assembly hole for accurate positioning. Experimental results show that the positioning accuracy of the method is between 0.6 and 1.2 mm in the test system. The entire positioning process requires no complicated calibration, and the method is highly flexible, making it suitable for automatic assembly tasks involving multiple specifications or small batches in a flexible production system.
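    To illustrate the two-stage pipeline described in the abstract, the following Python/OpenCV sketch thresholds the depth image and takes the centroid of the largest outer contour for coarse positioning, then runs Hough circle detection on the color image for fine positioning. This is a minimal reconstruction under assumed parameters (the depth range, blur kernel, and Hough thresholds are placeholders), not the authors' implementation.

```python
# Illustrative sketch of the two-stage positioning pipeline (OpenCV >= 4
# assumed). All thresholds and Hough parameters below are hypothetical
# placeholders, not values from the paper.
import cv2
import numpy as np

def coarse_locate(depth_img, depth_min, depth_max):
    """Stage 1: threshold the depth image around the workpiece surface,
    keep the largest outer contour, and return its center point (u, v)."""
    # Keep only pixels whose depth lies on the workpiece surface,
    # filtering out the background.
    mask = ((depth_img > depth_min) & (depth_img < depth_max))
    mask = mask.astype(np.uint8) * 255
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    workpiece = max(contours, key=cv2.contourArea)      # outer contour
    m = cv2.moments(workpiece)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])   # contour centroid

def fine_locate(color_img):
    """Stage 2: Hough circle detection on the color image to recover the
    assembly hole's center and radius for accurate positioning."""
    gray = cv2.cvtColor(color_img, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)                      # suppress noise
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=30,
                               param1=100, param2=30,
                               minRadius=5, maxRadius=60)
    if circles is None:
        return None
    u, v, r = circles[0, 0]                             # strongest detection
    return (float(u), float(v), float(r))
```

    In such a scheme, the coarse stage only needs to bring the hole into the camera's field of view near the image center, after which the Hough-based stage refines the target without any hand-eye calibration of the full workspace.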

Get Citation

Dong LIU, Fukang ZHU, Ming CONG, Yu DU. Uncalibrated Workpiece Positioning Method for Peg-in-hole Assembly Using an Industrial Robot[J]. Instrumentation, 2019, 6(4): 26-36

History
  • Online: October 29, 2020
License
  • Copyright (c) 2023 by the authors. This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.