A Stereo Matching Algorithm for High‐Precision Guidance in a Weakly Textured Industrial Robot Environment Dominated by Planar Facets

Date
2022
Journal Title
Computer Graphics Forum
Journal ISSN
1467-8659
Publisher
© 2022 Eurographics ‐ The European Association for Computer Graphics and John Wiley & Sons Ltd
Abstract
Although many algorithms perform very well on certain datasets, existing stereo matching algorithms still fail to obtain high-precision disparity images in practical robotic applications with weakly textured or untextured objects. This greatly limits the use of binocular vision for robotic arm guidance. Traditional stereo matching algorithms suffer from disparity loss, dilation, and other artifacts, while deep learning algorithms have weak generalization ability and cannot deliver high-accuracy results on images outside their training distribution. We propose an algorithm that uses segments and edges as matching units and establishes the mapping between two-dimensional images and three-dimensional scenes through those segments. The algorithm obtains highly accurate results in industrial robotic applications dominated by planar facets. Combined with a deep learning algorithm, it yields high-accuracy results in both general scenes and industrial robot applications. The algorithm effectively improves the non-linear optimization ability of traditional methods and the generalization ability of deep learning, and provides an effective approach to binocular vision guidance in industrial robot scenes. We used the algorithm to guide a robot arm in a threading task with a success rate of 70%.
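To make the segment-as-matching-unit idea concrete, the following is a minimal sketch, not the authors' implementation. It assumes a rectified grayscale stereo pair, uses Canny edges plus a probabilistic Hough transform as a stand-in segment detector (the paper's detector is not specified here), pairs segments whose endpoints share epipolar rows, and triangulates endpoint disparities with Z = f·B/d. All function names, thresholds, and the cost term are illustrative assumptions.

import cv2
import numpy as np

def detect_segments(gray):
    # Edge map followed by probabilistic Hough; each row is (x1, y1, x2, y2).
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                            minLineLength=30, maxLineGap=5)
    return np.empty((0, 4), dtype=np.int32) if lines is None else lines.reshape(-1, 4)

def match_segments(segs_left, segs_right, max_row_err=2.0, max_disp=128):
    # On a rectified pair, corresponding points share image rows, so two
    # segments can match only if both endpoint rows agree; the horizontal
    # endpoint offsets are the disparities. The cost prefers aligned rows
    # and near-constant disparity along the segment (a segment lying on a
    # roughly fronto-parallel planar facet). Endpoint ordering is assumed
    # consistent between views; a robust version would test both orders.
    matches = []
    for x1, y1, x2, y2 in segs_left:
        best, best_cost = None, np.inf
        for u1, v1, u2, v2 in segs_right:
            row_err = abs(y1 - v1) + abs(y2 - v2)
            d1, d2 = x1 - u1, x2 - u2
            if row_err > 2 * max_row_err:
                continue
            if not (0 < d1 < max_disp and 0 < d2 < max_disp):
                continue
            cost = row_err + abs(d1 - d2)
            if cost < best_cost:
                best, best_cost = (d1, d2), cost
        if best is not None:
            matches.append(((x1, y1, x2, y2), best))
    return matches

def endpoint_depths(matches, fx, baseline):
    # Standard rectified-stereo triangulation: Z = fx * B / d per endpoint.
    return [(fx * baseline / d1, fx * baseline / d2)
            for _, (d1, d2) in matches]

A full pipeline in the spirit of the abstract would additionally enforce planar-facet consistency across neighboring segments and fuse the sparse segment disparities with a dense deep-learning disparity map.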
Citation
@article{10.1111:cgf.14435,
  journal   = {Computer Graphics Forum},
  title     = {{A Stereo Matching Algorithm for High-Precision Guidance in a Weakly Textured Industrial Robot Environment Dominated by Planar Facets}},
  author    = {Wei, Hui and Meng, Lingjiang},
  year      = {2022},
  publisher = {© 2022 Eurographics - The European Association for Computer Graphics and John Wiley & Sons Ltd},
  ISSN      = {1467-8659},
  DOI       = {10.1111/cgf.14435}
}