OAI-PMH record retrieved 2024-03-29T05:13:50Z from http://digital.csic.es/dspace-oai/request
Identifier: oai:digital.csic.es:10261/30517 (datestamp 2018-08-22T12:37:50Z)
Sets: com_10261_106, com_10261_4, col_10261_359
DIGITAL.CSIC
Authors: Wells, Gordon; Venaille, Christophe; Torras, Carme
Deposited: 2010-12-17T12:59:52Z
Issued: 1996
Citation: Image and Vision Computing 14(10): 715-732 (1996)
ISSN: 0262-8856
Handle: http://hdl.handle.net/10261/30517
DOI: 10.1016/0262-8856(96)89022-6
Most vision-based robot positioning techniques rely on analytical formulations of the relationship between the robot pose and the projected image coordinates of several geometric features of the observed scene. This usually requires that several simple features such as points, lines or circles be visible in the image, and that these be either unoccluded in multiple views or part of a 3D model. Feature-matching algorithms, camera calibration, models of the camera geometry and object feature relationships are also necessary for pose determination. These steps are often computationally intensive and error-prone, and the complexity of the resulting formulations often limits the number of controllable degrees of freedom. We provide a comparative survey of existing visual robot positioning methods, and present a new technique based on neural learning and global image descriptors which overcomes many of these limitations. A feedforward neural network is used to learn the complex implicit relationship between the pose displacements of a 6-dof robot and the observed variations in global descriptors of the image, such as geometric moments and Fourier descriptors. The trained network may then be used to move the robot from arbitrary initial positions to a desired pose with respect to the observed scene. The method is shown to position an industrial robot with respect to a variety of complex objects with a precision acceptable for an industrial inspection application, and could be useful in other real-world tasks such as grasping, assembly and navigation.
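The abstract's core idea, regressing 6-dof pose displacements from global image descriptors with a feedforward network, can be sketched as below. This is a minimal illustration under stated assumptions, not the paper's implementation: the descriptor set (raw geometric moments up to order 2), the one-hidden-layer network size, and all function and class names are hypothetical choices made for the example.

```python
import numpy as np

def geometric_moments(img, order=2):
    """Raw geometric moments m_pq = sum_{x,y} x^p * y^q * I(x,y) for p+q <= order.

    With order=2 this yields 6 global descriptors of the whole image,
    with no feature extraction or matching required.
    """
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    feats = []
    for p in range(order + 1):
        for q in range(order + 1 - p):
            feats.append(float(np.sum((xs ** p) * (ys ** q) * img)))
    return np.array(feats)

class TinyMLP:
    """One-hidden-layer feedforward network mapping image descriptors
    to a 6-dof pose displacement (dx, dy, dz, droll, dpitch, dyaw).
    Weights are random here; in the paper's scheme they would be trained
    on (descriptor variation, pose displacement) pairs."""
    def __init__(self, n_in, n_hidden=16, n_out=6, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (n_hidden, n_in))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (n_out, n_hidden))
        self.b2 = np.zeros(n_out)

    def forward(self, x):
        hidden = np.tanh(self.W1 @ x + self.b1)   # nonlinear hidden layer
        return self.W2 @ hidden + self.b2         # linear 6-dof output

# Usage: descriptors of the current view drive a corrective robot motion.
img = np.ones((4, 4))                 # stand-in for a camera image
feats = geometric_moments(img)        # 6 global descriptors
net = TinyMLP(n_in=feats.size)
delta_pose = net.forward(feats / (np.abs(feats).max() + 1e-9))
```

In a servoing loop, the network would be evaluated repeatedly on the descriptor difference between the current and goal images until the commanded displacement falls below a threshold.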
Language: eng
Access: closedAccess
Visual servoing
Robot control
Neural networks
Image features
Pattern recognition
Vision-based robot positioning using neural networks
article
CSIC Licence

For the Digital.CSIC repository to store and distribute the deposited digital object, the person making the deposit must read and accept the conditions set out in this licence:

The author(s) or copyright holder(s) of the deposited work, or where applicable the person delegated to do so, grants CSIC the non-exclusive right to distribute, store and preserve the deposited digital object in electronic format.

In the case of a work with more than one author, the depositor warrants that the deposit is made responsibly on behalf of, and with the consent of, the other co-authors.

The depositor declares that the work is original and is not subject to third-party copyright restrictions that would prevent granting CSIC the rights required by this licence.

If the deposited work contains material for which the author does not hold the copyright, the author declares that the necessary permission has been obtained from the copyright owner to grant CSIC the rights described in this licence, and that the copyright holder is clearly identified and acknowledged in the text or content of the deposited file.

The author accepts that CSIC may, without changing the content, convert the work to any medium or format for preservation purposes.

The author likewise accepts that CSIC may keep more than one copy of this work to ensure the security and preservation of the files.

CSIC will preserve and disseminate this work. Should it be unable to continue maintaining the file as part of the institutional repository, it reserves the right to return the content to the depositor. If this is not possible (because the community, collection, etc. no longer exists or the author cannot be located), the material may be archived as part of the institution's digital archive.

If the contribution is based on work funded or sponsored by organisations other than CSIC, the depositor declares that any rights and obligations set out in the contract or agreement with those organisations have been fulfilled.

The depositor's name will be clearly identified by CSIC as that of the author or owner of the contribution, and CSIC will make no alteration to the contribution other than the format changes permitted by this licence.