Stuttgarter Beiträge zur Produktionsforschung, Band 44
Ed.: Fraunhofer IPA, Stuttgart
2015, 159 pages, numerous figures and tables, softcover
Stuttgart, Univ., Diss., 2015
The objective of this thesis is the development of a model-based object recognition system for the six-degrees-of-freedom (6 DoF) localization of typical rigid household objects that explicitly enables intuitive teaching of new objects. Considered in its entirety, the perceptual process of object recognition can be divided into three main areas: data acquisition, object modeling and object localization. Each area is examined individually, and distinct contributions to each of them are presented and evaluated.
The input to the recognition system consists of one-shot images of range and color data. In data acquisition, it is usually taken for granted that a sensor directly delivers 2.5D data or color information. By combining different sensor modalities, however, it is possible to exceed the data quality of any single sensor. The thesis follows this idea and presents a novel sensor fusion technique for data acquisition that combines the 2.5D input data of a stereo and a range imaging system.
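The abstract does not specify how the two depth sources are combined; one common scheme, given here purely as an illustrative sketch (the function name and the per-pixel confidence weights are assumptions, not the thesis's method), is a confidence-weighted fusion of two registered depth maps:

```python
import numpy as np

def fuse_depth_maps(d_stereo, d_range, w_stereo, w_range):
    """Confidence-weighted fusion of two registered depth maps.

    d_stereo, d_range : depth images (NaN where a sensor has no measurement)
    w_stereo, w_range : confidence weights in [0, 1] (scalar or per-pixel)
    """
    d_stereo = np.asarray(d_stereo, dtype=float)
    d_range = np.asarray(d_range, dtype=float)
    # Zero out the confidence wherever the corresponding sensor has no data.
    w_s = np.where(np.isnan(d_stereo), 0.0, w_stereo)
    w_r = np.where(np.isnan(d_range), 0.0, w_range)
    total = w_s + w_r
    fused = np.full(d_stereo.shape, np.nan)
    valid = total > 0
    # Replace NaNs by 0 before the weighted sum; their weight is already 0.
    ds = np.nan_to_num(d_stereo)
    dr = np.nan_to_num(d_range)
    fused[valid] = (w_s[valid] * ds[valid] + w_r[valid] * dr[valid]) / total[valid]
    return fused
```

Pixels measured by only one sensor take that sensor's value, which is one way such a fusion can exceed the coverage of either modality alone.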
Regarding object modeling, the thesis presents a method for dense object modeling directly on the robot using its manipulator and camera system. Additionally, two stand-alone training setups are introduced that avoid the explicit need for a robotic manipulator: one uses a turntable to rotate the object in front of the camera system, the other a chessboard pattern around which the camera is moved manually while the object remains stationary. Initial work conducted within the scope of this thesis proposes a FastSLAM-based in-gripper object modeling approach that is able to cope with multiple occurrences of similar textures on the object's surface. This approach is developed further, and the information filter is replaced by a bundle adjustment algorithm that enables faster registration of the individual object views.
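In the turntable setup, each view can be brought into a common object frame using the known table angle. The following is a minimal sketch of that registration step, under the assumption (not stated in the abstract) that the point cloud is already expressed in a frame whose z-axis coincides with the calibrated turntable axis:

```python
import numpy as np

def register_turntable_view(points, angle_deg):
    """Rotate one camera view back into the common object frame.

    points    : (N, 3) point cloud in the turntable-axis-aligned frame
    angle_deg : table rotation at which this view was captured
    """
    a = np.deg2rad(-angle_deg)  # undo the table rotation about z
    rot_z = np.array([[np.cos(a), -np.sin(a), 0.0],
                      [np.sin(a),  np.cos(a), 0.0],
                      [0.0,        0.0,       1.0]])
    return np.asarray(points) @ rot_z.T
```

Chaining this over all captured angles yields roughly aligned views; a registration refinement such as the bundle adjustment mentioned above would then operate on these initial estimates.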
This thesis proposes two novel binary descriptors for textured and texture-less object modeling that enable the use of rapid bit operations to accelerate descriptor computation. For textured objects, fast-to-compute descriptors achieving remarkable recognition rates have recently been presented. This thesis proposes a scale-invariant extension of the binary feature descriptor ORB, which is fast to compute while remaining as descriptive as SURF. In order to describe texture-less objects distinctly, a global histogram-based descriptor is presented that aggregates 2D and 3D gradient information from a local binary descriptor. Compared to the current state of the art, the descriptor exhibits scale and rotation invariance. Additionally, the underlying binary descriptor is computed faster than competing methods through the use of dynamic programming.
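The speed advantage of binary descriptors comes from matching with bit operations: comparing two descriptors reduces to an XOR followed by a popcount, rather than per-dimension floating-point arithmetic as with SURF. A minimal sketch (the function name is illustrative):

```python
def hamming_distance(desc_a: bytes, desc_b: bytes) -> int:
    """Hamming distance between two packed binary descriptors.

    XOR marks the bits in which the descriptors differ; counting the
    set bits (popcount) yields the distance.  For a packed 256-bit
    descriptor such as ORB's, this amounts to a handful of word-sized
    XOR and popcount instructions on typical hardware.
    """
    a = int.from_bytes(desc_a, "big")
    b = int.from_bytes(desc_b, "big")
    return bin(a ^ b).count("1")
```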
In order to increase the robustness of texture-less object recognition, data association is subject to a spatial constraint that accounts for the spatial extent of an object. The thesis proposes an adaptive sliding window approach that builds a probability map of prominent object locations, from which the dominant object locations are selected by a non-maximum suppression algorithm.
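The abstract does not detail the windowing scheme; the basic idea can be sketched as accumulating match votes into a normalized map with a fixed window, then greedily picking peaks while suppressing their neighborhood (all names and the fixed window size are assumptions for illustration; the thesis's window is adaptive):

```python
import numpy as np

def vote_map(matches, shape, win=16):
    """Accumulate feature-match votes into a normalized probability map.

    Each match at image location (x, y) votes for a window of cells
    around it, approximating the spatial extent of the object.
    """
    h, w = shape
    votes = np.zeros((h, w))
    half = win // 2
    for x, y in matches:
        x0, x1 = max(0, x - half), min(w, x + half + 1)
        y0, y1 = max(0, y - half), min(h, y + half + 1)
        votes[y0:y1, x0:x1] += 1.0
    total = votes.sum()
    return votes / total if total > 0 else votes

def non_maximum_suppression(prob, radius=8, threshold=0.0):
    """Greedily select local maxima, zeroing a square around each peak."""
    prob = prob.copy()
    peaks = []
    while True:
        y, x = np.unravel_index(np.argmax(prob), prob.shape)
        if prob[y, x] <= threshold:
            break
        peaks.append((x, y))
        y0, y1 = max(0, y - radius), min(prob.shape[0], y + radius + 1)
        x0, x1 = max(0, x - radius), min(prob.shape[1], x + radius + 1)
        prob[y0:y1, x0:x1] = 0.0
    return peaks
```

Peaks come out ordered by vote mass, so the strongest object hypothesis is returned first.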
The different components have been integrated into a software framework for 6 DoF object recognition that has been implemented on the service robot Care-O-bot 3 using the middleware ROS. The software components for data acquisition, object modeling and object recognition are evaluated individually using standard datasets and typical real-world household objects such as plates, bottles or cups.
Fraunhofer IPA, computer vision, object recognition, object modeling, feature extraction, sensor fusion, household appliances, position determination, pose estimation, image processing, image recognition, data processing