Matching objects across the textured-smooth continuum

Oggie Arandelovic*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The problem of 3D object recognition is of immense practical importance, with the last decade witnessing a number of breakthroughs in the state of the art. Most of the previous work has focused on the matching of textured objects using local appearance descriptors extracted around salient image points. The recently proposed bag of boundaries method was the first to address directly the problem of matching smooth objects using boundary features. However, no previous work has attempted a holistic treatment of the problem that jointly uses textural and shape features, which is what we describe herein. Due to the complementarity of the two modalities, we fuse the corresponding matching scores and learn their relative weighting in a data-specific manner by optimizing discriminative performance on synthetically distorted data. For the textural description of an object we adopt a representation in the form of a histogram of SIFT-based visual words. Similarly, the apparent shape of an object is represented by a histogram of discretized features capturing local shape. On a large public database of a diverse set of objects, the proposed method is shown to significantly outperform both purely textural and purely shape-based approaches for matching across viewpoint variation.
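
The fusion step described in the abstract can be sketched as follows. This is a minimal illustration rather than the authors' implementation: the histogram-intersection similarity, the rank-1 accuracy criterion, and the grid search over a single weight `alpha` are assumptions introduced here; the abstract states only that the texture and shape matching scores are fused with a relative weighting learned by optimizing discriminative performance on synthetically distorted data.

```python
import numpy as np

def histogram_intersection(h1, h2):
    """Similarity between two L1-normalised histograms (higher is better)."""
    return np.minimum(h1, h2).sum()

def fused_score(tex_q, tex_g, shape_q, shape_g, alpha):
    """Weighted fusion of the textural and shape matching scores."""
    s_tex = histogram_intersection(tex_q, tex_g)
    s_shape = histogram_intersection(shape_q, shape_g)
    return alpha * s_tex + (1.0 - alpha) * s_shape

def learn_weight(queries, labels_q, gallery, labels_g,
                 alphas=np.linspace(0.0, 1.0, 101)):
    """Pick the fusion weight maximising rank-1 accuracy on a validation set.

    In the spirit of the paper, `queries` would hold (texture histogram,
    shape histogram) pairs computed from synthetically distorted views.
    """
    best_alpha, best_acc = 0.5, -1.0
    for alpha in alphas:
        correct = 0
        for (tex_q, shape_q), label in zip(queries, labels_q):
            scores = [fused_score(tex_q, tex_g, shape_q, shape_g, alpha)
                      for tex_g, shape_g in gallery]
            correct += labels_g[int(np.argmax(scores))] == label
        acc = correct / len(queries)
        if acc > best_acc:
            best_alpha, best_acc = alpha, acc
    return best_alpha
```

At test time, a query is matched to the gallery object with the highest fused score using the learned weight; the particular similarity measure and search grid above are placeholders for whatever the full paper specifies.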

Original language: English
Title of host publication: Australasian Conference on Robotics and Automation, ACRA
Publication status: Published - 2012
Event: 2012 Australasian Conference on Robotics and Automation, ACRA 2012 - Wellington, New Zealand
Duration: 3 Dec 2012 - 5 Dec 2012

Conference

Conference: 2012 Australasian Conference on Robotics and Automation, ACRA 2012
Country/Territory: New Zealand
City: Wellington
Period: 3/12/12 - 5/12/12
