Low-level representation of visual features

Author

Francisco Estrada

Abstract

The analysis and representation of visual information at specific image locations is a fundamental task in image analysis. A suitable representation of visual features enables reasoning that supports object categorization, image region classification, and image segmentation, among other tasks.


This work explores different types of representations for visual features, as well as their usefulness for image segmentation and object classification. Features are sampled over an input image, either uniformly or by a process that finds interesting image locations; at this stage, low-level saliency can be invoked to determine which parts of the image are most likely to contain salient objects. Once local features have been determined, the analysis of visual content is performed.
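As a concrete sketch of the uniform sampling option, the hypothetical helper below lays feature locations on a regular grid over the image; the step size and half-step offset are assumptions for illustration, not values from the project:

```python
import numpy as np

def sample_grid_locations(height, width, step=16):
    """Return (row, col) coordinates on a uniform grid over an image.

    Illustrative sketch of uniform feature sampling: one location per
    `step` x `step` cell, offset by half a step so samples sit at cell
    centres. The step value is an assumed parameter.
    """
    rows = np.arange(step // 2, height, step)
    cols = np.arange(step // 2, width, step)
    rr, cc = np.meshgrid(rows, cols, indexing="ij")
    return np.stack([rr.ravel(), cc.ravel()], axis=1)
```

For a 640x480 image with the default step, this yields a 30x40 grid of candidate locations; an interest-point detector would replace the grid with image-dependent positions.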


Colour and texture features are extracted using state-of-the-art techniques from the respective areas of image processing. These features then drive a custom region growing/merging algorithm that produces a segmentation of the image. Once a segmentation is available, a classifier, also based on colour/texture analysis, determines which of a small list of known keywords corresponds to each region. The end result is a list of keywords that describe the content of the different regions of the image. Throughout the process, the influence of different types of colour/texture features, colour space selection, and classification scheme is studied. This work builds on earlier classification research which is currently part of the PHAROS content-based multimedia search framework.
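A minimal sketch of the two decisions described above, under stated assumptions: the chi-squared histogram distance, the merge threshold, and the `keyword_centroids` mapping are illustrative stand-ins, since the project's exact merge criterion and classifier are not specified here:

```python
import numpy as np

def chi2_distance(h1, h2, eps=1e-10):
    """Chi-squared distance between two normalised histograms, a common
    dissimilarity measure for colour statistics (an assumed choice)."""
    return 0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps))

def should_merge(h1, h2, threshold=0.2):
    """Merge two adjacent regions if their colour histograms are close.
    The threshold is an assumed parameter, not taken from the project."""
    return chi2_distance(h1, h2) < threshold

def classify_region(feature, keyword_centroids):
    """Nearest-centroid keyword assignment over colour/texture features.

    `keyword_centroids` maps a keyword to a reference feature vector;
    both the mapping and the Euclidean metric are hypothetical.
    """
    return min(keyword_centroids,
               key=lambda kw: np.linalg.norm(feature - keyword_centroids[kw]))
```

A region-growing loop would repeatedly apply `should_merge` to adjacent region pairs until no pair falls under the threshold, then label each surviving region with `classify_region`.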


Original image
Resulting segmentation

Collaborations

Sabine Süsstrunk
Pascal Fua
Vincent Lepetit
Radhakrishna Achanta

Funding

This work is supported by the EU IC-PHAROS project (funded since Jan. 2007, ongoing). Please visit the PHAROS project main page for more information.