Localized information is necessary for scene categorization, including the Natural/Man-made distinction

Bibliographic Details
Published in: Journal of Vision (Charlottesville, Va.), Vol. 8, No. 1, pp. 4.1-4.9
Main Authors: Loschky, Lester C.; Larson, Adam M.
Format: Journal Article
Language: English
Published: United States, 11.01.2008
Summary: What information do people use to categorize scenes? Computational scene classification models have proposed that unlocalized amplitude information, the distribution of spatial frequencies and orientations, is useful for categorizing scenes. Previous research has provided conflicting results regarding this claim. Our previous research (Loschky et al., 2007) has shown that randomly localizing amplitude information (i.e., randomizing phase) greatly disrupts scene categorization at the basic level. Conversely, studies suggesting the usefulness of unlocalized amplitude information have used binary distinctions, e.g., Natural/Man-made. We hypothesized that unlocalized amplitude information contributes more to the Natural/Man-made distinction than to basic level distinctions. Using an established set of images and categories, we varied phase randomization and measured participants' ability to distinguish Natural versus Man-made scenes or scenes at the basic level. Results showed that eliminating localized information by phase randomization disrupted scene classification even for the Natural/Man-made distinction, demonstrating that amplitude localization is necessary for scene categorization.
ISSN: 1534-7362
DOI: 10.1167/8.1.4