DeepProp: Extracting Deep Features from a Single Image for Edit Propagation


Bibliographic Details
Published in: Computer Graphics Forum, Vol. 35, No. 2, pp. 189-201
Main Authors: Endo, Yuki; Iizuka, Satoshi; Kanamori, Yoshihiro; Mitani, Jun
Format: Journal Article
Language: English
Published: Oxford: Blackwell Publishing Ltd, 01.05.2016

Summary: Edit propagation is a technique that propagates various image edits (e.g., colorization and recoloring) performed via user strokes to the entire image based on similarity of image features. In most previous work, users must manually determine the importance of each image feature (e.g., color, coordinates, and textures) according to their needs and target images. We focus on representation learning that automatically learns feature representations only from user strokes in a single image, instead of tuning existing features manually. To this end, this paper proposes an edit propagation method using a deep neural network (DNN). Our DNN, which consists of several layers such as convolutional layers and a feature combiner, extracts stroke-adapted visual features and spatial features, and then adjusts their importance. We also develop a learning algorithm for our DNN that does not suffer from the vanishing gradient problem, and hence avoids falling into undesirable locally optimal solutions. We demonstrate that edit propagation with deep features, without manual feature tuning, can achieve better results than previous work.
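
As a rough illustration of the idea described in the summary, the sketch below combines a convolutional branch over a local image patch (visual features) with a small network over normalized pixel coordinates (spatial features) through learnable branch weights, trains only on stroke-labeled pixels, and is then evaluated at every pixel to propagate the edit. The choice of PyTorch, all layer sizes, and all names (EditPropagationNet, branch_weights, patch_size) are illustrative assumptions, not the paper's exact architecture or learning algorithm.

# Minimal sketch, assuming PyTorch and a binary "edit applies here" label per stroke pixel.
import torch
import torch.nn as nn

class EditPropagationNet(nn.Module):
    def __init__(self, patch_size=16):
        super().__init__()
        # Visual branch: small conv stack over an RGB patch around the pixel.
        self.visual = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),          # -> (B, 32)
        )
        # Spatial branch: MLP over normalized (x, y) pixel coordinates.
        self.spatial = nn.Sequential(
            nn.Linear(2, 16), nn.ReLU(),
            nn.Linear(16, 32), nn.ReLU(),
        )
        # Learnable per-branch weights standing in for the paper's
        # "feature combiner" that adjusts feature importance.
        self.branch_weights = nn.Parameter(torch.ones(2))
        self.classifier = nn.Sequential(
            nn.Linear(64, 32), nn.ReLU(),
            nn.Linear(32, 1),                               # stroke-class score
        )

    def forward(self, patches, coords):
        v = self.visual(patches) * self.branch_weights[0]
        s = self.spatial(coords) * self.branch_weights[1]
        return self.classifier(torch.cat([v, s], dim=1))

# Training would use only pixels covered by user strokes as labeled samples
# (e.g., with BCEWithLogitsLoss); the trained net is then evaluated at every
# pixel to propagate the edit across the whole image.
net = EditPropagationNet()
patches = torch.randn(8, 3, 16, 16)   # 8 sample patches around candidate pixels
coords = torch.rand(8, 2)             # normalized (x, y) per sample
scores = net(patches, coords)         # (8, 1) propagation scores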
Bibliography: istex:3743B76776C9C1E51D8133FD261B1B78A45738C9
ark:/67375/WNG-J63WNXLB-9
ArticleID: CGF12822
ISSN: 0167-7055, 1467-8659
DOI: 10.1111/cgf.12822