Online Clothing Recommendation and Style Compatibility Learning Based on Joint Semantic Feature Fusion
| Published in | 东华大学学报(英文版) (Journal of Donghua University, English Edition), Vol. 39, No. 4, pp. 325-331 |
|---|---|
| Main Authors | , , , |
| Format | Journal Article |
| Language | English |
| Published | 30.08.2022 |
| Affiliations | Engineering Research Center of Digitalized Textile & Fashion Technology, Donghua University, Shanghai 201620, China; College of Information Science and Technology, Donghua University, Shanghai 201620, China |
| Subjects | TP399 (Chinese Library Classification) |
| ISSN | 1672-5220 |
| DOI | 10.19884/j.1672-5220.202202345 |
Summary: Clothing plays an important role in social life, as it can enhance a person's image, and choosing which item to add to a set of fashion items so that the outfit is well coordinated and compatible is a practical problem. Motivated by this goal, an end-to-end clothing collocation learning framework is developed to handle this task. The proposed framework first extracts features by fusing deep-layer features from Inception-V3 with features from the classification branch of a mask regional convolutional neural network (Mask R-CNN), so that both low-level texture information and high-level semantic information are preserved. It then treats collocation outfits as sequences and adopts a bidirectional long short-term memory (Bi-LSTM) network for prediction. Extensive simulations are conducted on the DeepFashion2 dataset, and the results verify the effectiveness of the proposed method compared with other state-of-the-art clothing collocation methods.
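The summary describes a two-stream design: per-item features from two backbones are fused, and the outfit is modeled as a sequence with a Bi-LSTM. Below is a minimal PyTorch sketch of that idea, assuming concatenation as the fusion step and a next-item classification head; the feature dimensions (2048 for Inception-V3 pooled features, 1024 for the Mask R-CNN box-head features), the hidden size, the vocabulary size, and all names are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch of the fusion + Bi-LSTM pipeline from the abstract.
import torch
import torch.nn as nn

class OutfitCompatibilityModel(nn.Module):
    def __init__(self, cnn_dim=2048, det_dim=1024, hidden=512, vocab=1000):
        super().__init__()
        # Fuse low-level texture features (e.g. Inception-V3 pooled layer)
        # with high-level semantic features (e.g. Mask R-CNN classification
        # branch) by concatenation followed by a linear projection.
        self.fuse = nn.Linear(cnn_dim + det_dim, hidden)
        # Treat the outfit as a sequence of items, modeled bidirectionally.
        self.bilstm = nn.LSTM(hidden, hidden, batch_first=True,
                              bidirectional=True)
        # Score candidate items at each position of the sequence.
        self.head = nn.Linear(2 * hidden, vocab)

    def forward(self, inception_feats, maskrcnn_feats):
        # inception_feats: (batch, seq_len, cnn_dim)
        # maskrcnn_feats:  (batch, seq_len, det_dim)
        x = torch.cat([inception_feats, maskrcnn_feats], dim=-1)
        x = torch.relu(self.fuse(x))
        out, _ = self.bilstm(x)   # (batch, seq_len, 2 * hidden)
        return self.head(out)     # per-position item scores

# Usage with random stand-in features for two 4-item outfits:
model = OutfitCompatibilityModel()
scores = model(torch.randn(2, 4, 2048), torch.randn(2, 4, 1024))
print(scores.shape)  # torch.Size([2, 4, 1000])
```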