Prediction gradients for feature extraction and analysis from convolutional neural networks
| Published in | 2015 11th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG), Vol. 1, pp. 1-6 |
|---|---|
| Main Authors | , , |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 01.05.2015 |
Summary: Despite their impact on computer vision and face recognition, the inner workings of deep convolutional neural networks (CNNs) have traditionally been regarded as uninterpretable. We demonstrate this to be false by proposing prediction gradients to understand how neural networks encode concepts into individual units. In contrast, existing efforts to understand convolutional nets focus on visualizing units and classes in pixel space, often using optimization. Our method for calculating prediction gradients is very efficient and provides an effective technique to rank and quantify the importance of internal units and their learned features based on each unit's relevance to any prediction. We use prediction gradients to analyze the features learned by a CNN on a standard face recognition data set. Our analysis identifies strong patterns of activation that are unique to each identity. In addition, we validate the ranking produced by prediction gradients by removing the most important features of the network, knocking out their respective units, and demonstrating the detrimental effect on network prediction. Our experiments validate the utility of the prediction gradient in understanding the importance of, and relationships between, units inside a convolutional neural network.
DOI: 10.1109/FG.2015.7163154
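The record above gives only the abstract, but the core idea it describes, ranking internal units by the gradient of a prediction with respect to their activations and then validating the ranking by knocking the top units out, can be sketched on a toy network. Everything below (the two-layer network, the |gradient x activation| importance score) is our illustrative assumption, not the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "network": one hidden layer with ReLU, linear readout to a scalar
# prediction. Stands in for one class score of a CNN.
W1 = rng.normal(size=(8, 5))   # 8 hidden units, 5 input features
w2 = rng.normal(size=8)        # readout weights
x = rng.normal(size=5)

h = np.maximum(W1 @ x, 0.0)    # hidden unit activations
prediction = w2 @ h

# Gradient of the prediction w.r.t. each hidden activation. Because the
# readout is linear, it is just w2 here; a real CNN would use autodiff.
grad = w2

# One plausible importance score: |gradient * activation|, so a unit is
# important only if it both fires and influences the prediction.
importance = np.abs(grad * h)
ranking = np.argsort(importance)[::-1]   # most important unit first

# Knockout test from the abstract: zero the top-ranked unit and observe
# the effect on the prediction.
h_ko = h.copy()
h_ko[ranking[0]] = 0.0
prediction_ko = w2 @ h_ko
```

In this linear-readout toy the prediction shifts by exactly the knocked-out unit's contribution, `w2[top] * h[top]`; in a deep network the gradient would be computed by backpropagation and the knockout effect measured empirically, which is the validation the abstract reports.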