Power Normalizations in Fine-grained Image, Few-shot Image and Graph Classification
Authors
Koniusz, Piotr
Zhang, Hongguang
Publisher
Institute of Electrical and Electronics Engineers (IEEE Inc)
Abstract
Power Normalizations (PN) are useful non-linear operators that tackle feature imbalances in classification problems. We study PNs in deep learning via a novel PN layer that combines the feature vectors and their respective spatial locations in the feature maps produced by the last convolutional layer of a CNN into a positive definite matrix with second-order statistics, to which PN operators are applied, forming so-called Second-order Pooling (SOP). As the main goal of this paper is to study Power Normalizations, we investigate the role and meaning of MaxExp and Gamma, two popular PN functions. To this end, we provide probabilistic interpretations of these element-wise operators and discover surrogates with well-behaved derivatives for end-to-end training. Furthermore, we examine the spectral applicability of MaxExp and Gamma by studying Spectral Power Normalizations (SPN). We show that SPN on the autocorrelation/covariance matrix and the Heat Diffusion Process (HDP) on a graph Laplacian matrix are closely related, thus sharing their properties. This finding leads to the culmination of our work: a fast spectral MaxExp, a variant of HDP for covariance/autocorrelation matrices. We evaluate our ideas on fine-grained recognition, scene recognition, and material classification, as well as on few-shot learning and graph classification.
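The abstract describes three ingredients: pooling CNN feature vectors into a second-order (autocorrelation) matrix, the element-wise MaxExp and Gamma PN operators, and a spectral MaxExp applied to the matrix's eigenvalues. A minimal NumPy sketch of these ideas follows, assuming the commonly cited forms g(p) = 1 - (1 - p)^η for MaxExp and g(p) = p^γ for Gamma; the parameter names, the eigenvalue rescaling, and the use of a plain eigendecomposition (rather than the paper's fast variant) are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def second_order_pooling(features):
    """Second-order Pooling (SOP): aggregate N d-dimensional feature
    vectors (rows of `features`, e.g. from the last conv layer) into a
    d x d autocorrelation matrix."""
    n = features.shape[0]
    return features.T @ features / n

def maxexp(M, eta=10.0):
    """Element-wise MaxExp PN: g(p) = 1 - (1 - p)^eta for p in [0, 1].
    Boosts small co-occurrence probabilities toward 1."""
    return 1.0 - (1.0 - np.clip(M, 0.0, 1.0)) ** eta

def gamma_pn(M, gamma=0.5):
    """Element-wise Gamma PN: g(p) = p^gamma (signed power/root
    normalization), which flattens large values relative to small ones."""
    return np.sign(M) * np.abs(M) ** gamma

def spectral_maxexp(M, eta=10.0):
    """Spectral MaxExp (SPN): apply MaxExp to the eigenvalues of a PSD
    matrix. Computed here via an explicit eigendecomposition; the paper's
    contribution is a fast variant that avoids this step."""
    w, V = np.linalg.eigh(M)
    w = w / max(w.max(), 1e-12)            # scale spectrum into [0, 1] (illustrative)
    g = 1.0 - (1.0 - np.clip(w, 0.0, 1.0)) ** eta
    return (V * g) @ V.T                   # V @ diag(g) @ V.T
```

For example, `spectral_maxexp(second_order_pooling(F))` pools a feature map `F` of shape (N, d) and power-normalizes its spectrum, yielding a d x d matrix whose eigenvalues all lie in [0, 1].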
Source
IEEE Transactions on Pattern Analysis and Machine Intelligence
Restricted until
2099-12-31