KLDA - An Iterative Approach to Fisher Discriminant Analysis

Date

2007

Authors

Lu, Fangfang
Li, Hongdong

Publisher

Institute of Electrical and Electronics Engineers (IEEE Inc)

Abstract

In this paper, we present an iterative approach to Fisher discriminant analysis, called Kullback-Leibler discriminant analysis (KLDA), for both linear and nonlinear feature extraction. We recast the conventional problem of discriminative feature extraction as one of function optimization and recover the feature transformation matrix by maximizing the objective function. The proposed objective function is defined over the pairwise distances between all pairs of classes, with the Kullback-Leibler divergence adopted to measure the disparity between the distributions of each pair. The proposed algorithm extends naturally to nonlinear data via the kernel trick. Experimental results on real-world databases demonstrate the effectiveness of both the linear and kernel versions of our algorithm.
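The objective described in the abstract can be illustrated with a minimal sketch. Assuming Gaussian class-conditional densities in the projected space (an assumption of this sketch, not stated in the abstract), the KL divergence between each pair of projected class distributions has a closed form, and the objective is their sum over all class pairs. The iterative maximization of this objective with respect to the transformation matrix `W`, which is the core of the paper's method, is not reproduced here; the function and variable names are illustrative only.

```python
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence KL(N(mu0, cov0) || N(mu1, cov1))."""
    d = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(cov1_inv @ cov0)
                  + diff @ cov1_inv @ diff
                  - d
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

def klda_objective(W, X, y):
    """Sum of symmetrised KL divergences between all pairs of projected
    class distributions, each modelled as a Gaussian (sketch assumption)."""
    Z = X @ W                                   # project samples: (n, k)
    classes = np.unique(y)
    stats = [(Z[y == c].mean(axis=0), np.cov(Z[y == c], rowvar=False))
             for c in classes]
    J = 0.0
    for i in range(len(classes)):
        for j in range(i + 1, len(classes)):
            mu_i, cov_i = stats[i]
            mu_j, cov_j = stats[j]
            # symmetrise so the objective is independent of class ordering
            J += gaussian_kl(mu_i, cov_i, mu_j, cov_j) \
               + gaussian_kl(mu_j, cov_j, mu_i, cov_i)
    return J

# Toy two-class data with separated means; W here is a random (not
# optimized) projection, so this only evaluates the objective once.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, size=(60, 4)),
               rng.normal(2.0, 1.0, size=(60, 4))])
y = np.array([0] * 60 + [1] * 60)
W = rng.normal(size=(4, 2))
J = klda_objective(W, X, y)
```

In the paper's setting, `W` would be updated iteratively to increase `J`; the kernel version would replace the linear projection `X @ W` with an expansion in kernel evaluations.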

Keywords

Feature transformations; Fisher discriminant analysis; Function optimization; International conferences; Iterative approaches; Kernel Fisher discriminant analysis; Kernel tricks; Kullback-Leibler divergence; Linear discriminant analysis; Nonlinear data; Optimization

Source

Proceedings of the 2007 IEEE International Conference on Image Processing (ICIP-2007)

Type

Conference paper

Restricted until

2037-12-31