Tensor Graph-Optimized Linear Discriminant Analysis

Journal of Digital Information Management (article excerpt)

1. Introduction

High-dimensional data not only increase computational complexity but also contain plenty of redundant information, so dimensionality reduction is an essential preprocessing step for data mining. The goal of dimensionality reduction is to map high-dimensional data into a low-dimensional space while losing as little of the essential feature information as possible. Linear Discriminant Analysis (LDA) [1], also known as Fisher Discriminant Analysis (FDA) [2], is a common dimensionality reduction method. LDA attempts to preserve the separability between classes as much as possible, has good discriminant performance, and has attracted growing attention from researchers. Representative algorithms include Pseudo-inverse Linear Discriminant Analysis (PLDA) [3], Regularized Linear Discriminant Analysis (RLDA) [4], Penalized Discriminant Analysis (PDA) [5], LDA/GSVD [6], LDA/QR [7], Orthogonal Linear Discriminant Analysis (OLDA) [8], Null Space Linear Discriminant Analysis (NLDA) [9], Direct Linear Discriminant Analysis (DLDA) [10], Nonparametric Discriminant Analysis (NDA) [11], Local Fisher Discriminant Analysis (LFDA) [12], Multi-label Linear Discriminant Analysis (MLDA) [13] and Local Linear Discriminant Analysis (LLDA) [14]. However, the effectiveness of these algorithms is still limited because the number of available projection directions is lower than the number of classes. Moreover, these algorithms assume that the data approximately obey a Gaussian distribution, which cannot always be satisfied in real-world applications.

Inspired by the success of graph-based embedding dimensionality reduction techniques, Cai et al. [15] proposed a novel supervised dimensionality reduction algorithm called Graph-based Fisher Analysis (GbFA). GbFA does not require the data to obey a Gaussian distribution. It redefines the intrinsic graph over same-class samples and the penalty graph over not-same-class samples: originally neighboring same-class samples are pulled closer together in the output space, while originally neighboring not-same-class samples are pushed apart, so GbFA encodes the discriminating information. However, GbFA needs to flatten a two- or higher-dimensional feature matrix into a feature vector, which loses the spatial relations among pixels in face images. Researchers therefore proposed tensor versions of LDA: He et al. [16] proposed Tensor Linear Discriminative Analysis (TLDA), and, building on LFDA, Zhao et al. [17] proposed Tensor Locally Linear Discriminative Analysis (TLLDA).

Inspired by the above analyses, a dimensionality reduction algorithm called Tensor Graph-based Linear Discriminant Analysis (TGbLDA) is proposed in this paper. TGbLDA regards a two-dimensional face image as second-order tensor data and obtains two projection matrices through an iterative loop. The projected data preserve not only the graph-based discriminant information but also the spatial relations of pixels in face images. Experiments on the Yale and YaleB databases demonstrate that our algorithm is effective. A minimal sketch of this alternating scheme is given below.
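For concreteness, here is a minimal sketch of the alternating optimization that such second-order tensor methods typically use: with the right projection V fixed, the left projection U is updated from generalized eigenvectors of between/within scatter matrices, and vice versa. The exact graph-weighted objective of TGbLDA is not fully reproduced in this excerpt, so the `scatter_pair` helper below uses plain class-mean LDA scatters as a stand-in, and all function names and the regularization parameter are illustrative assumptions.

```python
import numpy as np

def scatter_pair(A, y):
    """Between/within class scatter matrices for matrix-valued samples.

    Plain class-mean LDA scatters are used as a stand-in here; the actual
    TGbLDA objective weights sample pairs with the graphs of Section 2.
    """
    A = np.asarray(A)                      # (n, d, r) stack
    y = np.asarray(y)
    mean_all = A.mean(axis=0)
    d = A.shape[1]
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Ac = A[y == c]
        mc = Ac.mean(axis=0)
        Sb += len(Ac) * (mc - mean_all) @ (mc - mean_all).T
        for Ai in Ac:
            Sw += (Ai - mc) @ (Ai - mc).T
    return Sb, Sw

def leading_eigvecs(Sb, Sw, k, reg=1e-6):
    """Top-k generalized eigenvectors of Sb v = lambda Sw v."""
    Sw = Sw + reg * np.eye(Sw.shape[0])    # regularize for numerical stability
    lam, V = np.linalg.eig(np.linalg.solve(Sw, Sb))
    idx = np.argsort(-lam.real)[:k]
    return V[:, idx].real

def tensor_lda_2d(X, y, k1, k2, n_iter=10):
    """Alternately update the left (U) and right (V) projections.

    X : (n, h, w) stack of face images treated as second-order tensors.
    Each image is finally embedded as U.T @ X[i] @ V, a k1 x k2 matrix.
    """
    n, h, w = X.shape
    U = np.eye(h)[:, :k1]                  # identity initialization
    V = np.eye(w)[:, :k2]
    for _ in range(n_iter):
        Sb, Sw = scatter_pair([Xi @ V for Xi in X], y)    # V fixed, solve U
        U = leading_eigvecs(Sb, Sw, k1)
        Sb, Sw = scatter_pair([Xi.T @ U for Xi in X], y)  # U fixed, solve V
        V = leading_eigvecs(Sb, Sw, k2)
    return U, V
```

Under these assumptions, a call such as `U, V = tensor_lda_2d(images, labels, k1=10, k2=10)` would reduce a stack of 32x32 images to 10x10 feature matrices without ever flattening them into vectors.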

2. Graph-based Fisher Analysis (GbFA)

The objective function of GbFA is obtained as follows:

(1) Firstly, according to the theory of graph optimization, GbFA constructs an intrinsic graph $G^c = \{X, W^c\}$ and a penalty graph $G^p = \{X, W^p\}$ as follows:

$$W^{c}_{ij} = \begin{cases} \exp\left(-\|x_i - x_j\|^2 / t\right), & \text{if } x_i \text{ and } x_j \text{ belong to the same class} \\ 0, & \text{otherwise} \end{cases} \quad (1)$$

$$W^{p}_{ij} = \begin{cases} \exp\left(-\|x_i - x_j\|^2 / t\right), & \text{if } x_i \text{ and } x_j \text{ belong to different classes} \\ 0, & \text{otherwise} \end{cases} \quad (2)$$

where $t \in \mathbb{R}$ is the kernel parameter, $W^{c}_{ij}$ indicates the importance degree of $x_i$ and $x_j$ in the same class, and $W^{p}_{ij}$ indicates the importance degree of $x_i$ and $x_j$ in different classes.
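Assuming the heat-kernel form reconstructed in Eqs. (1) and (2), the two weight matrices can be built as in the following sketch; the function name `gbfa_graphs` and the default value of `t` are illustrative, not from the paper.

```python
import numpy as np

def gbfa_graphs(X, y, t=1.0):
    """Build the intrinsic (same-class) and penalty (different-class)
    weight matrices with the heat-kernel form of Eqs. (1)-(2).

    X : (n, d) samples as rows; y : (n,) class labels; t : kernel parameter.
    """
    y = np.asarray(y)
    diffs = X[:, None, :] - X[None, :, :]   # pairwise differences x_i - x_j
    sq = (diffs ** 2).sum(axis=-1)          # squared distances ||x_i - x_j||^2
    heat = np.exp(-sq / t)
    same = y[:, None] == y[None, :]         # same-class indicator matrix
    Wc = np.where(same, heat, 0.0)          # Eq. (1): same-class pairs
    Wp = np.where(~same, heat, 0.0)         # Eq. (2): different-class pairs
    return Wc, Wp
```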

(2) Secondly, with the embedding map $T$, the intra-class compactness can be characterized from the intrinsic graph and the inter-class separability from the penalty graph; GbFA defines the corresponding sums of squared norms as follows:

$$S_c = \sum_{i,j} \left\| T^{\top} x_i - T^{\top} x_j \right\|^2 W^{c}_{ij} \quad (3)$$

$$S_p = \sum_{i,j} \left\| T^{\top} x_i - T^{\top} x_j \right\|^2 W^{p}_{ij} \quad (4)$$

Equation (3) can be further transformed into

$$S_c = 2\,\mathrm{tr}\!\left( T^{\top} X \left( D^c - W^c \right) X^{\top} T \right) \quad (5)$$

where $D^c$ is a diagonal matrix whose entries are the row sums of $W^c$, i.e., $D^{c}_{ii} = \sum_j W^{c}_{ij}$.
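As a quick numerical sanity check of the trace form reconstructed in Eq. (5), the following script compares the pairwise sum of Eq. (3) against $2\,\mathrm{tr}(T^{\top} X (D - W) X^{\top} T)$ for a random symmetric weight matrix; all names and sizes here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 8, 5, 2
X = rng.normal(size=(d, n))                 # columns are samples, as in Eq. (5)
T = rng.normal(size=(d, k))                 # embedding map
W = rng.random((n, n))
W = (W + W.T) / 2                           # symmetric weights, like Eqs. (1)-(2)

# Left side: pairwise sum of squared norms, Eq. (3)
lhs = sum(W[i, j] * np.sum((T.T @ X[:, i] - T.T @ X[:, j]) ** 2)
          for i in range(n) for j in range(n))

# Right side: trace form of Eq. (5), with D_ii = sum_j W_ij
D = np.diag(W.sum(axis=1))
rhs = 2 * np.trace(T.T @ X @ (D - W) @ X.T @ T)

assert np.isclose(lhs, rhs)                 # the two forms agree
```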
