Kernel Fisher Discriminant Analysis

In statistics, kernel Fisher discriminant analysis (KFD), also known as generalized discriminant analysis and kernel discriminant analysis, is a kernelized version of linear discriminant analysis (LDA).

It is named after Ronald Fisher.

Linear discriminant analysis

Intuitively, the idea of LDA is to find a projection where class separation is maximized. Given two sets of labeled data, C_1 and C_2, we can calculate the mean value of each class, m_1 and m_2, as

    m_i = \frac{1}{l_i} \sum_{n=1}^{l_i} x_n^i,

where l_i is the number of examples of class C_i. The goal of linear discriminant analysis is to give a large separation of the class means while also keeping the in-class variance small. This is formulated as maximizing, with respect to w, the following ratio:

    J(w) = \frac{w^T S_B w}{w^T S_W w},

where S_B is the between-class covariance matrix and S_W is the total within-class covariance matrix:

    S_B = (m_2 - m_1)(m_2 - m_1)^T, \qquad S_W = \sum_{i=1,2} \sum_{n=1}^{l_i} (x_n^i - m_i)(x_n^i - m_i)^T.

The maximum of the above ratio is attained at

    w \propto S_W^{-1} (m_2 - m_1),

as can be shown by the Lagrange multiplier method (sketch of proof):

Maximizing J(w) is equivalent to maximizing

    w^T S_B w

subject to

    w^T S_W w = 1.

This, in turn, is equivalent to maximizing I(w, \lambda) = w^T S_B w - \lambda (w^T S_W w - 1), where \lambda is the Lagrange multiplier.

At the maximum, the derivatives of I(w, \lambda) with respect to w and \lambda must be zero. Taking \frac{\partial I}{\partial w} = 0 yields

    S_B w - \lambda S_W w = 0,

which is trivially satisfied by w = c \, S_W^{-1} (m_2 - m_1) and \lambda = (m_2 - m_1)^T S_W^{-1} (m_2 - m_1).
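As a quick numerical illustration of the closed-form direction above, the sketch below computes w = S_W^{-1}(m_2 - m_1) on synthetic Gaussian data (the data and all names here are illustrative, not part of the original derivation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic labeled classes in 2-D (illustrative data only).
X1 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(40, 2))
X2 = rng.normal(loc=[2.0, 1.0], scale=0.5, size=(50, 2))

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)

# Within-class scatter: S_W = sum_i sum_n (x_n^i - m_i)(x_n^i - m_i)^T
S_W = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)

# The maximizing direction: w proportional to S_W^{-1} (m_2 - m_1)
w = np.linalg.solve(S_W, m2 - m1)

def fisher_ratio(v):
    """J(v) = (v^T S_B v) / (v^T S_W v) with S_B = (m2 - m1)(m2 - m1)^T."""
    d = m2 - m1
    return float((v @ d) ** 2 / (v @ S_W @ v))
```

No random direction achieves a larger ratio J than w, which is exactly what the Lagrange-multiplier argument above establishes.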

Extending LDA

To extend LDA to non-linear mappings, the data, given as the \ell points x_i, can be mapped to a new feature space, F, via some function \phi. In this new feature space, the function that needs to be maximized is

    J(w) = \frac{w^T S_B^\phi w}{w^T S_W^\phi w},

where

    S_B^\phi = (m_2^\phi - m_1^\phi)(m_2^\phi - m_1^\phi)^T

and

    S_W^\phi = \sum_{i=1,2} \sum_{n=1}^{l_i} (\phi(x_n^i) - m_i^\phi)(\phi(x_n^i) - m_i^\phi)^T.

Further, note that m_i^\phi = \frac{1}{l_i} \sum_{j=1}^{l_i} \phi(x_j^i). Explicitly computing the mappings \phi(x_i) and then performing LDA can be computationally expensive, and in many cases intractable. For example, F may be infinite-dimensional. Thus, rather than explicitly mapping the data to F, the data can be implicitly embedded by rewriting the algorithm in terms of dot products and using the kernel trick, in which the dot product in the new feature space is replaced by a kernel function, k(x, y) = \phi(x) \cdot \phi(y).
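For example, one common (here purely illustrative) choice is the Gaussian RBF kernel, whose feature space is infinite-dimensional yet whose value is cheap to evaluate:

```python
import numpy as np

def rbf_kernel(x, y, sigma=1.0):
    """Gaussian RBF kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2)).

    Equals the dot product phi(x) . phi(y) in an infinite-dimensional
    feature space, so phi itself is never computed.
    """
    diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.exp(-diff @ diff / (2.0 * sigma ** 2)))
```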

LDA can be reformulated in terms of dot products by first noting that w will have an expansion of the form

    w = \sum_{i=1}^{\ell} \alpha_i \phi(x_i).

Then note that

    w^T m_i^\phi = \frac{1}{l_i} \sum_{j=1}^{\ell} \sum_{k=1}^{l_i} \alpha_j \, k(x_j, x_k^i) = \alpha^T M_i,

where

    (M_i)_j = \frac{1}{l_i} \sum_{k=1}^{l_i} k(x_j, x_k^i).

The numerator of J(w) can then be written as:

    w^T S_B^\phi w = \alpha^T M \alpha, \qquad M = (M_2 - M_1)(M_2 - M_1)^T.

Similarly, the denominator can be written as

    w^T S_W^\phi w = \alpha^T N \alpha, \qquad N = \sum_{j=1,2} K_j (I - 1_{l_j}) K_j^T,

with the (n, m)-th component of K_j defined as (K_j)_{nm} = k(x_n, x_m^j), I the identity matrix, and 1_{l_j} the matrix with all entries equal to 1/l_j. This identity can be derived by starting out with the expression for w^T S_W^\phi w and using the expansion of w and the definitions of S_W^\phi and m_i^\phi:

    w^T S_W^\phi w = \sum_{j=1,2} \sum_{n=1}^{l_j} \left( w^T \phi(x_n^j) - w^T m_j^\phi \right)^2 = \sum_{j=1,2} \left( \alpha^T K_j K_j^T \alpha - \alpha^T K_j 1_{l_j} K_j^T \alpha \right) = \alpha^T N \alpha.

With these equations for the numerator and denominator of J(w), the equation for J can be rewritten as

    J(\alpha) = \frac{\alpha^T M \alpha}{\alpha^T N \alpha}.

Then, differentiating and setting equal to zero gives

    (\alpha^T M \alpha) N \alpha = (\alpha^T N \alpha) M \alpha.

Since only the direction of w, and hence the direction of \alpha, matters, the above can be solved for \alpha as

    \alpha = N^{-1} (M_2 - M_1).

Note that in practice, N is usually singular and so a multiple of the identity is added to it:

    N_\epsilon = N + \epsilon I.

Given the solution for \alpha, the projection of a new data point is given by

    y(x) = (w \cdot \phi(x)) = \sum_{i=1}^{\ell} \alpha_i \, k(x_i, x).
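The two-class algorithm above can be sketched end to end. This is a minimal illustration, not a reference implementation: the function names are invented here, the kernel is left as a parameter, and N is regularized as N + \epsilon I as discussed above.

```python
import numpy as np

def kfd_fit(X1, X2, kernel, eps=1e-3):
    """Two-class kernel Fisher discriminant (illustrative sketch).

    Returns alpha = (N + eps*I)^{-1} (M_2 - M_1) and the pooled training
    data; together they define the projection y(x) = sum_i alpha_i k(x_i, x).
    """
    X = np.vstack([X1, X2])          # all l training points
    l = len(X)

    # Gram blocks: (K_j)_{nm} = k(x_n, x_m^j) for class j.
    def gram(Xj):
        return np.array([[kernel(X[n], Xj[m]) for m in range(len(Xj))]
                         for n in range(l)])

    K1, K2 = gram(X1), gram(X2)

    # (M_j)_i = (1/l_j) sum_k k(x_i, x_k^j): row means of K_j.
    M1, M2 = K1.mean(axis=1), K2.mean(axis=1)

    # N = sum_j K_j (I - 1_{l_j}) K_j^T, regularized with eps * I.
    N = np.zeros((l, l))
    for Kj, lj in ((K1, len(X1)), (K2, len(X2))):
        N += Kj @ (np.eye(lj) - np.full((lj, lj), 1.0 / lj)) @ Kj.T
    N += eps * np.eye(l)

    alpha = np.linalg.solve(N, M2 - M1)
    return alpha, X

def kfd_project(x, alpha, X, kernel):
    # y(x) = sum_i alpha_i k(x_i, x)
    return sum(a * kernel(xi, x) for a, xi in zip(alpha, X))
```

Because \alpha is solved for M_2 - M_1 with a positive-definite regularized N, the mean projection of class 2 always exceeds that of class 1 on the training data.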

Multi-class KFD

The extension to cases where there are more than two classes is relatively straightforward. Let c be the number of classes. Then multi-class KFD involves projecting the data into a (c - 1)-dimensional space using (c - 1) discriminant functions

    y_i = w_i^T \phi(x), \qquad i = 1, \ldots, c - 1.

This can be written in matrix notation

    y = W^T \phi(x),

where the w_i are the columns of W. Further, the between-class covariance matrix is now

    S_B^\phi = \sum_{i=1}^{c} l_i (m_i^\phi - m^\phi)(m_i^\phi - m^\phi)^T,

where m^\phi is the mean of all the data in the new feature space. The within-class covariance matrix is

    S_W^\phi = \sum_{i=1}^{c} \sum_{n=1}^{l_i} (\phi(x_n^i) - m_i^\phi)(\phi(x_n^i) - m_i^\phi)^T.

The solution is now obtained by maximizing

    J(W) = \frac{\left| W^T S_B^\phi W \right|}{\left| W^T S_W^\phi W \right|}.

The kernel trick can again be used and the goal of multi-class KFD becomes

    A^* = \underset{A}{\operatorname{argmax}} \; \frac{\left| A^T M A \right|}{\left| A^T N A \right|},

where A = [\alpha_1, \ldots, \alpha_{c-1}] and

    M = \sum_{j=1}^{c} l_j (M_j - M_*)(M_j - M_*)^T, \qquad N = \sum_{j=1}^{c} K_j (I - 1_{l_j}) K_j^T.

The M_j are defined as in the above section and M_* is defined as

    (M_*)_j = \frac{1}{\ell} \sum_{k=1}^{\ell} k(x_j, x_k).

A^* can then be computed by finding the (c - 1) leading eigenvectors of N^{-1} M. Furthermore, the projection of a new input, x_t, is given by

    y(x_t) = (A^*)^T K_t,

where the ith component of K_t is given by k(x_i, x_t).
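Assuming M and N have already been assembled as above, extracting the c - 1 leading eigenvectors of N^{-1}M might be sketched as follows (the regularization mirrors the two-class case; the function name is invented here):

```python
import numpy as np

def leading_discriminants(M, N, c, eps=1e-3):
    """Columns of A: the (c-1) leading eigenvectors of N^{-1} M.

    N is regularized as N + eps*I before inversion. The product is in
    general non-symmetric, so we take eigenvalues' real parts, sort
    them in decreasing order, and keep the top c - 1 eigenvectors.
    """
    l = N.shape[0]
    T = np.linalg.solve(N + eps * np.eye(l), M)
    vals, vecs = np.linalg.eig(T)
    order = np.argsort(vals.real)[::-1]    # largest eigenvalues first
    return vecs[:, order[:c - 1]].real     # A has shape (l, c-1)
```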

Classification using KFD

In both two-class and multi-class KFD, the class label of a new input can be assigned as

    f(x) = \underset{j}{\operatorname{argmin}} \; D(y(x), \bar{y}_j),

where \bar{y}_j is the projected mean for class j and D is a distance function.
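A minimal sketch of this decision rule, with Euclidean distance as an assumed default for D (the source leaves D open):

```python
import numpy as np

def classify(y_x, class_means, distance=None):
    """Assign the class whose projected mean is nearest to y(x).

    y_x         : projection of the new input (scalar or vector)
    class_means : sequence of projected class means (one per class)
    distance    : distance function D; Euclidean by default
    """
    if distance is None:
        distance = lambda a, b: float(
            np.linalg.norm(np.atleast_1d(a) - np.atleast_1d(b)))
    dists = [distance(y_x, m) for m in class_means]
    return int(np.argmin(dists))   # index of the nearest class mean
```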

Applications

Kernel discriminant analysis has been used in a variety of applications. These include:

  • Face recognition and detection
  • Hand-written digit recognition
  • Palmprint recognition
  • Classification of malignant and benign cluster microcalcifications
  • Seed classification
  • Search for the Higgs boson at CERN

