Type: | Package |
Title: | Kernel Fisher Discriminant Analysis |
Version: | 1.0.0 |
Date: | 2017-09-27 |
Author: | Donghwan Kim |
Maintainer: | Donghwan Kim <donhkim9714@korea.ac.kr> |
Description: | Kernel Fisher Discriminant Analysis (KFDA) is performed by combining Kernel Principal Component Analysis (KPCA) with Fisher Discriminant Analysis (FDA). There are some similar packages. First, 'lfda' performs Local Fisher Discriminant Analysis (LFDA) along with other functions, but it cannot be applied to unlabeled test data because it requires the label information of the data in its function arguments. Also, the 'ks' package supports only a limited number of dimensions, which makes it difficult to analyze higher-dimensional data properly. This package is a simple and practical implementation of KFDA based on the paper of Yang, J., Jin, Z., Yang, J. Y., Zhang, D., and Frangi, A. F. (2004) <doi:10.1016/j.patcog.2003.10.015>. |
License: | GPL-3 |
Encoding: | UTF-8 |
LazyData: | yes |
Repository: | CRAN |
URL: | https://github.com/ainsuotain/kfda |
Depends: | R (≥ 3.0.0), kernlab, MASS |
NeedsCompilation: | no |
Packaged: | 2017-09-27 00:52:55 UTC; David |
Date/Publication: | 2017-09-27 11:06:54 UTC |
Kernel Fisher Discriminant Analysis (KFDA)
Description
Trains a KFDA model on trainData. By default, KFDA is run with a Gaussian (RBF) kernel. Returns the trained KFDA object.
Usage
kfda(trainData = data, kernel.name = "rbfdot", kpar.sigma = 0.001, threshold = 1e-05)
Arguments
trainData | a data frame or matrix of training data; the last column must contain the class label of each observation (a factor). |
kernel.name | the kernel function used in training and predicting. This parameter is fixed to the Gaussian kernel ("rbfdot"). |
kpar.sigma | the sigma hyper-parameter of the selected (Gaussian) kernel. |
threshold | the eigenvalue threshold below which principal components are ignored (only valid when features = 0); default 1e-05. |
Details
Trains a KFDA model on trainData. By default, KFDA is run with a Gaussian (RBF) kernel and returns the trained KFDA object.
Since this function performs KFDA as a combination of kpca (from kernlab) and lda (from MASS), the values listed below expose the result of each underlying function.
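For illustration, the following is a minimal sketch (not the package's internal code) of how the same combination can be assembled directly from kernlab's kpca and MASS's lda on the iris data; the hyper-parameter values mirror the defaults of kfda.
# Illustrative sketch: KFDA assembled manually from kpca and lda,
# using iris with the class label in the last column.
library(kernlab)
library(MASS)
data(iris)
label    <- iris[, ncol(iris)]
features <- as.matrix(iris[, -ncol(iris)])
# Step 1: kernel PCA with a Gaussian (RBF) kernel
kpc    <- kpca(features, kernel = "rbfdot", kpar = list(sigma = 0.001), th = 1e-05)
scores <- rotated(kpc)            # projections onto the kernel principal components
# Step 2: ordinary LDA in the kernel principal component space
lda.fit <- lda(scores, grouping = label)
head(predict(lda.fit)$x)          # linear discriminants, cf. kfda()$LDs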
Value
An object of class kfda with the following components:
kpca.train | An object of class "kpca" containing the results of KPCA performed on trainData. |
lda.rotation.train | The result of applying LDA after KPCA is performed on trainData. |
LDs | A data frame of the linear discriminants from LDA. |
label | A vector of the class labels of trainData. |
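A quick way to inspect these components on a fitted model (a minimal sketch, assuming the element names listed above):
# Minimal sketch: inspecting the components of a fitted kfda object.
data(iris)
kfda.model <- kfda(trainData = iris, kernel.name = "rbfdot")
kfda.model$kpca.train            # "kpca" object from kernlab
kfda.model$lda.rotation.train    # LDA fit obtained after KPCA
head(kfda.model$LDs)             # data frame of linear discriminants
table(kfda.model$label)          # class labels of trainData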
Note
This package is an early version and will be updated in the future.
Author(s)
Donghwan Kim
ainsuotain@hanmail.net
donhkim9714@korea.ac.kr
dhkim2@bistel.com
References
Yang, J., Jin, Z., Yang, J. Y., Zhang, D., and Frangi, A. F. (2004) <DOI:10.1016/j.patcog.2003.10.015>. Essence of kernel Fisher discriminant: KPCA plus LDA. Pattern Recognition, 37(10): 2097-2100.
See Also
kpca (in package kernlab)
lda (in package MASS)
kfda.predict
Examples
# data input
data(iris)
# data separation
idx <- sample(1:nrow(iris), round(nrow(iris) * 0.7))
trainData <- iris[idx, ]
# training KFDA model
kfda.model <- kfda(trainData = trainData, kernel.name = "rbfdot")
# structure of kfda.model
str(kfda.model)
Predict Method for Kernel Fisher Discriminant Analysis (KFDA) fit
Description
Predicts class labels for testData using a trained KFDA model. This function is used after the training phase has been performed with the kfda function.
Usage
kfda.predict(object = obj, testData = data)
Arguments
object | An object of class kfda, as returned by the kfda function. |
testData | a data frame or matrix of test data to predict; it must not contain the class label column. |
Details
Since this function builds on KPCA and LDA, different results can be obtained by adjusting the hyper-parameters of each underlying function.
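For example, one way to compare hyper-parameter settings is to refit the model with different values of kpar.sigma and check the resulting test accuracy (a minimal sketch using only the arguments documented on this page):
# Minimal sketch: effect of the kernel hyper-parameter kpar.sigma on test accuracy.
data(iris)
set.seed(1)
idx       <- sample(1:nrow(iris), round(nrow(iris) * 0.7))
trainData <- iris[idx, ]
testData  <- iris[-idx, -ncol(iris)]
testLabel <- iris[-idx, ncol(iris)]
for (s in c(0.001, 0.1)) {
  model <- kfda(trainData = trainData, kernel.name = "rbfdot", kpar.sigma = s)
  pred  <- kfda.predict(object = model, testData = testData)
  cat("kpar.sigma =", s, "-> accuracy =", mean(pred$class == testLabel), "\n")
}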
Value
The result of applying the KFDA model to testData, with the following components:
class | The predicted class labels of testData. |
posterior | The posterior probabilities for the classes. |
x | The scores of testData on the discriminant variables. |
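The posterior element can be used to inspect prediction confidence, as sketched below (a minimal sketch, assuming the element names listed above):
# Minimal sketch: inspecting the prediction output of kfda.predict.
data(iris)
set.seed(2)
idx        <- sample(1:nrow(iris), round(nrow(iris) * 0.7))
kfda.model <- kfda(trainData = iris[idx, ], kernel.name = "rbfdot")
pre        <- kfda.predict(object = kfda.model, testData = iris[-idx, -ncol(iris)])
head(pre$class)      # predicted class labels
head(pre$posterior)  # posterior probabilities for each class
head(pre$x)          # discriminant scores of the test observations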
Author(s)
Donghwan Kim
ainsuotain@hanmail.net
donhkim9714@korea.ac.kr
dhkim2@bistel.com
References
Yang, J., Jin, Z., Yang, J. Y., Zhang, D., and Frangi, A. F. (2004) <DOI:10.1016/j.patcog.2003.10.015>. Essence of kernel Fisher discriminant: KPCA plus LDA. Pattern Recognition, 37(10): 2097-2100.
See Also
kfda
Examples
# data input
data(iris)
# data separation
idx <- sample(1:nrow(iris), round(nrow(iris) * 0.7))
trainData <- iris[idx, ]
testData <- iris[-idx, -ncol(iris)]
testData.Label <- iris[-idx, ncol(iris)]
# training KFDA model
kfda.model <- kfda(trainData = trainData, kernel.name = "rbfdot")
# testing new(test)data by KFDA model
pre <- kfda.predict(object = kfda.model, testData = testData)
# plotting
plot(kfda.model$LDs, col = kfda.model$label, pch = 19, main = "Plot for KFDA")
points(pre$x, col = pre$class, cex = 2)
legend("topleft", legend = c("trainData","testData"), pch = c(19,1))
# prediction result
table(pre$class, testData.Label)