Kernel Principal Component Analysis (KPCA)
Many machine learning problems are nonlinear, and a nonlinear feature mapping can produce new features that make the prediction problem linear. The idea is to transform the dataset into a new, higher-dimensional feature space and apply PCA in that space to produce uncorrelated features. This method is called Kernel Principal Component Analysis, or KPCA.
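As a small illustration of the first point, the sketch below (my own example, not from the text) shows two concentric rings that no line can separate in the original 2-D space; a hypothetical quadratic feature map makes the classes separable by a simple threshold, i.e. a linear rule in the mapped space:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two concentric rings: not linearly separable in the original 2-D space.
n = 100
radii = np.concatenate([rng.normal(1.0, 0.1, n),   # inner ring, class 0
                        rng.normal(3.0, 0.1, n)])  # outer ring, class 1
theta = rng.uniform(0, 2 * np.pi, 2 * n)
X = np.column_stack([radii * np.cos(theta), radii * np.sin(theta)])
y = np.array([0] * n + [1] * n)

# Hypothetical nonlinear map phi(x) = (x1^2, x2^2, sqrt(2)*x1*x2): in this
# space the two rings differ along x1^2 + x2^2, so a plane separates them.
Phi = np.column_stack([X[:, 0] ** 2, X[:, 1] ** 2,
                       np.sqrt(2) * X[:, 0] * X[:, 1]])

# A threshold on the sum of the first two mapped features (= squared radius)
# is a linear decision rule in the mapped space and separates the classes.
pred = (Phi[:, 0] + Phi[:, 1] > 2.0 ** 2).astype(int)
print((pred == y).mean())  # near-perfect accuracy
```

The feature map here corresponds to the degree-2 polynomial kernel; KPCA avoids computing `Phi` explicitly by working with the kernel matrix instead.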
Kernel PCA computes the principal eigenvectors of the kernel matrix rather than those of the covariance matrix. Reformulating PCA in kernel space is straightforward, since the kernel matrix contains the inner products of the datapoints in the high-dimensional feature space induced by the kernel function. Applying PCA in this kernel space gives Kernel PCA the ability to construct nonlinear mappings.
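A minimal sketch of this procedure, assuming an RBF kernel (any positive-definite kernel would do): build the kernel matrix, double-center it so the mapped data have zero mean in feature space, and take its leading eigenvectors in place of covariance eigenvectors.

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Sketch of Kernel PCA: eigendecompose the centered kernel matrix
    instead of the covariance matrix. RBF kernel is an assumed choice."""
    # RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    K = np.exp(-gamma * d2)

    # Double-center K so the implicit feature-space data have zero mean.
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one

    # Principal eigenvectors of the centered kernel matrix.
    eigvals, eigvecs = np.linalg.eigh(Kc)          # ascending order
    idx = np.argsort(eigvals)[::-1][:n_components]
    alphas, lambdas = eigvecs[:, idx], eigvals[idx]

    # For training points, the projection onto component k reduces to
    # sqrt(lambda_k) * alpha_k (after unit-norm scaling in feature space).
    return alphas * np.sqrt(np.maximum(lambdas, 1e-12))

X = np.random.default_rng(0).normal(size=(50, 3))
Z = kernel_pca(X, n_components=2)
print(Z.shape)  # (50, 2)
```

Because the eigenvectors of the centered kernel matrix are mutually orthogonal and orthogonal to the constant vector, the resulting components are zero-mean and uncorrelated, exactly as ordinary PCA produces in the input space.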