
We are working on a project and are trying to get some results with KPCA.

We have a dataset of handwritten digits and took the first 200 samples of each digit, so our complete training data matrix is 2000x784 (784 being the dimensionality). When we run KPCA we get back a single matrix containing the new low-dimensional dataset, e.g. 2000x100. But we don't understand the result. Shouldn't we get other matrices as well, like we do when we use SVD for PCA? The code we use for KPCA is the following:

function data_out = kernelpca(data_in,num_dim)

%% Check that the requested output dimension is smaller than the input dimension
if num_dim > size(data_in,1)
    fprintf('\nThe output dimension has to be smaller than the dimension of the input data\n');
    fprintf('Closing program\n');
    data_out = [];  % assign the output so the early return does not error
    return
end

%% Using the Gaussian kernel to construct the kernel matrix K
% K(x,y) = exp(-||x - y||^2 / sigma^2)
% K is a symmetric kernel; each column of data_in is one data point
K = zeros(size(data_in,2),size(data_in,2));
for row = 1:size(data_in,2)
    for col = 1:row
        temp = sum(((data_in(:,row) - data_in(:,col)).^2));
        K(row,col) = exp(-temp); % sigma = 1
    end
end
K = K + K'; 
% Dividing the diagonal element by 2 since it has been added to itself
for row = 1:size(data_in,2)
    K(row,row) = K(row,row)/2;
end
% We know that for PCA the data has to be centered. Even if the input data
% set 'X' lets say in centered, there is no gurantee the data when mapped
% in the feature space [phi(x)] is also centered. Since we actually never
% work in the feature space we cannot center the data. To include this
% correction a pseudo centering is done using the Kernel.
% one_mat is the N x N matrix with every entry equal to 1/N
one_mat = ones(size(K))/size(K,1);
K_center = K - one_mat*K - K*one_mat + one_mat*K*one_mat;
clear K

%% Obtaining the low dimensional projection
% The eigenvectors alpha of the centered kernel matrix satisfy
% N*lambda*alpha = K*alpha
% so the eigenvalues of K have to be normalized by the number of points
opts.issym=1;                          
opts.disp = 0; 
opts.isreal = 1;
neigs = max(30, num_dim);  % compute at least num_dim eigenvectors
[eigvec, eigval] = eigs(K_center, neigs, 'lm', opts);
eig_val = eigval./size(data_in,2);  % lambda = (eigenvalue of K)/N
% Normalize each alpha so that 1 = lambda*(alpha . alpha)
% Here '.' indicates the dot product
for col = 1:size(eigvec,2)
    eigvec(:,col) = eigvec(:,col)./(sqrt(eig_val(col,col)));
end
[~, index] = sort(diag(eig_val),'descend');
eigvec = eigvec(:,index);

%% Projecting the data in lower dimensions
data_out = zeros(num_dim,size(data_in,2));
for count = 1:num_dim
    data_out(count,:) = eigvec(:,count)'*K_center';
end
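
For reference, this is how we call the function (random data standing in for the digits here; note that, as written, it expects samples as columns, so we pass our 2000x784 matrix transposed):

% Hypothetical call -- rand() stands in for our digit data.
X = rand(2000, 784);            % 2000 samples, 784 dimensions (rows = samples)
data_out = kernelpca(X', 100);  % samples as columns; data_out is 100x2000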

We have read a lot of papers but still cannot grasp the logic of KPCA!

Any help would be appreciated!


1 Answer


PCA Algorithm


  1. Take the data samples: X = {x_1, ..., x_N}

  2. Compute the mean: mu = (1/N) * sum_k x_k

  3. Compute the covariance: C = (1/N) * sum_k (x_k - mu)*(x_k - mu)'

  4. Solve the eigenvalue problem: C*v = lambda*v

where C is the covariance matrix, v are the eigenvectors of the covariance matrix, and lambda the eigenvalues of the covariance matrix.

Using the first n eigenvectors, you reduce the dimensionality of your data to n dimensions. You can use this code for PCA; it comes with an integrated example and is simple to use.
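
As a minimal sketch of those four steps in MATLAB (my own illustration with random stand-in data, not the linked code):

% Linear PCA sketch: X is dim x N, samples stored as columns.
X = rand(784, 2000);                    % stand-in data
mu = mean(X, 2);                        % step 2: the mean
Xc = X - repmat(mu, 1, size(X,2));      % centered data
C = (Xc*Xc')/size(X,2);                 % step 3: covariance matrix
[V, D] = eig(C);                        % step 4: eigenvectors and eigenvalues
[~, idx] = sort(diag(D), 'descend');    % order by decreasing eigenvalue
n = 100;
data_low = V(:, idx(1:n))' * Xc;        % n x N low-dimensional data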


KPCA Algorithm


We pick a kernel function, which in your code is specified by:

K(x,y) = exp(-||x - y||^2 / sigma^2)

The kernel implicitly maps your data into a high-dimensional feature space in which it is often better represented for further tasks such as classification or clustering, tasks that may be harder to solve in the initial feature space. This is also known as the "kernel trick". See the figure.

[Figure: illustration of the kernel trick, mapping data from the input space into a feature space]

[Step 1] Construct the Gram matrix

K = zeros(size(data_in,2),size(data_in,2));
for row = 1:size(data_in,2)
    for col = 1:row
        temp = sum(((data_in(:,row) - data_in(:,col)).^2));
        K(row,col) = exp(-temp); % sigma = 1
    end
end
K = K + K'; 
% Dividing the diagonal element by 2 since it has been added to itself
for row = 1:size(data_in,2)
    K(row,row) = K(row,row)/2;
end

Here, because the Gram matrix is symmetric, only half of its entries are computed, and the final result is obtained by adding the matrix computed so far to its transpose. Finally, the diagonal elements are divided by 2 because, as mentioned in the comments, each of them has been added to itself.
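
As a side note, the same Gram matrix can be obtained without the double loop. A vectorized sketch (my own, again with sigma = 1):

% ||x - y||^2 = ||x||^2 + ||y||^2 - 2*x'*y, computed for all pairs at once.
sq = sum(data_in.^2, 1);                             % 1 x N squared norms
D2 = bsxfun(@plus, sq', sq) - 2*(data_in'*data_in);  % N x N squared distances
K = exp(-D2);                                        % Gaussian kernel, sigma = 1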

[Step 2] Center the kernel matrix

This is done by this part of the code:

one_mat = ones(size(K))/size(K,1);
K_center = K - one_mat*K - K*one_mat + one_mat*K*one_mat;

As mentioned in the comments, a pseudo-centering procedure has to be performed, since the mapped data in the feature space cannot be centered explicitly. An idea of the proof can be found here.
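
One way to convince yourself of the centering formula: for a linear kernel, centering K must give the same result as building the kernel from explicitly centered data. A small sanity check (my own illustration):

X = rand(5, 20);                             % 5-dimensional data, 20 samples as columns
K = X'*X;                                    % linear kernel: K(i,j) = x_i'*x_j
N = size(K, 1);
one_mat = ones(N)/N;
K_center = K - one_mat*K - K*one_mat + one_mat*K*one_mat;
Xc = X - repmat(mean(X, 2), 1, size(X, 2));  % explicitly centered data
max(max(abs(K_center - Xc'*Xc)))             % prints something like 1e-15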

[Step 3] Solve the eigenvalue problem

This part of the code is responsible for this task:

%% Obtaining the low dimensional projection
% The eigenvectors alpha of the centered kernel matrix satisfy
% N*lambda*alpha = K*alpha
% so the eigenvalues of K have to be normalized by the number of points
opts.issym=1;                          
opts.disp = 0; 
opts.isreal = 1;
neigs = max(30, num_dim);  % compute at least num_dim eigenvectors
[eigvec, eigval] = eigs(K_center, neigs, 'lm', opts);
eig_val = eigval./size(data_in,2);  % lambda = (eigenvalue of K)/N
% Normalize each alpha so that 1 = lambda*(alpha . alpha)
% Here '.' indicates the dot product
for col = 1:size(eigvec,2)
    eigvec(:,col) = eigvec(:,col)./(sqrt(eig_val(col,col)));
end
[~, index] = sort(diag(eig_val),'descend');
eigvec = eigvec(:,index);
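
Continuing from the variables above, you can verify the normalization condition 1 = lambda*(alpha . alpha) for, say, the first component:

% eigvec has been rescaled, so lambda_k*(alpha_k . alpha_k) should be ~1.
k = 1;
lambda_k = eig_val(index(k), index(k));  % eigenvalue matching column k after sorting
lambda_k * (eigvec(:,k)' * eigvec(:,k))  % prints approximately 1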

[Step 4] Change the representation of each data point

This part of the code is responsible for this task:

%% Projecting the data in lower dimensions
data_out = zeros(num_dim,size(data_in,2));
for count = 1:num_dim
    data_out(count,:) = eigvec(:,count)'*K_center';
end
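
The loop is just a matrix product; since K_center is symmetric, the whole projection can equivalently be written in one line:

% Rows of data_out are the kernel principal component scores of all N points.
data_out = eigvec(:, 1:num_dim)' * K_center;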

Have a look here for the details.

PS: I encourage you to use the code written by this author, which includes intuitive examples.

Answered on 2014-01-18T19:10:29.720