I asked this question on Math StackExchange, but it didn't seem to get enough attention, so I'm asking it here: https://math.stackexchange.com/questions/1729946/why-do-we-say-svd-can-handle-singular-matrx-when-doing-least-square-comparison?noredirect=1#comment3530971_1729946
I learned from some tutorials that, when solving least-squares problems, SVD should be more stable than QR decomposition and should be able to handle singular matrices. But the MATLAB example I wrote below seems to support the opposite conclusion. I don't have a deep understanding of SVD, so if you could look at my question in that old Math StackExchange post and explain it to me, I would be grateful.
I use a matrix with a large condition number (about 1e+13). The results show that the SVD method gives a much larger residual error (0.8) than QR (about 1e-27).
% we do a linear regression between Y and X
data= [
47.667483331 -122.1070832;
47.667483331001 -122.1070832
];
X = data(:,1);
Y = data(:,2);
X_1 = [ones(length(X),1),X];
%%
%SVD method
[U,D,V] = svd(X_1,'econ');
beta_svd = V*diag(1./diag(D))*U'*Y;
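One detail worth noting: the line above inverts *every* singular value, including the tiny ones, so with cond(X_1) around 1e+13 the term 1/sigma_min strongly amplifies rounding error. The usual SVD-based remedy (this is what MATLAB's `pinv` does; the tolerance formula below follows its documented default, and `beta_svd_trunc` is my own variable name) is to zero out singular values below a tolerance before inverting. A sketch:

```matlab
% Truncated-SVD solve (sketch, pinv-style tolerance).
% D(1,1) is the largest singular value, i.e. norm(X_1).
tol = max(size(X_1)) * eps(D(1,1));
d = diag(D);
d_inv = zeros(size(d));
d_inv(d > tol) = 1 ./ d(d > tol);   % invert only singular values above tol
beta_svd_trunc = V * diag(d_inv) * U' * Y;
```

Whether truncation actually triggers depends on how sigma_min compares with `tol` for a given matrix; for this particular 2x2 example sigma_min may still sit above the default tolerance, in which case the result matches the untruncated solve.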
%% QR method (one could also use the "\" operator here, which gives the same result, as I tested; I wrote out the backward substitution to educate myself)
[Q,R] = qr(X_1);
% now do backward substitution to solve R*beta_qr = Q'*Y
[nr, nc] = size(R);
beta_qr = zeros(nc,1);
Y_1 = Q'*Y;
for i = nc:-1:1
    s = Y_1(i);
    for j = nc:-1:i+1
        s = s - R(i,j)*beta_qr(j);
    end
    beta_qr(i) = s/R(i,i);
end
svd_error = 0;
qr_error = 0;
for i=1:length(X)
svd_error = svd_error + (Y(i) - beta_svd(1) - beta_svd(2) * X(i))^2;
qr_error = qr_error + (Y(i) - beta_qr(1) - beta_qr(2) * X(i))^2;
end
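The residual sums accumulated in the loop above can also be computed in one line each via the residual norm, which is equivalent for this two-column model (assuming `beta_svd` and `beta_qr` are column vectors of length 2; the `(:)` reshape guards against a row vector):

```matlab
% Equivalent vectorized residuals: ||Y - X_1*beta||^2
svd_error = norm(Y - X_1*beta_svd(:))^2;
qr_error  = norm(Y - X_1*beta_qr(:))^2;
```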