I have the following sparse filtering MATLAB code that I want to port to F#. I'm aware of the F# Type Provider for MATLAB, but I can't use it here because it would create a dependency on MATLAB (I can use it for testing, though).
function [optW] = SparseFiltering(N, X);
% N = # features to learn, X = input data (examples in column)
% You should pre-process X by removing the DC component per example,
% before calling this function.
% e.g., X = bsxfun(@minus, X, mean(X));
addpath minFunc/ % Add path to minFunc optimization package
optW = randn(N, size(X, 1));
optW = minFunc(@SparseFilteringObj, optW(:), struct('MaxIter', 100), X, N);
optW = reshape(optW, [N, size(X, 1)]);
end
function [Obj, DeltaW] = SparseFilteringObj(W, X, N)
% Reshape W into matrix form
W = reshape(W, [N, size(X,1)]);
% Feed Forward
F = W*X; % Linear Activation
Fs = sqrt(F.^2 + 1e-8); % Soft-Absolute Activation
[NFs, L2Fs] = l2row(Fs); % Normalize by Rows
[Fhat, L2Fn] = l2row(NFs'); % Normalize by Columns
% Compute Objective Function
Obj = sum(sum(Fhat, 2), 1);
% Backprop through each feedforward step
DeltaW = l2grad(NFs', Fhat, L2Fn, ones(size(Fhat)));
DeltaW = l2grad(Fs, NFs, L2Fs, DeltaW');
DeltaW = (DeltaW .* (F ./ Fs)) * X';
DeltaW = DeltaW(:);
end
function [Y,N] = l2row(X) % L2 Normalize X by rows
% We also use this to normalize by column with l2row(X')
N = sqrt(sum(X.^2,2) + 1e-8);
Y = bsxfun(@rdivide,X,N);
end
function [G] = l2grad(X,Y,N,D) % Backpropagate through Normalization
G = bsxfun(@rdivide, D, N) - bsxfun(@times, Y, sum(D.*X, 2) ./ (N.^2));
end
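To sanity-check an eventual F# port against known-good numbers, here is a NumPy transcription of the objective and gradient above. This is only a reference sketch for cross-checking (function and variable names are my own); the broadcasting with `keepdims=True` plays the role of MATLAB's `bsxfun`.

```python
import numpy as np

def l2row(X, eps=1e-8):
    # L2-normalize X along rows; call with X.T to normalize columns
    N = np.sqrt(np.sum(X**2, axis=1, keepdims=True) + eps)
    return X / N, N

def l2grad(X, Y, N, D):
    # Backpropagate D through the row normalization Y = X / N
    return D / N - Y * (np.sum(D * X, axis=1, keepdims=True) / N**2)

def sparse_filtering_obj(w, X, n_features):
    # Mirrors SparseFilteringObj: returns (objective, flattened gradient)
    W = w.reshape(n_features, X.shape[0])
    F = W @ X                          # linear activation
    Fs = np.sqrt(F**2 + 1e-8)          # soft-absolute activation
    NFs, L2Fs = l2row(Fs)              # normalize by rows
    Fhat, L2Fn = l2row(NFs.T)          # normalize by columns
    obj = Fhat.sum()
    # Backprop through each feedforward step
    dW = l2grad(NFs.T, Fhat, L2Fn, np.ones_like(Fhat))
    dW = l2grad(Fs, NFs, L2Fs, dW.T)
    dW = (dW * (F / Fs)) @ X.T
    return obj, dW.ravel()
```

A finite-difference gradient check on small random inputs is a cheap way to confirm both this sketch and the F# port compute the same derivative.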
I understand most of the MATLAB code, but I'm not sure what the .NET equivalent of MATLAB's minFunc is. I believe I want one of the Microsoft.SolverFoundation.Solvers. According to the minFunc documentation:
... the default parameters of minFunc call a quasi-Newton strategy, where limited-memory BFGS updates with Shanno-Phua scaling are used in computing the step direction, and a bracketing line-search for a point satisfying the strong Wolfe conditions is used to compute the step length. In the line search, (safeguarded) cubic interpolation is used to generate trial values, and the method switches to an Armijo back-tracking line search on iterations where the objective function enters a region where the parameters do not produce a real-valued output ...
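Whatever .NET solver I pick has to satisfy the same contract minFunc does: an objective callback returning both the value and the gradient, driven by an L-BFGS optimizer. To make that contract concrete, here is a sketch in Python/SciPy (not .NET; the Rosenbrock objective is just a placeholder standing in for SparseFilteringObj):

```python
import numpy as np
from scipy.optimize import minimize

def rosen_with_grad(w):
    # Placeholder objective: returns (value, gradient), the same
    # pair-valued contract that minFunc expects from its callback.
    x, y = w
    f = (1 - x)**2 + 100 * (y - x**2)**2
    g = np.array([-2 * (1 - x) - 400 * x * (y - x**2),
                  200 * (y - x**2)])
    return f, g

# jac=True tells SciPy the callback returns the gradient alongside
# the value, analogous to minFunc's [Obj, DeltaW] output pair.
res = minimize(rosen_with_grad, x0=np.zeros(2), jac=True,
               method="L-BFGS-B", options={"maxiter": 100})
```

The point is the shape of the interface (flattened parameter vector in, value-plus-gradient out, limited-memory quasi-Newton driver), which is what I need to reproduce on the .NET side.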
Given the above, can anyone confirm that Microsoft.SolverFoundation.Solvers.CompactQuasiNewtonModel is the right way to go?
Also, are there any other obvious "gotchas" I should watch for when porting this code to F#? (I'm new to this kind of port.)