I am using scikit-learn's SpectralClustering. I can cluster an 8100 x 8100 matrix, but the function raises an error for a 10000 x 10000 matrix.
Has anyone used this function on large matrices?
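For context, the clustering is set up roughly as follows (a minimal sketch: the number of clusters and the way the affinity matrix is built here are assumptions; only the fit_predict call on a precomputed affinity matrix is taken from the traceback below):

# Minimal sketch -- n_clusters and the affinity construction are placeholders;
# only fit_predict on a precomputed affinity matrix comes from the traceback.
import numpy as np
from sklearn.cluster import SpectralClustering

n = 10000
# Dense symmetric affinity matrix: n * n * 8 bytes is roughly 800 MB at n = 10000.
Affinity = np.random.rand(n, n)
Affinity = (Affinity + Affinity.T) / 2

spectral = SpectralClustering(n_clusters=8, affinity='precomputed')
labels = spectral.fit_predict(Affinity)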
Edit: I get the following error message:
Not enough memory to perform factorization.
Traceback (most recent call last):
  File "combined_code_img.py", line 287, in <module>
    labels=spectral.fit_predict(Affinity)
  File "/root/anaconda/lib/python2.7/site-packages/sklearn/base.py", line 410, in fit_predict
    self.fit(X)
  File "/root/anaconda/lib/python2.7/site-packages/sklearn/cluster/spectral.py", line 463, in fit
    assign_labels=self.assign_labels)
  File "/root/anaconda/lib/python2.7/site-packages/sklearn/cluster/spectral.py", line 258, in spectral_clustering
    eigen_tol=eigen_tol, drop_first=False)
  File "/root/anaconda/lib/python2.7/site-packages/sklearn/manifold/spectral_embedding_.py", line 265, in spectral_embedding
    tol=eigen_tol, v0=v0)
  File "/root/anaconda/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 1560, in eigsh
    symmetric=True, tol=tol)
  File "/root/anaconda/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 1046, in get_OPinv_matvec
    return SpLuInv(A.tocsc()).matvec
  File "/root/anaconda/lib/python2.7/site-packages/scipy/sparse/linalg/eigen/arpack/arpack.py", line 907, in __init__
    self.M_lu = splu(M)
  File "/root/anaconda/lib/python2.7/site-packages/scipy/sparse/linalg/dsolve/linsolve.py", line 261, in splu
    ilu=False, options=_options)
MemoryError
My machine has 16 GB of RAM.