I have a text classification dataset that I store as parquet via dask to save disk space, but when I try to split it into train and test sets with dask_ml.model_selection.train_test_split:
import dask.dataframe as dd
from dask_ml.model_selection import train_test_split

ddf = dd.read_parquet('/storage/data/cleaned')
y = ddf['category'].values
X = ddf.drop('category', axis=1).values  # .values yields dask arrays with unknown (nan) chunk sizes
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
this fails with:
TypeError: Cannot operate on Dask array with unknown chunk sizes.
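I assume the problem is that .values produces dask arrays whose row chunk sizes are unknown until computed. Below is a minimal sketch of the two workarounds I'm considering, using dask's compute_chunk_sizes() and to_dask_array(lengths=True); please correct me if neither is the intended approach:

import dask.dataframe as dd
from dask_ml.model_selection import train_test_split

ddf = dd.read_parquet('/storage/data/cleaned')

# Option 1: compute the missing chunk sizes in place (scans the data once)
y = ddf['category'].values
X = ddf.drop('category', axis=1).values
X.compute_chunk_sizes()
y.compute_chunk_sizes()

# Option 2: compute partition lengths while converting to arrays
# X = ddf.drop('category', axis=1).to_dask_array(lengths=True)
# y = ddf['category'].to_dask_array(lengths=True)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)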
Thanks for your help.