
python-snappy appears to be installed, yet Dask returns a ValueError.

Helm config used for the jupyter and worker pods:

env:
  - name: EXTRA_CONDA_PACKAGES
    value: numba xarray s3fs python-snappy pyarrow ruamel.yaml -c conda-forge
  - name: EXTRA_PIP_PACKAGES
    value: dask-ml --upgrade

The container shows python-snappy as installed (via conda list).

The dataframe is loaded from a multi-part parquet file generated by Apache Drill:

files = ['s3://{}'.format(f) for f in fs.glob(path='{}/*.parquet'.format(filename))]
df = dd.read_parquet(files)

Running len(df) on the dataframe returns:

distributed.utils - ERROR - Data is compressed as snappy but we don't have this installed
Traceback (most recent call last):
  File "/opt/conda/lib/python3.6/site-packages/distributed/utils.py", line 622, in log_errors
    yield
  File "/opt/conda/lib/python3.6/site-packages/distributed/client.py", line 921, in _handle_report
    six.reraise(*clean_exception(**msg))
  File "/opt/conda/lib/python3.6/site-packages/six.py", line 692, in reraise
    raise value.with_traceback(tb)
  File "/opt/conda/lib/python3.6/site-packages/distributed/comm/tcp.py", line 203, in read
    msg = yield from_frames(frames, deserialize=self.deserialize)
  File "/opt/conda/lib/python3.6/site-packages/tornado/gen.py", line 1099, in run
    return
  File "/opt/conda/lib/python3.6/site-packages/tornado/gen.py", line 315, in wrapper
    future.set_result(_value_from_stopiteration(e))
  File "/opt/conda/lib/python3.6/site-packages/distributed/comm/utils.py", line 75, in from_frames
    res = _from_frames()
  File "/opt/conda/lib/python3.6/site-packages/distributed/comm/utils.py", line 61, in _from_frames
    return protocol.loads(frames, deserialize=deserialize)
  File "/opt/conda/lib/python3.6/site-packages/distributed/protocol/core.py", line 96, in loads
    msg = loads_msgpack(small_header, small_payload)
  File "/opt/conda/lib/python3.6/site-packages/distributed/protocol/core.py", line 171, in loads_msgpack
    " installed" % str(header['compression']))
ValueError: Data is compressed as snappy but we don't have this installed

Can anyone suggest the correct configuration or remediation steps here?


1 Answer


This error does not actually come from reading your parquet files, but from how Dask compresses data when moving it between machines. You can resolve it by making the python-snappy installation consistent across all of your client/scheduler/worker pods: either install it everywhere or nowhere.

You should do either of the following:

  1. Remove python-snappy from the conda package list of your jupyter and worker pods. If you are using pyarrow this package is unnecessary anyway; I believe Arrow includes snappy support at the C++ level.
  2. Add python-snappy to your scheduler pod as well.
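A quick way to see which side of the cluster is missing the library is to probe the import locally and on every worker. This is a sketch: the `has_snappy` helper is hypothetical, and the commented `client.run` line assumes you already have a connected `dask.distributed.Client` named `client`.

```python
def has_snappy():
    """Return True if the bindings shipped by python-snappy import cleanly."""
    try:
        import snappy  # noqa: F401  (the module installed by python-snappy)
        return True
    except ImportError:
        return False

# Check the client environment:
print(has_snappy())
# Check every worker (assumes an existing dask.distributed Client `client`):
# client.run(has_snappy)
```

If the results disagree between the client, scheduler, and workers, that mismatch is exactly what triggers the `ValueError` above.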

FWIW I personally recommend lz4 over snappy.
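If you go that route, install the `lz4` Python package on every pod and, optionally, pin the choice in the Dask configuration. A sketch, assuming the `distributed.comm.compression` key from current Dask docs (older releases used a flat `compression:` entry in `~/.dask/config.yaml`):

```yaml
# ~/.config/dask/distributed.yaml
distributed:
  comm:
    compression: lz4
```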

Answered 2018-05-15T01:30:45.127