
I am trying to write a Python script that copies a file to HDFS. I am on Ubuntu, with Hadoop and Pydoop installed. The following is my script:

import pydoop.hdfs as hdfs

class COPYTOHDFS():

    local_path = '/home/user/test.txt'
    hdfs_path = '/testfile'
    host = 'master'
    port = 9000
    hdfsobj = hdfs.hdfs(host, port, user='cloudera-user', groups=['supergroup'])
    hdfsobj.copy(local_path, hdfsobj, hdfs_path)

Here is the error:

Traceback (most recent call last):
  File "COPYTOHDFS.py", line 3, in <module>
    class COPYTOHDFS():
  File "COPYTOHDFS.py", line 10, in COPYTOHDFS
    hdfsobj.copy(local_path, hdfsobj, hdfs_path)
  File "/usr/local/lib/python2.7/dist-packages/pydoop-0.5.2_rc2-py2.7-linux-x86_64.egg/pydoop/hdfs.py", line 458, in copy
    return super(hdfs, self).copy(from_path, to_hdfs, to_path)
IOError: Cannot copy /home/user/test.txt to filesystem on master

The error message doesn't give any detail. Any ideas?


1 Answer


In your conf/core-site.xml you configure the tmp directory that fs operations use. If you forgot to set the ownership and permissions of those directories for the user your script runs as, that will produce exactly this kind of IO exception. Check that.
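As a sketch, the relevant entry looks like this (the property value `/app/hadoop/tmp` is just an example path, not from your config; use whatever your own core-site.xml sets):

```xml
<!-- Fragment of conf/core-site.xml: the base for Hadoop's temporary
     directories. The path shown here is a placeholder. -->
<property>
  <name>hadoop.tmp.dir</name>
  <value>/app/hadoop/tmp</value>
</property>
```

Once you know the directory, make sure it is owned by the user your script connects as, e.g. something like `sudo chown -R cloudera-user:supergroup /app/hadoop/tmp` (again substituting your actual path and user).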

Answered 2012-04-30T10:54:59.463