I'm trying to write a Python script that copies a file into HDFS. I'm on Ubuntu with Hadoop and Pydoop installed. The following is my script:
import pydoop.hdfs as hdfs

class COPYTOHDFS():

    local_path = '/home/user/test.txt'
    hdfs_path = '/testfile'
    host = 'master'
    port = 9000
    hdfsobj = hdfs.hdfs(host, port, user='cloudera-user', groups=['supergroup'])
    hdfsobj.copy(local_path, hdfsobj, hdfs_path)
And here is the error:
Traceback (most recent call last):
  File "COPYTOHDFS.py", line 3, in <module>
    class COPYTOHDFS():
  File "COPYTOHDFS.py", line 10, in COPYTOHDFS
    hdfsobj.copy(local_path, hdfsobj, hdfs_path)
  File "/usr/local/lib/python2.7/dist-packages/pydoop-0.5.2_rc2-py2.7-linux-x86_64.egg/pydoop/hdfs.py", line 458, in copy
    return super(hdfs, self).copy(from_path, to_hdfs, to_path)
IOError: Cannot copy /home/user/test.txt to filesystem on master
The error message doesn't say anything more specific. Any ideas?
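For reference, this is a minimal sketch of the same copy done with pydoop's module-level helpers instead of an explicit hdfs instance, assuming a pydoop release that provides hdfs.put and hdfs.ls and that the connecting user has write permission on the destination directory (the paths are the ones from my script above):

import pydoop.hdfs as hdfs

local_path = '/home/user/test.txt'
hdfs_path = '/testfile'

# List the HDFS root first to see who owns it; a permission mismatch
# here would explain a failed copy (assumption: hdfs.ls is available).
print(hdfs.ls('/'))

# Copy the local file into HDFS using the default filesystem from
# core-site.xml (assumption: hdfs.put is available in this pydoop version).
hdfs.put(local_path, hdfs_path)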