
I am trying to run a Hadoop MapReduce job from Pentaho. In the job I use the Hadoop Copy Files step to specify the input path of the file. Everything works fine when the input file is in a location with root access (i.e., a file that was already created under a root-owned folder). But when I give a file in my local home directory as the source, the Pentaho log shows the following error:

2016/01/12 11:44:57 - Spoon - Starting job...
2016/01/12 11:44:57 - samplemapjob1 - Start of job execution
2016/01/12 11:44:57 - samplemapjob1 - Starting entry [Hadoop Copy Files]
2016/01/12 11:44:57 - Hadoop Copy Files - Starting ...
2016/01/12 11:44:57 - Hadoop Copy Files - Processing row source File/folder source : [file:///home/vasanth/Desktop/my.txt] ... destination file/folder : [hdfs://WEB2131:9000/new1/]... wildcard : [null]
2016/01/12 11:45:03 - Hadoop Copy Files - ERROR (version 6.0.0.0-353, build 1 from 2015-10-07 13.27.43 by buildguy) : File System Exception: Could not find files in "file:///home/vasanth/Desktop".
2016/01/12 11:45:03 - Hadoop Copy Files - ERROR (version 6.0.0.0-353, build 1 from 2015-10-07 13.27.43 by buildguy) : Caused by: Invalid descendent file name "hdfs:".

I have tried running

sudo chmod 777 /home/vasanth/Desktop/my.txt

but the error still persists. How can I fix this?
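
For reference, here is a minimal sketch of the same copy done directly with the Hadoop Java client (assuming the NameNode address hdfs://WEB2131:9000 and the source/destination paths from the log above). If something like this succeeds outside Pentaho, the local file and the cluster are both reachable, and the problem would seem to be in how the Hadoop Copy Files step resolves the destination URL:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsCopyCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // NameNode address taken from the Pentaho log above
        FileSystem fs = FileSystem.get(URI.create("hdfs://WEB2131:9000"), conf);

        // Local source and HDFS destination, same as in the failing step
        Path src = new Path("file:///home/vasanth/Desktop/my.txt");
        Path dst = new Path("/new1/my.txt");

        // copyFromLocalFile(delSrc, overwrite, src, dst):
        // keep the local file, overwrite any existing destination
        fs.copyFromLocalFile(false, true, src, dst);

        System.out.println("Copied " + src + " to " + fs.getUri() + dst);
        fs.close();
    }
}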
