
I have recently started using the Karmasphere Eclipse plugin for MapReduce development. Following the instructions in the documentation, I can develop, run, and deploy jobs locally on my host machine. I then downloaded Cloudera CDH3 and ran it as a VM (via VMware). I can run MapReduce jobs locally inside the VM (the guest machine), and from the Eclipse Hadoop perspective on the host I can monitor the MapReduce jobs running in the VM. However, when I try to deploy remotely with Karmasphere, I can only see the files available in HDFS: I cannot access those files, run MapReduce programs against them, or create new files in HDFS from my Eclipse IDE. I get the following exception:

java.io.IOException: Blocklist for /user/cloudera/wordcount/input/wordcount/file01 has changed!
    at com.karmasphere.studio.hadoop.client.hdfs.vfsio.DFSInputStream.openInfo(DFSInputStream.java:81)
    at com.karmasphere.studio.hadoop.client.hdfs.vfsio.DFSInputStream.chooseDataNode(DFSInputStream.java:357)
    at com.karmasphere.studio.hadoop.client.hdfs.vfsio.DFSInputStream.blockSeekTo(DFSInputStream.java:206)
    at com.karmasphere.studio.hadoop.client.hdfs.vfsio.DFSInputStream.read(DFSInputStream.java:311)
    at java.io.BufferedInputStream.fill(Unknown Source)
    at java.io.BufferedInputStream.read1(Unknown Source)
    at java.io.BufferedInputStream.read(Unknown Source)
    at org.apache.commons.vfs.util.MonitorInputStream.read(MonitorInputStream.java:74)
    at java.io.FilterInputStream.read(Unknown Source)
    at com.karmasphere.studio.hadoop.mapreduce.model.hadoop.HadoopBootstrapModel.createCacheFile(HadoopBootstrapModel.java:198)
    at com.karmasphere.studio.hadoop.mapreduce.model.hadoop.HadoopBootstrapModel.update(HadoopBootstrapModel.java:169)
    at com.karmasphere.studio.hadoop.mapreduce.model.core.AbstractOperatorModel.run(AbstractOperatorModel.java:369)
    at org.openide.util.RequestProcessor$Task.run(RequestProcessor.java:577)
    at org.openide.util.RequestProcessor$Processor.run(RequestProcessor.java:1030)

Can someone help me resolve this issue? I am new to both Karmasphere and Hadoop.
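For reference, a client connecting to the VM's HDFS from the host needs the NameNode address in its configuration. A minimal `core-site.xml` fragment might look like the following (the hostname and port are placeholders, not values from my actual setup):

```xml
<configuration>
  <!-- Address of the NameNode running inside the CDH3 VM.
       Replace vm-hostname with the guest's IP or hostname
       as reachable from the host machine. -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://vm-hostname:8020</value>
  </property>
</configuration>
```

Note that even with this set correctly, the DataNodes must also be reachable from the host, since HDFS reads go directly to the DataNodes; with NAT-style VM networking, the NameNode can be visible while the DataNode addresses it reports back to the client are not.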
