Problem: Eclipse Hadoop plugin: call to localhost/127.0.0.1:50070 failed on local exception: java.io.EOFException. Can anybody give me a solution?
I am following the Cloudera training tutorial, which uses Eclipse 3.6 (Helios) and Hadoop 0.20.2-cdh3u2.
I downloaded hadoop-eclipse-plugin-0.20.3-SNAPSHOT.jar and copied it into the /home/training/eclipse/plugins/ folder.
I started Eclipse, went to File (in the menu bar) --> New --> Other, and selected MapReduce Project. I chose "Specify Hadoop library location" and set the location to "/usr/lib/hadoop". That directory contains the following files:
bin hadoop-examples-0.20.2-cdh3u2.jar
build.xml hadoop-examples.jar
CHANGES.txt hadoop-test-0.20.2-cdh3u2.jar
conf hadoop-test.jar
contrib hadoop-tools-0.20.2-cdh3u2.jar
example-confs hadoop-tools.jar
hadoop-0.20.2-cdh3u2-ant.jar ivy
hadoop-0.20.2-cdh3u2-core.jar ivy.xml
hadoop-0.20.2-cdh3u2-examples.jar lib
hadoop-0.20.2-cdh3u2-test.jar LICENSE.txt
hadoop-0.20.2-cdh3u2-tools.jar logs
hadoop-ant-0.20.2-cdh3u2.jar NOTICE.txt
hadoop-ant.jar pids
hadoop-core-0.20.2-cdh3u2.jar README.txt
hadoop-core.jar webapps
I named the MapReduce project "myhadoop" and clicked the Finish button. A Mapreduce entry appears under DFS Locations, but I am not getting its hierarchy.
So I went to check my DFS and MapReduce ports.
My core-site.xml is:
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:8020</value>
  </property>
</configuration>
My mapred-site.xml:
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:8021</value>
  </property>
</configuration>
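For reference, the NameNode RPC endpoint a client connects to is taken from the authority part of the fs.default.name URI above (here localhost:8020). A minimal sketch of that parsing, using only the JDK and no Hadoop dependency (the class and method names are my own, for illustration):

```java
import java.net.URI;

// Sketch: derive the NameNode RPC host/port from the fs.default.name
// value in core-site.xml. Class name is hypothetical; pure JDK.
public class FsDefaultName {
    public static String host(String fsDefaultName) {
        return URI.create(fsDefaultName).getHost();
    }

    public static int port(String fsDefaultName) {
        return URI.create(fsDefaultName).getPort();
    }

    public static void main(String[] args) {
        String fs = "hdfs://localhost:8020"; // value from core-site.xml above
        System.out.println(host(fs) + ":" + port(fs)); // prints "localhost:8020"
    }
}
```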
In Eclipse's "Define Hadoop Location" dialog (Map/Reduce perspective), I entered the following:
Map/Reduce Master
Host: localhost
Port: 50021
DFS Master
Host: localhost
Port: 50020
I also ticked the "Use M/R Master host" checkbox.
I ran Cloudera's sample WordCount program, but it gives me the following exception. Please give me a solution; I have been trying for two days.
Exception in thread "main" java.io.IOException: Call to localhost/127.0.0.1:50070 failed on local exception: java.io.EOFException
at org.apache.hadoop.ipc.Client.wrapException(Client.java:1142)
at org.apache.hadoop.ipc.Client.call(Client.java:1110)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
at $Proxy0.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:398)
at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:384)
at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:111)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:213)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:180)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1514)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1548)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1530)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:228)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:111)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:212)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:183)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.addInputPath(FileInputFormat.java:368)
at WordCount.main(WordCount.java:65)
Caused by: java.io.EOFException
at java.io.DataInputStream.readInt(DataInputStream.java:375)
at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:815)
at org.apache.hadoop.ipc.Client$Connection.run(Client.java:724)
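For context on the root cause frame at the bottom of the trace: DataInputStream.readInt() throws java.io.EOFException whenever the stream ends before a full 4-byte int can be read. A self-contained sketch reproducing just that behavior, using only the JDK (class and method names are my own, for illustration; this is not the Hadoop RPC client itself):

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;

// Sketch: readInt() on a stream with fewer than 4 bytes throws
// EOFException, matching the bottom frame of the stack trace above.
public class ReadIntEof {
    public static boolean readIntHitsEof(byte[] partial) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(partial));
        try {
            in.readInt();
            return false; // a full 4-byte int was available
        } catch (EOFException e) {
            return true; // stream ended before 4 bytes arrived
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(readIntHitsEof(new byte[] {1, 2}));       // true
        System.out.println(readIntHitsEof(new byte[] {0, 0, 0, 7})); // false
    }
}
```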