
I am running a Hadoop 0.20.2 M/R application in Eclipse 6.91.

After executing it I get these errors and warnings:

13/07/24 16:52:52 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
13/07/24 16:52:52 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
13/07/24 16:52:52 WARN mapred.JobClient: No job jar file set.  User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
13/07/24 16:52:52 INFO input.FileInputFormat: Total input paths to process : 1
13/07/24 16:52:54 INFO mapred.JobClient: Running job: job_local_0001
13/07/24 16:52:54 INFO input.FileInputFormat: Total input paths to process : 1
13/07/24 16:52:54 INFO mapred.MapTask: io.sort.mb = 100
13/07/24 16:52:54 INFO mapred.MapTask: data buffer = 79691776/99614720
13/07/24 16:52:54 INFO mapred.MapTask: record buffer = 262144/327680
13/07/24 16:52:55 INFO mapred.MapTask: Starting flush of map output
13/07/24 16:52:55 INFO mapred.MapTask: Finished spill 0
13/07/24 16:52:55 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
13/07/24 16:52:55 INFO mapred.LocalJobRunner: 
13/07/24 16:52:55 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
13/07/24 16:52:55 INFO mapred.LocalJobRunner: 
13/07/24 16:52:55 INFO mapred.Merger: Merging 1 sorted segments
13/07/24 16:52:55 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 7204 bytes
13/07/24 16:52:55 INFO mapred.LocalJobRunner: 
13/07/24 16:52:55 WARN mapred.LocalJobRunner: job_local_0001
java.io.EOFException
at java.io.DataInputStream.readFully(DataInputStream.java:197)
at java.io.DataInputStream.readLong(DataInputStream.java:416)
at java.io.DataInputStream.readDouble(DataInputStream.java:468)
at Continents$CountryPropertiesWritable.readFields(Continents.java:62)
at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:67)
at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:40)
at org.apache.hadoop.mapreduce.ReduceContext.nextKeyValue(ReduceContext.java:116)
at org.apache.hadoop.mapreduce.ReduceContext.nextKey(ReduceContext.java:92)
at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:175)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:566)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:408)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:216)
13/07/24 16:52:55 INFO mapred.JobClient:  map 100% reduce 0%
13/07/24 16:52:55 INFO mapred.JobClient: Job complete: job_local_0001
13/07/24 16:52:55 INFO mapred.JobClient: Counters: 13
13/07/24 16:52:55 INFO mapred.JobClient:   FileSystemCounters
13/07/24 16:52:55 INFO mapred.JobClient:     HDFS_BYTES_READ=29783
13/07/24 16:52:55 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=24461
13/07/24 16:52:55 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=17085
13/07/24 16:52:55 INFO mapred.JobClient:   Map-Reduce Framework
13/07/24 16:52:55 INFO mapred.JobClient:     Reduce input groups=0
13/07/24 16:52:55 INFO mapred.JobClient:     Combine output records=0
13/07/24 16:52:55 INFO mapred.JobClient:     Map input records=252
13/07/24 16:52:55 INFO mapred.JobClient:     Reduce shuffle bytes=0
13/07/24 16:52:55 INFO mapred.JobClient:     Reduce output records=0
13/07/24 16:52:55 INFO mapred.JobClient:     Spilled Records=251
13/07/24 16:52:55 INFO mapred.JobClient:     Map output bytes=6700
13/07/24 16:52:55 INFO mapred.JobClient:     Combine input records=0
13/07/24 16:52:55 INFO mapred.JobClient:     Map output records=251
13/07/24 16:52:55 INFO mapred.JobClient:     Reduce input records=0
13/07/24 16:52:55 ERROR hdfs.DFSClient: Exception closing file /user/ww-pc/cyg_server/exitHouseProperties1/_temporary/_attempt_local_0001_r_000000_0/part-r-00000 : org.apache.hadoop.ipc.RemoteException: java.io.IOException: Could not complete write to file /user/ww-pc/cyg_server/exitHouseProperties1/_temporary/_attempt_local_0001_r_000000_0/part-r-00000 by DFSClient_1179141130
at org.apache.hadoop.hdfs.server.namenode.NameNode.complete(NameNode.java:449)
at sun.reflect.GeneratedMethodAccessor13.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)

org.apache.hadoop.ipc.RemoteException: java.io.IOException: Could not complete write to file /user/ww-pc/cyg_server/exitHouseProperties1/_temporary/_attempt_local_0001_r_000000_0/part-r-00000 by DFSClient_1179141130
at org.apache.hadoop.hdfs.server.namenode.NameNode.complete(NameNode.java:449)
at sun.reflect.GeneratedMethodAccessor13.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)

at org.apache.hadoop.ipc.Client.call(Client.java:740)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
at com.sun.proxy.$Proxy0.complete(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
at com.sun.proxy.$Proxy0.complete(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.closeInternal(DFSClient.java:3264)
at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.close(DFSClient.java:3188)
at org.apache.hadoop.hdfs.DFSClient$LeaseChecker.close(DFSClient.java:1043)
at org.apache.hadoop.hdfs.DFSClient.close(DFSClient.java:237)
at org.apache.hadoop.hdfs.DistributedFileSystem.close(DistributedFileSystem.java:269)
at org.apache.hadoop.fs.FileSystem$Cache.closeAll(FileSystem.java:1424)
at org.apache.hadoop.fs.FileSystem.closeAll(FileSystem.java:217)
at org.apache.hadoop.fs.FileSystem$ClientFinalizer.run(FileSystem.java:202)

I think the WARN mapred.LocalJobRunner: job_local_0001 java.io.EOFException is the cause of the failure. What should I change in my code? Could there be an error in the input and output types of the keys and values?

The relevant code:

public static class Map extends Mapper<LongWritable, Text, Text, HousePropertiesWritable> {
    private Text housekey = new Text();

    @Override
    public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
        ......
        ......
        context.write(housekey, new HousePropertiesWritable(...variables..));
    }
}

public static class Reduce extends Reducer<Text, HousePropertiesWritable, Text, Text> {
    @Override
    public void reduce(Text key, Iterable<HousePropertiesWritable> values, Context context)
            throws IOException, InterruptedException {
        ....
        ....
        context.write(key, new Text(output));
    }
}

The relevant code from main, where the Job is configured:

    Job job = new Job(conf, "HouseCalculation");
    job.setJarByClass(House.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(HousePropertiesWritable.class);

    job.setMapperClass(Map.class);
    job.setReducerClass(Reduce.class);

    job.setInputFormatClass(TextInputFormat.class);
    job.setOutputFormatClass(TextOutputFormat.class);
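
On the type question: with the new-API Job, when the map output value type (HousePropertiesWritable) differs from the reduce output value type (Text), the intermediate types are normally declared explicitly. A minimal sketch of that configuration, for reference only; as the answer below shows, this was not the actual cause of the error here:

    // Sketch only: declare the intermediate (map) and final (reduce) output
    // types separately, since Map emits HousePropertiesWritable values while
    // Reduce emits Text values.
    job.setMapOutputKeyClass(Text.class);
    job.setMapOutputValueClass(HousePropertiesWritable.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(Text.class);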

1 Answer


Thanks to Chris White, I found the error in the readFields and write methods of my HousePropertiesWritable class.

I was using in.readLine() and out.writeBytes() for the String fields. I changed them to in.readUTF() and out.writeUTF(), and now everything works.
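
For reference, a minimal sketch of what the corrected pair of methods might look like; the field names here are hypothetical, since the actual HousePropertiesWritable class is not shown in the question:

    import java.io.DataInput;
    import java.io.DataOutput;
    import java.io.IOException;
    import org.apache.hadoop.io.Writable;

    public class HousePropertiesWritable implements Writable {
        // Hypothetical fields, for illustration only.
        private String country;
        private double area;

        @Override
        public void write(DataOutput out) throws IOException {
            out.writeUTF(country);   // writeUTF prefixes the string with its length
            out.writeDouble(area);
        }

        @Override
        public void readFields(DataInput in) throws IOException {
            // Fields must be read back in exactly the order write() wrote them.
            country = in.readUTF();  // readUTF consumes exactly what writeUTF produced
            area = in.readDouble();
        }
    }

This asymmetry also explains the EOFException in the log: writeBytes() emits raw bytes with no length prefix or terminator, so the matching readLine() can read past the end of the string into the bytes of the numeric fields, leaving too few bytes for readDouble() at the end of the record.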

Thanks again,

Cheers

answered 2013-07-27T18:14:30.347