
Contents of mapred-site.xml:

<configuration>
<property>
 <name>mapreduce.framework.name</name>
 <value>yarn</value>
</property>

<property>
 <name>yarn.app.mapreduce.am.env</name>
 <value>HADOOP_MAPRED_HOME=/home/admin/hadoop-3.1.0</value>
</property>

<property>
 <name>mapreduce.map.env</name>
 <value>HADOOP_MAPRED_HOME=/home/admin/hadoop-3.1.0</value>
</property>

<property>
 <name>mapreduce.reduce.env</name>
 <value>HADOOP_MAPRED_HOME=/home/admin/hadoop-3.1.0</value>
</property>

<property> 
    <name>mapreduce.application.classpath</name>
    <value>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*,$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*</value>
</property>

</configuration>

Even though I have set yarn.app.mapreduce.am.env and the other parameters, I get the error "Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster". I am trying to run a MapReduce program remotely: Hadoop is installed on a Linux machine, and I am submitting the job from a Windows machine. Below is my job configuration setup.

import java.io.IOException;
import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.security.UserGroupInformation;

public class WordCount {
  public static void main(String[] args)
      throws IOException, ClassNotFoundException, InterruptedException {
    // Submit the job as the remote "admin" user so HDFS and YARN see the correct owner.
    UserGroupInformation ugi = UserGroupInformation.createRemoteUser("admin");
    ugi.doAs(new PrivilegedExceptionAction<Void>() {

      public Void run() throws Exception {
        try {
          Configuration configuration = new Configuration();

          configuration.set("yarn.resourcemanager.address", "192.168.33.75:50001"); // see step 3
          configuration.set("mapreduce.framework.name", "yarn");
          // configuration.set("yarn.app.mapreduce.am.env",
          // "HADOOP_MAPRED_HOME=/home/admin/hadoop-3.1.0");
          // configuration.set("mapreduce.map.env", "HADOOP_MAPRED_HOME=/home/admin/hadoop-3.1.0");
          // configuration.set("mapreduce.reduce.env",
          // "HADOOP_MAPRED_HOME=/home/admin/hadoop-3.1.0");
          configuration.set("fs.defaultFS", "hdfs://192.168.33.75:54310"); // see step 2
          configuration.set("mapreduce.app-submission.cross-platform", "true");
          configuration.set("mapred.remote.os", "Linux");
          configuration.set("yarn.application.classpath",
              "$HADOOP_CONF_DIR:$HADOOP_COMMON_HOME/*:$HADOOP_COMMON_HOME/lib/*:"
                  + "$HADOOP_HDFS_HOME/*:$HADOOP_HDFS_HOME/lib/*:"
                  + "$HADOOP_YARN_HOME/*:$HADOOP_YARN_HOME/lib/*:"
                  + "$HADOOP_MAPRED_HOME/*:$HADOOP_MAPRED_HOME/lib/*");

          Job job = Job.getInstance(configuration);

          job.setJarByClass(WordCount.class); // use this when the jar has been uploaded to the server
                                              // and the job is run directly and locally on the server
          job.setOutputKeyClass(Text.class);
          job.setOutputValueClass(IntWritable.class);
          job.setMapperClass(MapForWordCount.class);
          job.setReducerClass(ReduceForWordCount.class);

          Path input = new Path("/user/admin/wordCountInput.txt");
          Path output = new Path("/user/admin/output");
          FileInputFormat.addInputPath(job, input);
          FileOutputFormat.setOutputPath(job, output);
          System.exit(job.waitForCompletion(true) ? 0 : 1);
        } catch (Exception e) {
          e.printStackTrace();
        }
        return null;
      }

    });


  }
}
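The MapForWordCount and ReduceForWordCount classes referenced by the driver are not shown in the question. A minimal word-count pair they might correspond to, written here as static nested classes of WordCount (the bodies below are an assumption for illustration, not the asker's actual code), would look like this:

// Hypothetical mapper/reducer, assumed for illustration only; the asker's real
// MapForWordCount / ReduceForWordCount implementations are not shown.
// Additionally requires: import org.apache.hadoop.io.LongWritable;
//                        import org.apache.hadoop.mapreduce.Mapper;
//                        import org.apache.hadoop.mapreduce.Reducer;
public static class MapForWordCount
    extends Mapper<LongWritable, Text, Text, IntWritable> {
  private static final IntWritable ONE = new IntWritable(1);
  private final Text word = new Text();

  @Override
  protected void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
    // Emit (token, 1) for every whitespace-separated token in the line.
    for (String token : value.toString().split("\\s+")) {
      if (!token.isEmpty()) {
        word.set(token);
        context.write(word, ONE);
      }
    }
  }
}

public static class ReduceForWordCount
    extends Reducer<Text, IntWritable, Text, IntWritable> {
  @Override
  protected void reduce(Text key, Iterable<IntWritable> values, Context context)
      throws IOException, InterruptedException {
    // Sum the counts emitted for each word.
    int sum = 0;
    for (IntWritable v : values) {
      sum += v.get();
    }
    context.write(key, new IntWritable(sum));
  }
}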

Please help me. I have been stuck on this problem for the past 6 days. Many thanks in advance. Hadoop version: 3.1.0


4 Answers


I had the same problem and solved it by adding this to mapred-site.xml (i.e. edit your mapreduce.application.classpath property):

<property> 
    <name>mapreduce.application.classpath</name>
    <value>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*,$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*,$HADOOP_MAPRED_HOME/share/hadoop/common/*,$HADOOP_MAPRED_HOME/share/hadoop/common/lib/*,$HADOOP_MAPRED_HOME/share/hadoop/yarn/*,$HADOOP_MAPRED_HOME/share/hadoop/yarn/lib/*,$HADOOP_MAPRED_HOME/share/hadoop/hdfs/*,$HADOOP_MAPRED_HOME/share/hadoop/hdfs/lib/*</value>
</property>
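Since the job in the question is submitted from a remote Windows client, the same classpath can also be supplied from the submitting code rather than only in the cluster's mapred-site.xml. A minimal sketch, assuming the Linux-side install layout from the question (this is an illustration, not part of the answer above):

// Sketch only: set the same value on the client-side Configuration before
// Job.getInstance(configuration). The literal $HADOOP_MAPRED_HOME is expanded
// on the Linux cluster nodes, where the *.env properties define it; with
// cross-platform submission the {{HADOOP_MAPRED_HOME}} spelling (used in a
// later answer) is the safer form.
configuration.set("mapreduce.application.classpath",
    "$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*,"
        + "$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*,"
        + "$HADOOP_MAPRED_HOME/share/hadoop/common/*,"
        + "$HADOOP_MAPRED_HOME/share/hadoop/common/lib/*,"
        + "$HADOOP_MAPRED_HOME/share/hadoop/yarn/*,"
        + "$HADOOP_MAPRED_HOME/share/hadoop/yarn/lib/*,"
        + "$HADOOP_MAPRED_HOME/share/hadoop/hdfs/*,"
        + "$HADOOP_MAPRED_HOME/share/hadoop/hdfs/lib/*");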
Answered on 2018-06-21T09:02:11.347

Just edit the mapred-site.xml file:

Add the following properties:

  1. <property> <name>mapreduce.framework.name</name> <value>yarn</value> </property>

  2. <property> <name>yarn.app.mapreduce.am.env</name> <value>HADOOP_MAPRED_HOME=/Users/adityaatri/Applications/hadoop-3.1.3</value> </property>

  3. <property> <name>mapreduce.map.env</name> <value>HADOOP_MAPRED_HOME=/Users/adityaatri/Applications/hadoop-3.1.3</value> </property>

  4. <property> <name>mapreduce.reduce.env</name> <value>HADOOP_MAPRED_HOME=/Users/adityaatri/Applications/hadoop-3.1.3</value> </property>

Now, in all 4 properties above, replace the path /Users/adityaatri/Applications/hadoop-3.1.3 with your own Hadoop home directory.
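For reference, the question's own (commented-out) code shows that properties 2 to 4 can also be set from the submitting client instead of mapred-site.xml. A minimal sketch, assuming the cluster-side Hadoop home /home/admin/hadoop-3.1.0 from the question:

// Client-side equivalent of properties 2-4 above (sketch only; the path must
// match the Hadoop installation on the cluster nodes, not on the client).
configuration.set("yarn.app.mapreduce.am.env",
    "HADOOP_MAPRED_HOME=/home/admin/hadoop-3.1.0");
configuration.set("mapreduce.map.env",
    "HADOOP_MAPRED_HOME=/home/admin/hadoop-3.1.0");
configuration.set("mapreduce.reduce.env",
    "HADOOP_MAPRED_HOME=/home/admin/hadoop-3.1.0");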

Now add a 5th property:

  1. <property> <name>mapreduce.application.classpath</name> <value></value> </property>

The <value> element must be filled with the output of the following commands, run in a terminal:

  1. export HADOOP_CLASSPATH=$(hadoop classpath)

  2. echo $HADOOP_CLASSPATH

The output from my terminal:

/Users/adityaatri/Applications/hadoop-3.1.3/etc/hadoop:/Users/adityaatri/Applications/hadoop-3.1.3/share/hadoop/common/lib/*:/Users/adityaatri/Applications/hadoop-3.1.3/share/hadoop/common/*:/Users/adityaatri/Applications/hadoop-3.1.3/share/hadoop/hdfs:/Users/adityaatri/Applications/hadoop-3.1.3/share/hadoop/hdfs/lib/*:/Users/adityaatri/Applications/hadoop-3.1.3/share/hadoop/hdfs/*:/Users/adityaatri/Applications/hadoop-3.1.3/share/hadoop/mapreduce/lib/*:/Users/adityaatri/Applications/hadoop-3.1.3/share/hadoop/mapreduce/*:/Users/adityaatri/Applications/hadoop-3.1.3/share/hadoop/yarn:/Users/adityaatri/Applications/hadoop-3.1.3/share/hadoop/yarn/lib/*:/Users/adityaatri/Applications/hadoop-3.1.3/share/hadoop/yarn/*

Copy this output into the <value> element of the 5th property.

Now you won't get any errors. :)
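If the machine running the submission also has the hadoop command on its PATH, the same value can be captured at run time and handed to the job configuration instead of being pasted into mapred-site.xml by hand. A rough sketch, an assumption for illustration rather than part of the answer above:

// Sketch only: run `hadoop classpath` and use its output as the MapReduce
// application classpath. Assumes the hadoop CLI is available locally and that
// its output is valid on the cluster nodes.
// Additionally requires: import java.io.BufferedReader;
//                        import java.io.InputStreamReader;
//                        import java.nio.charset.StandardCharsets;
Process p = new ProcessBuilder("hadoop", "classpath").start();
StringBuilder cp = new StringBuilder();
try (BufferedReader r = new BufferedReader(
        new InputStreamReader(p.getInputStream(), StandardCharsets.UTF_8))) {
  String line;
  while ((line = r.readLine()) != null) {
    cp.append(line.trim());
  }
}
configuration.set("mapreduce.application.classpath", cp.toString());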

Answered on 2020-04-12T12:15:02.537

I added the following property to yarn-site.xml:

<property>
  <name>yarn.application.classpath</name>
  <value> $HADOOP_CONF_DIR,$HADOOP_COMMON_HOME/share/hadoop/common/*,$HADOOP_COMMON_HOME/share/hadoop/common/lib/*,$HADOOP_HDFS_HOME/share/hadoop/hdfs/*,$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*,$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*,$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*,
    $HADOOP_YARN_HOME/share/hadoop/yarn/*,$HADOOP_YARN_HOME/share/hadoop/yarn/lib/* 
  </value>
</property>

And I made the same change in my MapReduce program:

  configuration.set("yarn.application.classpath",
              "{{HADOOP_CONF_DIR}},{{HADOOP_COMMON_HOME}}/share/hadoop/common/*,{{HADOOP_COMMON_HOME}}/share/hadoop/common/lib/*,"
                  + " {{HADOOP_HDFS_HOME}}/share/hadoop/hdfs/*,{{HADOOP_HDFS_HOME}}/share/hadoop/hdfs/lib/*,"
                  + "{{HADOOP_MAPRED_HOME}}/share/hadoop/mapreduce/*,{{HADOOP_MAPRED_HOME}}/share/hadoop/mapreduce/lib/*,"
                  + "{{HADOOP_YARN_HOME}}/share/hadoop/yarn/*,{{HADOOP_YARN_HOME}}/share/hadoop/yarn/lib/*");

My program now runs smoothly. Feel free to ask me for details.

Answered on 2018-06-21T12:56:42.097

The problem is that your ResourceManager (YARN) is unable to load the Hadoop libraries (JARs). I solved it by updating the configuration. Add this to yarn-site.xml:

<property>
<name>yarn.application.classpath</name>
<value>C:/hadoop-2.8.0/share/hadoop/mapreduce/*,C:/hadoop-2.8.0/share/hadoop/mapreduce/lib/*,C:/Hadoop-2.8.0/share/hadoop/common/*,C:/Hadoop-2.8.0/share/hadoop/common/lib/*,
    C:/hadoop-2.8.0/share/hadoop/hdfs/*,C:/hadoop-2.8.0/share/hadoop/hdfs/lib/*,C:/hadoop-2.8.0/share/hadoop/yarn/*,C:/hadoop-2.8.0/share/hadoop/yarn/lib/*</value>
</property>

Note that the paths used here may need to be adjusted for your system.

Answered on 2018-12-01T15:53:57.463