
I have installed Hadoop on Ubuntu, and it is running fine.

ubuntu:/home/hduser/hive-0.10.0-cdh4.3.1$ jps
2702 DataNode
3101 ResourceManager
4879 Jps
2948 SecondaryNameNode
3306 NodeManager

hadoop_version=Hadoop 2.0.0-cdh4.3.0

Then I installed Hive from the Apache tarballs (Hive version hive-0.10.0) and tried to run bin/hive. But I get the following error:

Unable to determine Hadoop version information. 'hadoop version' returned:

/home/hduser/hadoop/etc/hadoop /usr/lib/jvm/jdk1.6.0_45/ 
Hadoop 2.0.0-cdh4.3.0
Subversion file:///var/lib/jenkins/workspace/CDH4.3.0-Packaging-Hadoop/build/cdh4/hadoop/2.0.0-cdh4.3.0/source/hadoop-common-project/hadoop-common -r 48a9315b342ca16de92fcc5be95ae3650629155a 
Compiled by jenkins on Mon May 27 19:06:57 PDT 2013 
From source with checksum a4218d77f9b12df4e3e49ef96f9d357d 
This command was run using /home/hduser/hadoop/share/hadoop/common/hadoop-common-2.0.0-cdh4.3.0.jar

I tried to fix it with my shell-scripting knowledge but could not. After digging into it, I found that it fails on the following line:

if [[ "$HADOOP_VERSION" =~ $hadoop_version_re ]]; then

I tried to echo HADOOP_VERSION and it returned nothing. HADOOP_VERSION is defined as:

HADOOP_VERSION=$($HADOOP version | awk '{if (NR == 1) {print $2;}}');

$HADOOP version gives me:

 /home/hduser/hadoop/etc/hadoop
 /usr/lib/jvm/jdk1.6.0_45/
 Hadoop 2.0.0-cdh4.3.0
 Subversion file:///var/lib/jenkins/workspace/CDH4.3.0-Packaging-Hadoop/build/cdh4/hadoop/2.0.0-cdh4.3.0/source/hadoop-common-project/hadoop-common -r 48a9315b342ca16de92fcc5be95ae3650629155a
 Compiled by jenkins on Mon May 27 19:06:57 PDT 2013
 From source with checksum a4218d77f9b12df4e3e49ef96f9d357d
 This command was run using /home/hduser/hadoop/share/hadoop/common/hadoop-common-2.0.0-cdh4.3.0.jar
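
So field 2 of the first line is empty here, which would explain why the echo shows nothing. A minimal sketch of how I checked that in isolation, piping the same first lines through the awk command the hive script uses:

printf '%s\n' '/home/hduser/hadoop/etc/hadoop' '/usr/lib/jvm/jdk1.6.0_45/' 'Hadoop 2.0.0-cdh4.3.0' | awk 'NR == 1 {print $2}'
# prints an empty line, because the first line has only one field;
# with the expected first line "Hadoop 2.0.0-cdh4.3.0" it would print 2.0.0-cdh4.3.0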

I have been stuck on this for a week now. Please help. Thanks.


6 Answers


Your question already describes the problem. When the script runs $HADOOP version, it expects output like this:

Hadoop 2.0.0-cdh4.3.0
Subversion file:///var/lib/jenkins/workspace/CDH4.3.0-Packaging-Hadoop/build/cdh4/hadoop/2.0.0-cdh4.3.0/source/hadoop-common-project/hadoop-common -r 48a9315b342ca16de92fcc5be95ae3650629155a
Compiled by jenkins on Mon May 27 19:06:57 PDT 2013
 From source with checksum a4218d77f9b12df4e3e49ef96f9d357d
 This command was run using /home/hduser/hadoop/share/hadoop/common/hadoop-common-2.0.0-cdh4.3.0.jar

Instead it gets some other output (probably because you modified one of the Hadoop scripts; check conf/hadoop-env.sh):

/home/hduser/hadoop/etc/hadoop
/usr/lib/jvm/jdk1.6.0_45/
Hadoop 2.0.0-cdh4.3.0
Subversion file:///var/lib/jenkins/workspace/CDH4.3.0-Packaging-Hadoop/build/cdh4/hadoop/2.0.0-cdh4.3.0/source/hadoop-common-project/hadoop-common -r 48a9315b342ca16de92fcc5be95ae3650629155a
Compiled by jenkins on Mon May 27 19:06:57 PDT 2013
 From source with checksum a4218d77f9b12df4e3e49ef96f9d357d
 This command was run using /home/hduser/hadoop/share/hadoop/common/hadoop-common-2.0.0-cdh4.3.0.jar

Now the awk line no longer finds the version number where it expects it (field 2 of the first line).

So the solution is to find out where the extra output is coming from and remove it.
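
For example, one way to confirm this and hunt for the culprit (a minimal sketch, assuming bash, that hadoop is on the PATH and HADOOP_HOME is set; the files grepped are just the usual suspects):

# The first line should literally read "Hadoop <version>".
hadoop version | head -n 1
# Reproduce what bin/hive does: this should print only the version number.
hadoop version | awk 'NR == 1 {print $2}'
# Look for stray echo statements that pollute the output.
grep -n echo "$HADOOP_HOME"/etc/hadoop/hadoop-env.sh ~/.bashrc ~/.profile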

Answered 2014-02-20T17:35:55.693

I had the same problem, and I fixed it by adding the following line to my .profile and sourcing it again.

export HADOOP_VERSION="2.0.0-cdh4.2.0"
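
For example, a quick sketch of applying that workaround (the version string is from my setup; use whatever hadoop version reports for yours):

echo 'export HADOOP_VERSION="2.0.0-cdh4.2.0"' >> ~/.profile
. ~/.profile            # re-source so the current shell picks it up
echo "$HADOOP_VERSION"  # should now print the version the hive script needs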

Answered 2014-04-18T15:29:49.403

On Windows you may run into the same problem.

In fact, if $HADOOP_HOME is set to a DOS path (e.g. C:\hadoop), it needs to be converted under Cygwin. One way is to put the following line in your .bashrc:

export HADOOP_HOME="$(cygpath $HADOOP_HOME)"
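
As a quick illustration of what cygpath does (on a default Cygwin install it typically maps a DOS drive to /cygdrive, so the exact result may differ on your setup):

cygpath 'C:\hadoop'
# typically prints /cygdrive/c/hadoop, a POSIX path the hive scripts can handle
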
Answered 2014-07-01T10:10:40.490

Check your JAR path (JRE_HOME).

Answered 2016-10-13T06:12:38.980
0

Run the following command: hadoop version

hduser@ubuntu:/usr/local/hadoop/sbin$ hadoop version

Answered 2016-03-02T17:13:16.590

If you have export HADOOP_VERSION=2.0.0-cdh4.3.0 (or whatever your version number is) set in your .bashrc file, comment it out by putting a # in front of it, i.e. #export HADOOP_VERSION=2.0.0-cdh4.3.0, then run hive, and the problem should be resolved.
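
A sketch of that change, assuming the export lives in ~/.bashrc and GNU sed (as on Ubuntu); the unset is needed because re-sourcing the file does not clear a variable that is already exported in the current shell:

sed -i 's/^export HADOOP_VERSION=/#export HADOOP_VERSION=/' ~/.bashrc
source ~/.bashrc
unset HADOOP_VERSION
hive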

Answered 2017-11-17T10:16:09.703