6

I tried to install Hadoop on a single-node cluster (my own laptop, Ubuntu 12.04). I followed this tutorial and checked it line by line twice: http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/

Everything seems correct. I set all of core-site.xml, mapred-site.xml, and hdfs-site.xml.
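For reference, a minimal core-site.xml along the lines of that tutorial looks like the sketch below; the target path is an assumption based on where the start scripts appear to look (see the error output further down), and hdfs://localhost:54310 is the tutorial's single-node value:

cat > /usr/local/hadoop/usr/etc/hadoop/core-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <!-- single-node NameNode address from the tutorial -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:54310</value>
  </property>
</configuration>
EOF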

When I run the following command as the hduser user:

hduser@maziyar-Lenovo-IdeaPad-U300s:~$ /usr/local/hadoop/usr/sbin/start-all.sh

I get the following errors:

Warning: $HADOOP_HOME is deprecated.

starting namenode, logging to /usr/local/hadoop/usr/libexec/../logs/hadoop-hduser-namenode-maziyar-Lenovo-IdeaPad-U300s.out
cat: /usr/local/hadoop/usr/libexec/../etc/hadoop/slaves: No such file or directory
cat: /usr/local/hadoop/usr/libexec/../etc/hadoop/masters: No such file or directory
starting jobtracker, logging to /usr/local/hadoop/usr/libexec/../logs/hadoop-hduser-jobtracker-maziyar-Lenovo-IdeaPad-U300s.out
cat: /usr/local/hadoop/usr/libexec/../etc/hadoop/slaves: No such file or directory
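Since the two cat errors point at missing masters and slaves files, one hedged workaround for a single-node setup would be to create them by hand, each containing just localhost (the directory below is taken from the error output):

# single-node: both master and slave are this machine
echo localhost > /usr/local/hadoop/usr/etc/hadoop/masters
echo localhost > /usr/local/hadoop/usr/etc/hadoop/slaves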

I added export HADOOP_HOME_WARN_SUPPRESS="TRUE" to hadoop-env.sh and still get the same error.

In the file /home/hduser/.bashrc, where I guess my error is coming from, I have:

# Set Hadoop-related environment variables
export HADOOP_HOME=/usr/local/hadoop

# Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)
export JAVA_HOME=/usr/lib/jvm/jdk-7u10-linuxi586/usr/java/jdk1.7.0_10

# Some convenient aliases and functions for running Hadoop-related commands
unalias fs &> /dev/null
alias fs="hadoop fs"
unalias hls &> /dev/null
alias hls="fs -ls"

# If you have LZO compression enabled in your Hadoop cluster and
# compress job outputs with LZOP (not covered in this tutorial):
# Conveniently inspect an LZOP compressed file from the command
# line; run via:
#
# $ lzohead /hdfs/path/to/lzop/compressed/file.lzo
#
# Requires installed 'lzop' command.
#
lzohead () {
    hadoop fs -cat "$1" | lzop -dc | head -1000 | less
}

# Add Hadoop bin/ directory to PATH
export PATH=$PATH:$HADOOP_HOME/usr/sbin

I added /usr/sbin as a bin directory because start-all.sh and the other commands are there.

I also tried "HADOOP_PREFIX" instead of "HADOOP_HOME" in the .bashrc file, but still got the same error.

I have these folders in my hadoop directory:

maziyar@maziyar-Lenovo-IdeaPad-U300s:/usr/local/hadoop$ ls -lha
total 20K
drwxr-xr-x  5 hduser hadoop 4.0K May 30 15:25 .
drwxr-xr-x 12 root   root   4.0K May 30 15:25 ..
drwxr-xr-x  4 hduser hadoop 4.0K May 30 15:25 etc
drwxr-xr-x 12 hduser hadoop 4.0K Jun  4 21:29 usr
drwxr-xr-x  4 hduser hadoop 4.0K May 30 15:25 var

I downloaded the latest version of Apache Hadoop last week: hadoop-1.1.2-1.i386.rpm


5 Answers

6

I tried setting export HADOOP_HOME_WARN_SUPPRESS="TRUE" in my conf/hadoop-env.sh file, and the warning went away. Although, I am still not sure why this warning appeared in the first place.
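For reference, the exact line that goes into conf/hadoop-env.sh:

# suppress the "$HADOOP_HOME is deprecated" warning
export HADOOP_HOME_WARN_SUPPRESS="TRUE"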

answered 2013-06-13T06:19:23.817
3

Replacing HADOOP_HOME with HADOOP_PREFIX in ~/.bashrc solved this problem for me.

Did you try logging out of your current session after making this change and trying again? The changes you make to your bash profile will take effect when you log in to a shell again.
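A minimal sketch of that change, assuming the paths from the question's .bashrc:

# ~/.bashrc: swap the deprecated variable for HADOOP_PREFIX
# export HADOOP_HOME=/usr/local/hadoop    <-- remove or comment out
export HADOOP_PREFIX=/usr/local/hadoop
export PATH=$PATH:$HADOOP_PREFIX/usr/sbin

# then log out and back in (or reload the file in the current shell):
source ~/.bashrc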

answered 2013-10-11T09:33:31.533
2

Your bash session may still have the HADOOP_HOME variable defined. Try echo $HADOOP_HOME and see whether you get any value.

If you do get a value, the shell is picking HADOOP_HOME up from some configuration file; check those files (~/.bashrc, ~/.profile, /etc/profile, /etc/bash.bashrc, etc.) and remove the exported HADOOP_HOME variable.

Open a new session after setting the HADOOP_PREFIX environment variable instead of HADOOP_HOME in ~/.bashrc. Once you are sure $HADOOP_HOME is not exported in any configuration file, you should not see that warning message.
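A quick sketch of that check (the grep line is an assumption, just a fast way to find the offending export):

# does the current session still have it?
echo $HADOOP_HOME

# if it prints a value, find which startup file exports it:
grep -n "HADOOP_HOME" ~/.bashrc ~/.profile /etc/profile /etc/bash.bashrc 2>/dev/null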

answered 2013-07-01T12:01:55.273
1

A deprecation error means that the particular version you are using is considered obsolete, or that it will stop being supported at some point, as you can read on the website.

What I mean to say is that you have installed OpenJDK with Hadoop. What I did was install the Oracle JDK instead of OpenJDK. Maybe you should try doing that.
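One common route at the time on Ubuntu (the webupd8team PPA is an assumption here; any Oracle JDK 7 install would work):

sudo add-apt-repository ppa:webupd8team/java
sudo apt-get update
sudo apt-get install oracle-java7-installer

# then point JAVA_HOME at the Oracle JDK (installer path may differ):
export JAVA_HOME=/usr/lib/jvm/java-7-oracle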

Let me know if this helps.

Regards.

answered 2016-03-05T03:53:44.400
0

"$HADOOP_HOME is deprecated" is a warning message. I installed on Ubuntu today using the instructions from Michael's site, and it works fine.

answered 2013-08-25T05:30:21.973