14

When I try to start Hadoop on the master node, I get the following output, and the namenode does not start.

[hduser@dellnode1 ~]$ start-dfs.sh
starting namenode, logging to /usr/local/hadoop/bin/../logs/hadoop-hduser-namenode-dellnode1.library.out
dellnode1.library: datanode running as process 5123. Stop it first.
dellnode3.library: datanode running as process 4072. Stop it first.
dellnode2.library: datanode running as process 4670. Stop it first.
dellnode1.library: secondarynamenode running as process 5234. Stop it first.
[hduser@dellnode1 ~]$ jps
5696 Jps
5123 DataNode
5234 SecondaryNameNode

4 Answers

25

"Stop it first." The daemons from the previous run are still up, so stop them before starting again:

  • First, call stop-all.sh

  • Type jps

  • Call start-all.sh (or start-dfs.sh and start-mapred.sh)

  • Type jps (if the NameNode does not appear, run "hadoop namenode" and check for errors); see the sketch after this list
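
A minimal transcript of that sequence on the master node might look like this (the PIDs shown are illustrative, not from the question):

[hduser@dellnode1 ~]$ stop-all.sh       # stop every daemon that is still running
[hduser@dellnode1 ~]$ jps               # nothing but Jps should remain
5696 Jps
[hduser@dellnode1 ~]$ start-all.sh      # or: start-dfs.sh and start-mapred.sh
[hduser@dellnode1 ~]$ jps               # NameNode should now be listed
6001 NameNode
6123 DataNode
6234 SecondaryNameNode
6345 Jps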

Answered 2013-08-17T08:45:33.300
7

As reported when running "stop-all.sh" on newer versions of Hadoop, that script is deprecated. You should instead use:

stop-dfs.sh

stop-yarn.sh
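
On those versions, the matching stop-and-restart sequence, assuming a standard HDFS-plus-YARN setup, would be:

stop-dfs.sh     # stops the NameNode, DataNodes, and SecondaryNameNode
stop-yarn.sh    # stops the ResourceManager and NodeManagers
start-dfs.sh    # bring HDFS back up
start-yarn.sh   # bring YARN back up
jps             # verify that the NameNode now appears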

Answered 2015-11-16T17:22:21.987
1

Today, while executing a Pig script, I ran into the same error mentioned in the question:

starting namenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-training-namenode-localhost.localdomain.out
localhost: /home/training/.bashrc: line 10: /jdk1.7.0_10/bin: No such file or directory
localhost: Warning: $HADOOP_HOME is deprecated.
localhost: 
localhost: starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-training-datanode-localhost.localdomain.out
localhost: /home/training/.bashrc: line 10: /jdk1.7.0_10/bin: No such file or directory
localhost: Warning: $HADOOP_HOME is deprecated.
localhost: 
localhost: starting secondarynamenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-training-secondarynamenode-localhost.localdomain.out
starting jobtracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-training-jobtracker-localhost.localdomain.out
localhost: /home/training/.bashrc: line 10: /jdk1.7.0_10/bin: No such file or directory
localhost: Warning: $HADOOP_HOME is deprecated.
localhost: 
localhost: starting tasktracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-training-tasktracker-localhost.localdomain.out

So, the answer is:

[training@localhost bin]$ stop-all.sh

Then type:

[training@localhost bin]$ start-all.sh

The problem will be solved. Now you can run the Pig script with MapReduce!
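
Incidentally, the "/jdk1.7.0_10/bin: No such file or directory" lines in the log above mean that line 10 of ~/.bashrc prepends a JDK bin directory that does not exist to the PATH. A hypothetical fix (the actual JDK install location is an assumption) is to correct that line:

export PATH=/usr/java/jdk1.7.0_10/bin:$PATH   # line 10: use the full, existing JDK path (assumed location)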

Answered 2015-11-18T00:24:47.997
0

On a Mac (if you installed with Homebrew), the scripts live under the paths below, where 3.0.0 is the Hadoop version. On Linux, change the install path accordingly (only the /usr/local/Cellar/ part changes):

> /usr/local/Cellar/hadoop/3.0.0/sbin/stop-yarn.sh
> /usr/local/Cellar/hadoop/3.0.0/sbin/stop-dfs.sh
> /usr/local/Cellar/hadoop/3.0.0/sbin/stop-all.sh

An alias is a better fit for power users. Add the line below to the end of your ~/.bashrc or ~/.zshrc (if you are a zsh user); then, whenever you want to stop Hadoop and all related processes, just type hstop on the command line.

alias hstop="/usr/local/Cellar/hadoop/3.0.0/sbin/stop-yarn.sh;/usr/local/Cellar/hadoop/3.0.0/sbin/stop-dfs.sh;/usr/local/Cellar/hadoop/3.0.0/sbin/stop-all.sh"
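
After reloading the shell configuration, the alias works immediately:

source ~/.bashrc   # or: source ~/.zshrc for zsh users
hstop              # runs the three stop scripts in sequence
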
Answered 2018-04-21T23:08:36.677