Today, while running a Pig script, I hit the same error mentioned in the question:
starting namenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-training-namenode-localhost.localdomain.out
localhost: /home/training/.bashrc: line 10: /jdk1.7.0_10/bin: No such file or directory
localhost: Warning: $HADOOP_HOME is deprecated.
localhost:
localhost: starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-training-datanode-localhost.localdomain.out
localhost: /home/training/.bashrc: line 10: /jdk1.7.0_10/bin: No such file or directory
localhost: Warning: $HADOOP_HOME is deprecated.
localhost:
localhost: starting secondarynamenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-training-secondarynamenode-localhost.localdomain.out
starting jobtracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-training-jobtracker-localhost.localdomain.out
localhost: /home/training/.bashrc: line 10: /jdk1.7.0_10/bin: No such file or directory
localhost: Warning: $HADOOP_HOME is deprecated.
localhost:
localhost: starting tasktracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-training-tasktracker-localhost.localdomain.out
So, the answer is:
[training@localhost bin]$ stop-all.sh
and then type:
[training@localhost bin]$ start-all.sh
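After restarting, it is worth confirming that all five Hadoop 1.x daemons actually came up. A minimal sketch using `jps` output; the sample output below is illustrative only, and in practice you would replace it with `jps_out=$(jps)`:

```shell
# Daemons that start-all.sh should launch on a Hadoop 1.x pseudo-distributed setup.
expected="NameNode DataNode SecondaryNameNode JobTracker TaskTracker"

# Illustrative jps output; on a real cluster use: jps_out=$(jps)
jps_out="1234 NameNode
2345 DataNode
3456 SecondaryNameNode
4567 JobTracker
5678 TaskTracker"

# Report any expected daemon that is absent from the jps listing.
for d in $expected; do
  if ! echo "$jps_out" | grep -q "$d"; then
    echo "missing daemon: $d"
  fi
done
```

If any daemon is reported missing, check its log file under the directories shown in the startup output above.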
The problem should then be resolved, and you can run the Pig script with MapReduce again!
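Note also the `/jdk1.7.0_10/bin: No such file or directory` lines in the log: line 10 of `~/.bashrc` adds a JDK directory to `PATH` that does not exist on this machine, possibly because a prefix variable expanded to empty or the JDK is installed elsewhere. That warning is harmless here but worth fixing. A quick sanity check (the path is taken from the log; everything else is an illustrative assumption):

```shell
# Path reported in the startup log; checking whether it actually exists
# tells you whether line 10 of ~/.bashrc points at a real JDK install.
JDK_BIN="/jdk1.7.0_10/bin"
if [ -d "$JDK_BIN" ]; then
  echo "JDK bin directory exists: $JDK_BIN"
else
  echo "JDK bin directory missing: $JDK_BIN (fix the path on line 10 of ~/.bashrc)"
fi
```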