I'm new to Hadoop. I start Hadoop with the following command:
[gpadmin@BigData1-ahandler root]$ /usr/local/hadoop-0.20.1/bin/start-all.sh
starting namenode, logging to /usr/local/hadoop-0.20.1/logs/hadoop-gpadmin-namenode-BigData1-ahandler.out
localhost: starting datanode, logging to /usr/local/hadoop-0.20.1/logs/hadoop-gpadmin-datanode-BigData1-ahandler.out
localhost: starting secondarynamenode, logging to /usr/local/hadoop-0.20.1/logs/hadoop-gpadmin-secondarynamenode-BigData1-ahandler.out
starting jobtracker, logging to /usr/local/hadoop-0.20.1/logs/hadoop-gpadmin-jobtracker-BigData1-ahandler.out
localhost: starting tasktracker, logging to /usr/local/hadoop-0.20.1/logs/hadoop-gpadmin-tasktracker-BigData1-ahandler.out
When I try to `-cat` the output from the directory below, I get the error "No node available". What does this error mean, and how can I fix it, or at least start debugging it?
[gpadmin@BigData1-ahandler root]$ hadoop fs -cat output/d*/part-*
13/11/13 15:33:09 INFO hdfs.DFSClient: No node available for block: blk_-5883966349607013512_1099 file=/user/gpadmin/output/d15795/part-00000
13/11/13 15:33:09 INFO hdfs.DFSClient: Could not obtain block blk_-5883966349607013512_1099 from any node: java.io.IOException: No live nodes contain current block
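Since the error suggests HDFS can't find a live datanode holding the block, would checks like the following be the right way to start? (This is just a sketch of what I'm considering, assuming the standard Hadoop 0.20 CLI; the paths match my install above.)

```shell
# List running Hadoop daemons on this host; DataNode should appear here.
jps

# Ask the namenode how many datanodes are live and how much capacity is reported.
hadoop dfsadmin -report

# Check the health of the files under my output directory for missing/corrupt blocks.
hadoop fsck /user/gpadmin/output -files -blocks

# Look at the datanode log for startup errors (e.g. namespace ID mismatch).
tail -n 50 /usr/local/hadoop-0.20.1/logs/hadoop-gpadmin-datanode-BigData1-ahandler.log
```

If `dfsadmin -report` shows zero live datanodes, does that confirm the datanode process died after `start-all.sh`, and is the datanode log the place to find out why?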